Battlefield Hardline VGA graphics performance review

THX Hilbert for the great comparison! Are you going to add benchmarks using the Mantle API on Radeon GPUs later? (I hope you get full access to your account/game soon, this type of DRM definitely sucks -.-)
I won't be buying this game, but seeing how close the 290X is to the 980 at 2560x1440 makes me cry. Is it because of Mantle? I could've almost bought two 290Xs for the price of one 980. Don't get me wrong, I still love my 980; I only bought it because my 780 died. Other games aren't this close, though.
Thanks for the benchmarks, Hilbert! Do you think it still makes sense to use MSAA at UHD resolution? I was expecting better 970 and 980 performance compared to the 290(X).
Why? The GTX 970 is way faster than the 290X 🙂
Why? The GTX 970 is way faster than the 290X 🙂
That guy has no idea what he's doing; look at the frame times for the 290X. There's something wrong with his system. He states himself that frame times are actually better for AMD in multiplayer and worse for Nvidia. I'll wait for Hilbert's frame-time results.
Come on, this benchmark is way better: http://www.pcgameshardware.de/Battlefield-Hardline-PC-258805/Specials/Benchmark-Test-1154059/ Non-reference cards, minimum frame rates etc. 🙂 ...take a look at the frame times (SINGLEPLAYER) 🙂 GTX 970 http://www.pcgameshardware.de/screenshots/original/2015/03/Frametimes_SP_GTX_970-pcgh.png 290X http://www.pcgameshardware.de/screenshots/original/2015/03/Frametimes_SP_290X_DX_20150325105837-pcgh.png AMD doesn't even stand a chance.
Did you even read what I said? Frame times seem to be completely broken and stuttery for AMD in single player, and the opposite in multiplayer.
There is a big difference currently in the gameplay experience between NVIDIA GPUs and AMD GPUs in Battlefield Hardline. This comes down to what we have tested as the frame times. For whatever reason, AMD GPUs are doing very poorly in terms of frame time in this game. There are large peaks of longer periods of time between frames, wild inconsistency, and a generally higher frame-time average than with NVIDIA GPUs. This translates into the game feeling choppy even though the frame rates are high on AMD GPUs. It affects every AMD GPU we tested: Radeon R9 290X, Radeon R9 290 and Radeon R9 285. It affects every map we played, and the larger maps like Dust Bowl were the worst. This is a dramatic and devastating detriment to gameplay on AMD GPUs. As it is right now, NVIDIA GPUs simply offer smoother gameplay in Battlefield Hardline. There is one month until the release of this game. AMD right now has the opportunity to improve gameplay on its video cards by the time the game releases. Otherwise, there will be a lot of complaints from AMD video card users.
http://www.hardocp.com/article/2015/02/09/battlefield_hardline_video_card_performance_preview/7#.VNrVdkLFtBw
The guy from the original article you posted is contradicting himself then. "Less impressive than the frame rates are the frame times. These show some anomalies, with both AMD and Nvidia. Interestingly, frame delivery with a GeForce GPU is noticeably more even in singleplayer than with a Radeon; in multiplayer the picture reverses. Here the Nvidia graphics cards show annoying micro-stutter, while the AMD GPUs deliver even pacing. Both our singleplayer and multiplayer benchmarks were recorded while driving a vehicle at high speed and are therefore very streaming-intensive. Apparently there are still some problems here. That is particularly annoying because the micro-stutter with GeForce GPUs already appeared in the last beta phase and must be known to both developer Visceral Games and Nvidia, and yet it was not fixed."
Thanks for the nice review, Hilbert... Quote from the review, last page: "...The new DRM even keeps an eye on CPU changes...". I hope that a CPU OC up or down is not considered a "CPU change" that would trigger the DRM; that would be maximum stupidity...
bigosik: Ok, we get it, AMD runs worse; let them make a driver latency optimization for it. I think you've proven your point a few posts above.
As are all AMD GPUs. AMD/ATi has always been at a disadvantage at release because their drivers always start out very poor. By the time the GPUs perform the way the hardware was intended, they're already obsolete. I would argue that AMD usually releases better hardware than Nvidia, but Nvidia's drivers are definitely superior. On release day (of either the hardware or the games), drivers are what matter most, and that's why Nvidia tends to rank better. I think they just need to rewrite their drivers from the ground up. Anyway, I'll be interested to see Hardline's results if/when they release a Mantle version. I have no intention of getting the game, but it still interests me.
Not really! The GTX 970 is faster in almost every game http://www.hardware.fr/articles/932-21/benchmark-tomb-raider.html ...consumes less power, is much quieter, has GPU PhysX, HBAO+ bits/AA bits etc. Way better card imo.
Hilbert used 15.3 Beta compared to the others' 14.12 (Omega). There should be some improvement, no?
Not really! The GTX 970 is faster in almost every game http://www.hardware.fr/articles/932-21/benchmark-tomb-raider.html ...consumes less power, is much quieter, has GPU PhysX, HBAO+ bits/AA bits etc. Way better card imo.
Dude, we get it, you got your point across. It's not necessarily about the cards; it's about the drivers. We know AMD tends to screw up drivers from time to time.
GTX Titan is listed, but nothing about which version. I have a pair of original Titans and a pair of Titan Blacks. Which are you using? I may get a pair of Titan X cards after they've been tested a while longer and sell off some of my other cards. I can't help but like the extra memory of the Titan X. I can't help but wonder how my ASUS ARES 2 and 3 handle this most modern of games. I'm also wondering what this game has to do with Battlefield. I play the Battlefield 1942 series; the newer ones just plain suck. This game appears to be more about police than the Army, Navy, Air Force and Marines. It looks as if they put the wrong name on the game.
Administrator
... MP too. :puke2: AMD has huge problems with the frame times http://www.hardocp.com/article/2015/02/09/battlefield_hardline_video_card_performance_preview/6#.VRNVGOExmUk The game runs way better on NV cards.
No, they just do not measure with FCAT, meaning the spikes you see are measured internally in the game engine, not what you see with your own eyes. I.e. they measure the rendering engine, not the display output. This is the sole reason FCAT was introduced: it plots what you actually see. http://www.guru3d.com/index.php?ct=articles&action=file&id=15352 The article has been updated with extra UHD and FCAT results, you guys. Indeed a tiny hint more frame-pacing latency in between frames, nothing dramatic though.
No, they just do not measure with FCAT, meaning the spikes you see are measured internally in the game engine, not what you see with your own eyes. I.e. they measure the rendering engine, not the display output. This is the sole reason FCAT was introduced: it plots what you actually see.
Well, that's unfortunate. Sometimes Fraps, the internal engine and FCAT will yield identical results, but of course the proper way to measure frame times is FCAT.
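To illustrate the distinction being argued here, a minimal sketch of why engine-side frame times and FCAT (display-side) can disagree. The numbers and the crude pacing model are entirely hypothetical, not real benchmark data: engine tools timestamp frames at submission, while the driver/GPU can pace delivery before the frames reach the display, smoothing the spikes.

```python
# Hypothetical illustration: in-engine frame times vs. display-side (FCAT-like)
# frame times. Simulated data; the pacing model is a toy, not a real driver.

engine_times_ms = [16, 16, 40, 8, 16, 16, 33, 10, 16, 16]  # spiky at submit

def paced(times, alpha=0.5):
    """Crude frame-pacing model: delivery is pulled toward the average interval."""
    avg = sum(times) / len(times)
    return [alpha * t + (1 - alpha) * avg for t in times]

def spread(times):
    """Worst-case frame-time variation (max minus min), a rough stutter proxy."""
    return max(times) - min(times)

display_times_ms = paced(engine_times_ms)

print(f"engine-side spread:  {spread(engine_times_ms):.1f} ms")   # large spikes
print(f"display-side spread: {spread(display_times_ms):.1f} ms")  # smoothed
```

The point of the sketch: a tool reading the engine's timestamps can report severe spikes even when the paced display output looks much more even, which is why the thread's engine-measured graphs and FCAT plots need not agree.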