Battlefield Hardline VGA graphics performance review

https://forums.guru3d.com/data/avatars/m/130/130124.jpg
I was hoping to see the AMD 290X in 3840x2160 - Ultra - 4xMSAA. Maybe you can add them in the future?
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
I was hoping to see the AMD 290X in 3840x2160 - Ultra - 4xMSAA. Maybe you can add them in the future?
Oh buddy, can you just please read the text in the articles as well?
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
I was hoping to see the AMD 290X in 3840x2160 - Ultra - 4xMSAA. Maybe you can add them in the future?
He's locked out of the game.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
My 280X still sits between 770 and 780, not bad. Let's just hope Witcher 3 will be playable on it.
https://forums.guru3d.com/data/avatars/m/156/156133.jpg
Moderator
I was hoping to see the AMD 290X in 3840x2160 - Ultra - 4xMSAA. Maybe you can add them in the future?
So the first thing that came to your mind is that this Hilbert dude is an Nvidia fanboy, eh? Yeah, well, remember the HardwareID DRM thingy? I am locked out of the game at this point and have not been able to test the AMD Radeon cards in Ultra HD. And that kind of blows, as we even have a Radeon R9 290X with 8 GB of graphics memory in the house that I wanted to pitch in at Ultra HD against the GTX 980 and Titan X. AMD is doing surprisingly well in this game with much cheaper hardware. BTW, look at the performance CRUMBLE in Ultra HD: all of these cards will need to drop down from 4xMSAA to 2xMSAA. But again, I can't show you that due to the lockout. The GTX 780 Ti crumbles because it is a 3 GB VRAM card, and this resolution with our settings eats close to 4 GB of graphics memory. We'll show you that on the next page. Anyway, I'll owe you the Radeon Ultra HD results as well as the FCAT frametime results, as currently my EA accounts are dead in the water due to the DRM HardwareID lockouts.
Here you go.
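To put the ~4 GB figure above into rough perspective, here is a minimal back-of-envelope sketch of just the MSAA render-target memory at 3840x2160. The G-buffer layout (four 32-bit targets plus depth) is an assumption for illustration, not something taken from the article, and the sketch ignores textures, shadow maps and streaming pools, which make up most of the remaining budget.

```cpp
#include <cstdio>

// Back-of-envelope: MSAA render-target memory at 3840x2160.
// Assumption (not from the article): a deferred G-buffer of four 32-bit
// targets plus a 32-bit depth buffer, all allocated at the MSAA sample count.
int main() {
    const double pixels = 3840.0 * 2160.0;
    const double bytesPerPixel = 4.0;   // 32-bit formats
    const int gbufferTargets = 4;       // assumed layout
    const int sampleCounts[] = {1, 2, 4};

    for (int samples : sampleCounts) {
        double perTargetMiB = pixels * bytesPerPixel * samples / (1024.0 * 1024.0);
        double totalMiB = perTargetMiB * (gbufferTargets + 1 /* depth */);
        std::printf("%dx MSAA: %.0f MiB per target, ~%.0f MiB for G-buffer + depth\n",
                    samples, perTargetMiB, totalMiB);
    }
    // At 4xMSAA that is roughly 630 MiB for these targets alone, before any
    // textures or shadow maps -- which is how a 3 GB card ends up short.
    return 0;
}
```

The only point of the sketch is that these fixed per-frame buffers scale linearly with the sample count, so dropping from 4xMSAA to 2xMSAA halves that part of the budget.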
https://forums.guru3d.com/data/avatars/m/248/248203.jpg
The irony is you'd be better off benching with an *evaluation* coughtorrentcough version...
https://forums.guru3d.com/data/avatars/m/130/130124.jpg
Oh buddy, can you just please read the text in the articles as well?
I did, that's why I said maybe in the future, when the EA crap gets fixed, you can add them to the review.
data/avatar/default/avatar40.webp
The 290/290x seem to be getting better with age. 🙂
https://forums.guru3d.com/data/avatars/m/260/260828.jpg
Does this game have Mantle compatibility? Sorry for my English.
https://forums.guru3d.com/data/avatars/m/251/251033.jpg
Does this game have Mantle compatibility? Sorry for my English.
Yes, Mantle is supported by Hardline. ...Buying a 290X was the best decision I made in 2013!
https://forums.guru3d.com/data/avatars/m/224/224796.jpg
Thanks for the review Hilbert! *EA stinks like a strip club after Mardi Gras
https://forums.guru3d.com/data/avatars/m/201/201426.jpg
The 290/290x seem to be getting better with age. 🙂
I'm very happy to see how the 290 did. Though I only purchased my card late last year, it's already proven to still be quite strong. I wanted to switch around this time; my last two setups were AMD, and I usually switch between both sides every couple of cards. But there was no way in hell I was spending $300 for a GTX 780 3GB when I got my R9 290 4GB OC for $242 with shipping.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
The 290/290x seem to be getting better with age. 🙂
As are all AMD GPUs. AMD/ATi has always been at a disadvantage at release because their drivers always start out very poor. By the time the GPUs perform the way the hardware was intended, they're already obsolete. I would argue that AMD usually releases better hardware than Nvidia, but Nvidia's drivers are definitely superior. On release day (of either the hardware or the games), drivers are what matter most, and that's why Nvidia tends to rank better. I think they just need to rewrite their drivers from the ground up. Anyway, I'll be interested to see Hardline's results if/when they release a Mantle version. I have no intention of getting the game, but it still interests me.
https://forums.guru3d.com/data/avatars/m/209/209401.jpg
Thanks for the review Hilbert! *EA stinks like a strip club after Mardi Gras
I don't know the smell...
https://forums.guru3d.com/data/avatars/m/115/115616.jpg
Thanks for the benchmarks, Hilbert! Do you think it still makes sense to use MSAA at UHD resolution? I was expecting better 970 and 980 performance compared to the 290(X), and it turns out that the AMD cards handle this game nicely. Also, this DRM sucks. I'm glad I didn't buy this game as a preorder. Now I can vote with my wallet.
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
Thanks for the benchmarks, Hilbert! Do you think it still makes sense to use MSAA at UHD resolution?
No, I wanted to test some other AA possibilities, e.g. a lower MSAA value (2x) and, for Nvidia, MFAA. However ... I can't start up the game and test/continue with this article until the DRM protection gives me permission to access the game again.
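As a side note, if you want to see which MSAA sample counts your own GPU/driver combination reports as supported, a minimal D3D11 query looks roughly like the sketch below. It is purely illustrative: it says nothing about what Hardline itself exposes, and driver-side modes such as Nvidia's MFAA are toggled in the driver control panel rather than reported through this API.

```cpp
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

// Ask the D3D11 driver which MSAA sample counts it supports for a
// typical back-buffer format.
int main() {
    ID3D11Device* device = nullptr;
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   nullptr, 0, D3D11_SDK_VERSION,
                                   &device, nullptr, nullptr);
    if (FAILED(hr)) return 1;

    for (UINT samples = 1; samples <= D3D11_MAX_MULTISAMPLE_SAMPLE_COUNT; samples *= 2) {
        UINT quality = 0;
        if (SUCCEEDED(device->CheckMultisampleQualityLevels(
                DXGI_FORMAT_R8G8B8A8_UNORM, samples, &quality)) && quality > 0) {
            std::printf("%ux MSAA supported (%u quality levels)\n", samples, quality);
        }
    }
    device->Release();
    return 0;
}
```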
https://forums.guru3d.com/data/avatars/m/115/115616.jpg
No, I wanted to test some other AA possibilities, e.g. a lower MSAA value (2x) and, for Nvidia, MFAA. However ... I can't start up the game and test/continue with this article until the DRM protection gives me permission to access the game again.
Thanks again. I really get your frustration and I hope they'll resolve this issue.
https://forums.guru3d.com/data/avatars/m/260/260886.jpg
Nice review, Hilbert. One question: what about the 295X, or CrossFire/SLI?
data/avatar/default/avatar35.webp
This game seems to use a lot of screen space reflection. Does anyone know if the SSR covers dynamic objects as well as static ones? I want to play the SP campaign, but I don't want to pay full price for the game as I have no interest in MP.
data/avatar/default/avatar15.webp
The 290/290x seem to be getting better with age. 🙂
I'm certain Hawaii's compute performance is why it's doing so well in this game. Kepler was relatively much weaker in compute, whilst Maxwell improved compute performance for Nvidia by a large margin. The Frostbite 3 engine uses lots of compute shaders for all sorts of things to speed up rendering.
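For what it's worth, the raw FP32 numbers from the public reference specs point the same way, though peak FLOPS say nothing about scheduling or occupancy, which is where Kepler's compute shortfall really shows. A quick sketch using reference clocks only (retail boost clocks vary):

```cpp
#include <cstdio>

// Rough peak FP32 throughput from public reference specs:
// shader count x 2 FLOPs per clock x clock speed.
int main() {
    struct Gpu { const char* name; int shaders; double clockGHz; };
    const Gpu gpus[] = {
        {"R9 290X (Hawaii)",  2816, 1.000},
        {"GTX 780 (Kepler)",  2304, 0.863},
        {"GTX 980 (Maxwell)", 2048, 1.126},
    };
    for (const Gpu& g : gpus) {
        double tflops = g.shaders * 2.0 * g.clockGHz / 1000.0;
        std::printf("%-20s ~%.1f TFLOPS FP32\n", g.name, tflops);
    }
    return 0;
}
```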