Dragon Age: Inquisition VGA graphics performance review


Seeing as Mantle is designed to reduce CPU overhead and isn't necessarily targeted toward improving GPU performance, I can't say the results are that surprising. Run this on a crappy CPU like an A6 and I'm sure you'll see some good results. Anyway, I likely won't be getting this game. Dragon Age: Origins was very disappointing - I'm glad I got it free on Origin. It basically felt like a severely dumbed-down version of Neverwinter Nights and pretended to have a much more interesting story than it really had. I never even bothered to complete the game because it felt like a chore.
Yep, those results look spot on. I've actually been wondering about performance myself, as I've been noticing the odd dip to 45 fps or so... Might try lowering from Ultra... (needs another 970).
Wait, what? A GTX 970 at full HD Ultra quality only gets 50 fps? I don't get it. What's going on with games these days? I just ordered a GTX 970 and I'm already disappointed; this is quite a high-end card. My 560 ran Deus Ex: Human Revolution at 60 fps. And that was a 250-dollar card. I bought this rig exactly when DXHR came out. I was expecting at least 70 fps from current games at Full HD from cards around 970 performance. The 970 also barely runs AC: Unity at 40 fps. Seriously, do I have mental issues, or are games starting to trash single-card setups?
Mini-rant follows...;) Your hardware is fine, but... What you're seeing is what I call "console-rationalization" on the part of software developers these days... or, better yet, "consolitis"...;) Some developers & publishers have really come down with a bad case of it. Everywhere we're seeing artificial 30 fps/60 fps limits set in the game software, so that even if you want to see what sort of frame rates your spiffy new card is capable of, you *can't.* At least, you can't without a bit of investigation and some configuration work (and then maybe you can).

Blame it on the consoles--whether it's the lousy Xbone's GPU that struggles for 1080p in 2014, or the PS4, which is still slower than my 1GHz 2GB HD 7850 and hobbled by a shared system/VRAM memory bus--that's the problem. Years ago, only Carmack at id liked to throw in artificial frame-rate limiters to cover over some poor engine design or other--including but not limited to vsync issues--but with these newer consoles things are really getting out of hand.

What you are seeing is, imo, a deliberate attempt on the part of developers and publishers to blur the lines as much as possible between wide-open, free and unfettered PC gaming software that can make use of the hardware you throw at it, and these bottom-feeding consoles. The publishers and developers (mainly Microsoft & Sony) are *paying money* (just my guess) to get new releases *dumbed down* or hobbled or whatever you want to call it--just so the public won't immediately see how much more gaming power they can buy in a computer over either console for just a wee bit more cash up front. Now that both consoles are literally low-end x86 gaming PCs--only closed off and proprietary, too--it has become especially important to publishers & developers like Microsoft & Sony to keep the general market as ignorant of these things for as long as possible. Scream loud & long enough at your favorite developers and I think they will listen... eventually...;) Meantime, fire up some games developed from a time when frame rate was king and watch those fps meters spin...!

Edit: One of the first things anyone should do with a new 3D card that is purportedly much more powerful than his old one is (a) crank up the resolution, if possible, and (b) lay on the FSAA & eye candy. That's where the power of a new GPU will really shine through... If you happen to be running a CPU-limited game, leave the eye candy where it used to be on the old card and run at the same resolution, but expect to see a dramatic frame-rate improvement, you are most likely going to be really disappointed. The new card should allow you to run smoothly at resolutions and with layers of eye candy that your old card *choked* on... that's really how you will see a big difference, imo.
Seeing as Mantle is designed to reduce CPU overhead and isn't necessarily targeted toward improving GPU performance, I can't say the results are that surprising. Run this on a crappy CPU like an A6 and I'm sure you'll see some good results. Anyway, I likely won't be getting this game. Dragon Age: Origins was very disappointing - I'm glad I got it free on Origin. It basically felt like a severely dumbed-down version of Neverwinter Nights and pretended to have a much more interesting story than it really had. I never even bothered to complete the game because it felt like a chore.
And it has the best story and atmosphere of the DA games, so yeah 😀
Bad coding and bad optimization, that's all; lowering some settings will gain you good fps. Most games have settings that take fps and do barely anything. The 2005-2006 era is gone, when a high-end GTX could reach 100 fps in all games maxed out, even the most demanding. I reckon all the GPUs we pay for are overpriced, way overpriced.
Yeah, they sure did, at a resolution of 800x600. Nope, they weren't that fast when gaming at 1600x1200+; it was around the same as now, really, we are just gaming at way higher resolution and image quality... While the 7900 GTX and ATI X1950 XTX were good cards, they did not play nearly everything even at 60 fps... The 7800 GTX barely managed 60 fps in Doom 3 with AA and AF, and Doom 3 had been out for a year. They were nowhere near 60 fps at 1920x1200 in the demanding games that came out in 2006; in some you barely broke 30. So nope, never heard of such an era 😀 Nostalgia is an interesting thing.
Even the "Super" era of the 8800 GTX/Ultra could not max out the original Crysis, so yeah, I don't remember it either.
Crysis is not a real example. I do remember that at HD resolutions and 1600x1200, high-end cards shone more than they do today, but that could be attributed to the fact that games were better optimized than today, when most of them are broken on day 1.
That is like saying AC Unity is not a real example, or that DA3 is not a real example. The cards failed to run many games at a steady 60 fps back then; the norm was more or less around 40-50 fps for the high-end cards. Some games, like Crysis, took a long time to reach 60 fps, being a bit ahead of their time, with the STALKER series being the most notorious. Though I'll give you that AC Unity is the crappiest thing ever on the performance side, with really bad coding.
MSAA on top of Post AA for that "cinematic vaseline effect".
Nice article, HH. So this is the second game where we see this mixed-up performance: the 280X as fast as a 780, the 290X faster than a 780 Ti, and that poor 680/770 left behind. What's happening with Kepler cards and their drivers? Did they do that on purpose, lol?
Yeah, suddenly the 780 Ti is lacking; before the 980 was released it was the top dog.
NVIDIA messed up the driver big time; they should fix it in the next one. Many people are spreading rumors that it was done on "purpose".
Wait, what? A GTX 970 at full HD Ultra quality only gets 50 fps? I don't get it. What's going on with games these days? I just ordered a GTX 970 and I'm already disappointed; this is quite a high-end card. My 560 ran Deus Ex: Human Revolution at 60 fps. And that was a 250-dollar card. I bought this rig exactly when DXHR came out. I was expecting at least 70 fps from current games at Full HD from cards around 970 performance. The 970 also barely runs AC: Unity at 40 fps. Seriously, do I have mental issues, or are games starting to trash single-card setups?
Nah, I like what I'm seeing here. The reason is that I think you're actually getting something for your money. Look at something like AC3: you need a mighty system to run it at all, and it's because of terrible release quality. That's entirely the wrong sort of taking advantage of modern hardware - "Hey, people have 290s these days, let's not bother optimizing." Whereas what I have seen of this game is "Hey, let's write something that really makes use of that hardware." You can still run DA:I on lesser hardware just fine and it looks alright. AC3 apparently melts anything less than cutting-edge GPUs. I think people have just got used to "Ultra" meaning normal and expect to run things at it. Bioware seems to be treating High as *actually* meaning High, and Ultra as really being Ultra.
You advised putting in the 60 fps cutscene hack; just a note, my game has had two separate problems with that enabled while trying to load cutscenes at the start of missions, meaning they would never load. I had to remove it to resolve the issue.
Even the "Super" era of the 8800 GTX/Ultra could not max out the original Crysis, so yeah, I don't remember it either.
Sure they could, it just didn't run as expected. =] I can't wait for this game!!! Just gotta wait a week...
most games have settings that take fps and do barely anything.
I'm thinking that this is the situation here with DA3, because those numbers are... very wtf. Full-scene reflections or shadows... or something! It would be surprising if it's a driver problem. I'll find out soon enough I guess, once I get my damn monitor back (being replaced) lol.
For texture quality and the VRAM usage, I see you're using Ultra quality, but it actually goes up one step higher to something Bioware has decided to call "Fade Touched" (which is sort of an in-game term for being crazy). I guess it doesn't make much of a difference in framerate, besides GPUs with lower VRAM amounts perhaps running into some stuttering. 🙂 (I guess it's like, say, Far Cry 4, where you can select the Ultra quality preset but then some settings - god-rays, ambient occlusion - can be pushed higher still.) Nice test. 🙂
Yeah, suddenly the 780 Ti is lacking; before the 980 was released it was the top dog.
It still is with a mini OC; my factory-OC'ed GTX 780 is like Titan perf., so ~50-51 fps @ 1080p, and that's just 4 fps lower than a GTX 980, nothing special 😀 Could get more than that with a 1200 MHz OC for sure 😯 As for everyone wondering why it runs like that: overdone DOF, blur, shadows and probably some unwanted fx to deliberately cripple the GPU, nothing new... Check Dead Rising 3's DOF cutscenes (uber killer DOF), same with Metro 2033 back then, Crysis 1 shading @ v.high, or now Crysis 3 @ v.high shading, etc.
All I know is: SLI 970s @ 1440p @ 100 Hz, all settings at Ultra, no AA - SLI performance fluctuates wildly, FPS ranges between 40-100 (wtf), and non-SLI, though much more stable, gets a paltry 35-60. Cutscenes are the largest offender, and they stutter regardless of SLI being on or off. I've managed to minimize it somewhat, but it still makes the game virtually unplayable since it causes the audio to skip as well. My GPU usage during the cutscenes looks like a heartbeat, up and down literally every second. Believe me when I say I've tried adjusting settings, adding a user.cfg, changing drivers, you name it. I have played barely an hour of this game as I refuse to play it in its current condition and have been patiently waiting on a patch and a new driver, as I am 100% certain that the problem is not with my hardware. What's funny, though, is reading and participating on the Bioware forums, where you have a legion of fanboys who refuse to believe that it could be the game and insist that "you must be doing something wrong since it plays great for me". Also hilarious is reading users who are happy with a "nice smooth 30-40 fps", just lol.
For the RPG gamer/aficionado, this is a good title my man.
I loved Dragon Age, still have to beat it, and have yet to get 2, but I'll most definitely get Inquisition. Whether I get it for the PC or for the PS4, if and when I get one, is another question. On the bright side, there are actual results for 1080p, and results for more than just 4 cards, and I didn't have to go digging to find said results like usual.
It still is with a mini OC; my factory-OC'ed GTX 780 is like Titan perf., so ~50-51 fps @ 1080p, and that's just 4 fps lower than a GTX 980, nothing special 😀 Could get more than that with a 1200 MHz OC for sure 😯 As for everyone wondering why it runs like that: overdone DOF, blur, shadows and probably some unwanted fx to deliberately cripple the GPU, nothing new... Check Dead Rising 3's DOF cutscenes (uber killer DOF), same with Metro 2033 back then, Crysis 1 shading @ v.high, or now Crysis 3 @ v.high shading, etc.
You left out ubersampling in The Witcher 2.
I've got a tip for people applying the 30 fps fix. If you do not intend to play MP and want to, for example, play SP first, you can do the fix through advanced commands in Origin, so you do not need to change the shortcut, or even run the shortcut. Go to Origin > Game Properties > "Command line arguments" and apply the review's fix there, by only adding the two "-" commands (see the example of the field contents right below). And while you're at it, disable Origin in-game (the option is right beneath the command line arguments) for another 4-5 fps improvement on low-VRAM GPUs. I'm waiting to play this game until my DualShock 4 controller arrives tomorrow; I tried playing with mouse & keyboard, but I just can't get used to it, mainly because I played DA1 and 2 with a controller (a Logitech Fxxx in those days).
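For anyone unsure what that field should contain: it takes the same dash-prefixed switches you would otherwise append to the shortcut target, separated by spaces, and Origin hands them to the game's executable at launch. Purely as an illustration of the format (GameTime.MaxVariableFps and GameTime.ForceSimRate are Frostbite console variables commonly used for frame-cap tweaks; treat the line below as a sketch and copy the exact two switches and values from the article itself):

-GameTime.MaxVariableFps 60 -GameTime.ForceSimRate 60

Deleting the line from Game Properties later reverts everything, which is the nice part about doing it there instead of editing the shortcut.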