On the following two pages I want to show you a couple of in-game screenshots and focus on the actual in-game quality settings that we use and that you can change. We always take a peek at the quality differences between AMD and NVIDIA, e.g. between AMD Radeon and NVIDIA GeForce graphics cards. Honestly, image quality these days is near equal for both companies. You can zoom in and blow up the still screenshots and detect minor differences if you look with a very professional eye, as yes -- both companies do optimize games for better performance. We'll leave that as is, as there is much more to see quality-wise with the in-game modes.
Why should you change or alter the internal visual quality modes? Well, for one simple reason: performance. If you have a lower-grade graphics card and would like to play in Full HD, then chances are pretty slim that your graphics card will be fast enough. Throttling down to a lower image quality mode can significantly boost your performance, at the cost of visual quality though. But it creates a good balance between what your graphics card can render and what you find acceptable for your average frames per second.
COD-BO2 has many modes, settings and preferences you can choose from. The thing is, you should easily be able to run this game at even the highest settings. In our example below we will use a relatively simple GeForce GTX 660 with a wide variety of AA modes.
First things first: our default test mode will use the very best quality in textures, shadows and terrain. Pretty much any modern DX11 card can run the game at 1920x1080/1200, as you are about to find out.
I'll even stretch that thought; look what happens with a very generic DX11-class GeForce GTX 660 in the chart below:
So as you can see, if we take a GeForce GTX 660 (not the Ti model, the regular one) we can enable all quality settings to maximum with 8xMSAA and FXAA enabled. It still renders 98 FPS on average at 1920x1200.
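As an aside, the "average FPS" figure in a chart like this is simply the total number of rendered frames divided by the total run time. A minimal sketch of that arithmetic (the frame times below are made up for illustration, not actual GTX 660 measurements):

```python
def average_fps(frame_times_ms):
    """Average FPS over a benchmark run: frames rendered / seconds elapsed."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Hypothetical capture: four frames rendered in roughly 10 ms each.
print(round(average_fps([10.0, 10.5, 9.5, 10.0]), 1))  # prints 100.0
```

Note that an average hides stutter; two runs with the same average FPS can feel very different if one has occasional long frames.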
If you lack performance, however, dialing down the image quality modes does offer heaps of extra performance, at the cost of quality of course. Truth be told, you shouldn't need to do that.
We test with our image quality settings set at 4xMSAA / FXAA enabled. Above you can see the output quality.
Now, overall, compared to other games I wish the visual experience had been greater with the new DX11 engine. MOH with the new Frostbite engine simply looks so much better. But Activision's, or better yet Treyarch's, philosophy has always been to push for a 60 FPS experience, and that always comes at the cost of detail and graphics quality. Still -- the engine and output quality definitely improved over the previous versions, but we had hoped for a little more.
Call Of Duty Black Ops II VGA Graphics performance review -- Call Of Duty Black Ops II benchmark performance with 21 graphics cards. The new and massively popular Activision title is a great looking game, but how will it perform on a selection of different graphics cards? This review will cover all these basics and more.