Exploring ATI Image Quality Optimizations


A quick word ...


A somewhat heated topic amongst graphics card manufacturers is how to get as much performance as possible out of a graphics card with as little image quality loss as possible. In the past both ATI and NVIDIA have been found guilty of cheating in applications to gain better performance. Both companies promised from then on to stay away from cheats.

The topic of discussion for many weeks now is that AMD has been applying a series of optimizations that could easily be seen, and explained, as a cheat.

With the Radeon HD 6000 series release the Catalyst drivers (Catalyst AI) have a new setting which allows control over texture filtering, with options for 'High Quality', 'Quality' and 'Performance'.

High Quality turns off all optimizations and lets the software run exactly as it was originally intended. Quality, which is now the default setting, applies some optimizations that AMD believes remain objective and keep the integrity of the image quality at a high level while gaining some performance. The last setting is Performance, which applies supplementary optimizations to gain even more performance.


So what's the problem?

Well, that's simple; both NVIDIA and AMD should deliver (at default driver settings) the very same image quality. The problem is that with the ATI Catalyst 'Quality' setting, which is now the default, ATI applies a performance enhancement in texture filtering, amongst other things.

The optimization can be seen; it really is visible image quality degradation. But here's the dilemma: it remains difficult to spot if you do not know exactly what you are looking for. Had this question never been raised, 99.99% of you would never have noticed it.

The optimization, however, allows ATI to gain 6% up to 10% in performance at very little image quality cost. And I choose the words 'very little' here very deliberately. The bigger issue and topic at hand is that, no matter what, image quality should be 100% the same between the graphics card vendors for the sake of objective comparison.

Now, there's more going on than the two primary optimizations you guys know of, but this is the root of the discussion: the Quality setting enables a trilinear optimization as well as an anisotropic optimization, both designed to have no visible impact on image quality while offering improved performance.
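AMD does not publish the exact heuristics behind these optimizations, but the trilinear shortcut is commonly understood to work along the lines of so-called 'brilinear' filtering: instead of always blending the two nearest mip levels, the driver only blends in a narrow band around the mip transition and otherwise samples a single mip level. The sketch below is purely illustrative; the band width and LOD values are our assumptions, not AMD's actual numbers.

```python
# Illustrative model only: AMD's real heuristics are not public. The 'band'
# parameter and the LOD values below are assumptions for demonstration.

def trilinear_blend(lod):
    """Full trilinear: always blend the two nearest mip levels.
    Returns the weight of the higher (blurrier) mip level."""
    return lod - int(lod)  # fractional part of the level-of-detail

def reduced_trilinear_blend(lod, band=0.3):
    """'Brilinear'-style shortcut: blend only in a narrow band around the
    mip transition, otherwise fetch a single mip level (fewer texture
    samples, but the hard cut-offs can show up as banding)."""
    frac = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac <= lo:
        return 0.0                 # lower mip only
    if frac >= hi:
        return 1.0                 # higher mip only
    return (frac - lo) / band      # compressed blend zone

if __name__ == "__main__":
    for lod in (2.1, 2.4, 2.5, 2.6, 2.9):
        print(f"LOD {lod}: trilinear {trilinear_blend(lod):.2f}, "
              f"reduced {reduced_trilinear_blend(lod):.2f}")
```

The visible result of a shortcut like this is exactly the kind of banding discussed below: the blend zone becomes narrow enough that the transition between mip levels is no longer perfectly smooth.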

So what does that optimization look like in real-world gaming? Well, it's really hard to find actually. In environments with a lot of depth, say a road with a textured grid at the end of it, you would be able to spot some anomalies, but only if you look really carefully and with the scene properly lit.

[Screenshot courtesy of 3DCenter.org: texture filter quality comparison, NVIDIA (left) versus Radeon HD 5000 series (right)]

The above screenshot is courtesy of 3DCenter.org and their excellent article; it shows the complexity and the optimizations really well. We seriously had a hard time finding an application where the optimizations show clearly. So mind you, the above example is tailored to show the image quality anomaly.

To your left you can see NVIDIA at work, to the right an ATI Radeon 5000 series card; look very closely at the depth texture of the road. To be able to spot the optimization you need simplicity in the scene and a surface area like the one shown in the screenshot.

You can see that the right image from the Radeon 5000 card shows more degradation, namely the banding on the grid. Obviously, in motion you can see the effect even better. But the above is a perfect example of what the optimization is all about image quality wise. So the question here is: is it considered a cheat to apply these optimizations, or is this something you can live with?

Download

But let's take a more real-world example then. Please download all three images and load them up in, say, Photoshop. Make sure you zoom in on all screenshots at 100% and then start to compare the three images (each a lossless 12 MB BMP file).

This is Mass Effect, with 16xAF and trilinear filtering enabled in the configuration file. We are in the spaceship and have positioned ourselves facing a surface area where the optimization should show really well. Yet we have a more complex scene with nice textures and lots of colors, much less bland than the simple example shown previously. This time it's just a Radeon HD 6850 with the optimization on and off, and a GeForce GTX 580.

We have as hard a time spotting differences as you do, and while making the screenshots we increased the gamma setting to 50% and applied a resolution of 2560x1600 to try to make it more visible.

Do you spot the difference? Probably not, and that is the rule we live by here at Guru3D: if you cannot see it without blowing up the image or altering gamma settings and whatnot, it's not a cheat. And sure, we know... this game title is not a perfect example, but it is a good real-world example.
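If you would rather not judge the screenshots by eye, a short script can count the pixels that actually differ between two of them. This is just a minimal sketch; the file names are placeholders for wherever you saved the downloaded BMPs, and it assumes the Pillow imaging library is installed.

```python
# Compare two of the downloaded screenshots pixel by pixel.
# File names are placeholders; requires Pillow (pip install pillow).
from PIL import Image, ImageChops

a = Image.open("hd6850_quality.bmp").convert("RGB")        # optimization on (placeholder name)
b = Image.open("hd6850_high_quality.bmp").convert("RGB")   # optimization off (placeholder name)

diff = ImageChops.difference(a, b)
pixels = list(diff.getdata())
changed = sum(1 for px in pixels if any(channel > 8 for channel in px))  # small tolerance for noise

print(f"{changed} of {len(pixels)} pixels differ ({100.0 * changed / len(pixels):.2f}%)")
# diff.save("difference.png")  # optionally inspect the difference image itself
```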

What about the performance hit?

What kind of performance gain does the optimization deliver? That we can measure more easily; let's run Far Cry 2 with the Quality setting enabled and disabled (High Quality, no optimizations):

  • Level: Ranch Small
  • High-quality DX10 mode
  • 8x AA (Anti-Aliasing)
  • 16x AF (Anisotropic Filtering).

So here we have Far Cry 2, running 8xAA and 16xAF in DX10 mode. The overall performance difference is roughly 8%, maybe 9%, and that is an ample difference in the graphics arena alright.
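For clarity, the percentage we quote is simply the relative difference between the High Quality run and the default Quality run. With hypothetical frame rates (the real figures depend on the card and the scene) the calculation looks like this:

```python
# Hypothetical frame rates, purely to show how the percentage is derived.
fps_high_quality = 60.0  # optimizations off (assumed figure)
fps_quality = 65.0       # 'Quality' default, optimizations on (assumed figure)

gain = (fps_quality - fps_high_quality) / fps_high_quality * 100.0
print(f"Performance gain from the optimization: {gain:.1f}%")  # roughly 8% with these numbers
```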

But let's take another title, DiRT 2:

Image quality settings:

  • Baja Iron Route 1
  • 8x Anti-Aliasing
  • 16x Anisotropic Filtering
  • All settings maxed out

Again we are seeing similar behavior and observe a performance benefit hovering around 8%.

What does this mean for Guru3D.com?

We will uphold the policy that we test games and benchmarks at the very same driver settings, meaning we leave driver settings at default and nothing else. We never deviate from that, nor are we willing to compensate and adapt to different preferences in drivers. Why? Because it is the sole responsibility of parties like ATI and NVIDIA to offer you objectively comparable quality settings.

If a graphics card manufacturer chooses to compromise on image quality then that is that company's sole decision, and they will have to put their product into retail with the knowledge that end-users KNOW they compromise on image quality. In the end that's going to haunt, taunt and blow up in ATI's or NVIDIA's face without a doubt. Take NVIDIA's 3DMark 03 cheat, for example; it haunted them for years and is still the first thing people think of when we touch the topic of optimizations.

Compromising on image quality will cost the manufacturer business, as end-users want the best product. Especially in the high-end performance graphics segment, people really care about image quality.

For weeks now here at Guru3D we have been discussing and debating whether or not to disable the ATI optimization manually. But that one question remains: is it a valid optimization or an unacceptable cheat? Well, we conclude that it's not a cheat... but it remains an optimization, trivial as it may be, that can be seen if you know how to look, seek and find it. And that does not sit right with us.

We urge and recommend AMD/ATI to disable the optimization by default in future driver releases and deal with the performance loss. In the end everything is about objectivity, and losing consumer trust, which (little as it is) has already been endangered, is going to do more harm than good. A drop of 3-4 FPS on average is far more acceptable than getting a reputation as a company that compromises on image quality. And sure, it raises other questions: does ATI compromise on other things as well? See, the costs already outweigh the benefits.

So the morally right thing for AMD/ATI to do is to make the High Quality setting the standard default. But again, we have to acknowledge that this remains a series of optimizations that is hard to recognize and detect; it is there, however, and it can be detected.

That's it for this all-too-short article. It is silly season and it's busy as heck; I would have liked to spend more time on this matter with more examples, but time does not allow me to do so. We published this quick one-page article to explain where we stand, but most of all to help you understand what the ATI optimization looks like and how you can recognize it.

With that said, I would like to make this plea: it would be very wise to see the graphics industry move to a gentlemen's agreement where visible optimizations are simply not a default setting.

We'd love to hear from you in our forums about how much the image quality impact matters to you. What do you think: a subjective choice from ATI, or a valid optimization?
