Rumor: AMD FidelityFX Super Resolution 2.0 is scheduled to be introduced

I hope upscaling becomes part of the DirectX/Vulkan standard, so that it's up to the GPU how it handles it. That would be a win-win scenario for consumers.
I always find that "better than native" claim hilarious, but of course, since what looks good is psychological, a matter of taste, it's not even automatically incorrect. Cameras (smartphone camera software) can have all kinds of filters that supposedly make an image better. Some simple things in Photoshop can potentially make images look better to human eyes. However, the fundamental problem is that game video is artificially created or reproduced (for example, prerendered or recorded cutscene footage) in the first place. So, if a graphics card driver can make it look better, why didn't the game itself already make it look like that? And that's exactly where the problem lies, in my opinion: if the driver makes the game look significantly/fundamentally different, is it not different from what the game studio intended? Of course, an individual gamer's screen might benefit from individual adjustment, which is not something game devs could easily handle, but automatic settings in a graphics card driver are no wiser, either.
Kaarme:

I always find that "better than native" claim hilarious, but of course, since what looks good is psychological, a matter of taste, it's not even automatically incorrect. Cameras (smartphone camera software) can have all kinds of filters that supposedly make an image better. Some simple things in Photoshop can potentially make images look better to human eyes. However, the fundamental problem is that game video is artificially created or reproduced (for example, prerendered or recorded cutscene footage) in the first place. So, if a graphics card driver can make it look better, why didn't the game itself already make it look like that? And that's exactly where the problem lies, in my opinion: if the driver makes the game look significantly/fundamentally different, is it not different from what the game studio intended? Of course, an individual gamer's screen might benefit from individual adjustment, which is not something game devs could easily handle, but automatic settings in a graphics card driver are no wiser, either.
Seems you are missing the point.
Yeah, like MGS V on PC. The ultra DOF setting (I think it was in the post-processing settings?) was not the one Kojima intended for the game and ruined most of the cinematic shots. You had to set it to high and everything else to ultra. At the time, Nvidia advertised it as the better option, but seriously, it was not.
Digilator:

Seems you are missing the point.
That it's just advertising speak? Perhaps. Maybe it's because I'm from a culture where ads have traditionally been fact-based, instead of being based on outrageous claims and superlatives.
Kaarme:

That it's just advertising speak? Perhaps. Maybe it's because I'm from a culture where ads have traditionally been fact-based, instead of being based on outrageous claims and superlatives.
On your initial point, the makers of high-res mods, remasters, upscalers, SweetFX, Reshade and so on would like a word 😛 Though I do agree that changing the initial vision of the game to look different than the developers intended is no longer delivering the intended look. However, it seems that throughout the general course of development we have strived for sharper-looking graphics: higher resolutions, higher antialiasing counts, fewer jaggies and so on, as much as possible. This is a step along that path that lets developers utilize it.
Kaarme:

That it's just advertising speak? Perhaps. Maybe it's because I'm from a culture where ads have traditionally been fact-based, instead of being based on outrageous claims and superlatives.
I believe you are right.... If it were true.. in a month at least ....
"We shall see what we shall see"
Digilator:

Seems you are missing the point.
What you really meant was that he is missing the pixels.
Hilbert Hagedoorn:

AMD even claims that it can be better than native.
lol... I hope this guy doesn't believe this 🙂
A lot of factors contribute to what people end up with on their screens, including the screens. Images and videos get manipulated to suit individual tastes every day. As do games. Anything that makes the overall gaming experience better for a player is a good thing.
An FSR driver was expected at some point in spring - is this the announcement we are waiting for, or will the Super Resolution that was demonstrated/promised at the driver level be a whole different thing compared to this?
cryohellinc:

An FSR driver was expected at some point in spring - is this the announcement we are waiting for, or will the Super Resolution that was demonstrated/promised at the driver level be a whole different thing compared to this?
RSR is based on FSR 1.0, so this is something different.
AlmondMan:

On your initial point, the makers of high-res mods, remasters, upscalers, SweetFX, Reshade and so on would like a word 😛 Though I do agree that changing the initial vision of the game to look different than the developers intended is no longer delivering the intended look. However, it seems that throughout the general course of development we have strived for sharper-looking graphics: higher resolutions, higher antialiasing counts, fewer jaggies and so on, as much as possible. This is a step along that path that lets developers utilize it.
It just means that game studios and devs have been doing something wrong for 25 years if a simple (relatively speaking), universal upscaler can make games look better than native. As for per-game modifications, they have their value for those who aim to change the atmosphere by making the game look more pleasing in their own eyes (others might not like it, though). High-res mods don't necessarily mean someone simply ran the old textures through an upscaler; sometimes dedicated modders have actually recreated the textures at higher resolution. But even upscaling the old textures might make a difference if the algorithms are better than the ones in the game engine/graphics card.
AMD FSR 2.0 ‘next-level temporal upscaling’ officially launches Q2 2022, RSR launches March 17th

The information in this post is based on official slides, hence we will only report on what is confirmed thus far. AMD is announcing its FSR 2.0 technology on March 17th. This announcement comes 6 days ahead of the official GDC 2022 showcase where the next-generation upscaling technology will be discussed. As it turns out, the tip we received a few days ago about new upscaling tech in Deathloop is actually FSR 2.0. This title is specifically mentioned in the official slides, with a few side-by-side comparisons that we cannot share yet. https://videocardz.com/ezoimgfmt/cdn.videocardz.com/1/2022/03/AMD-FSR-2.0-Features-768x289.png?ezimgfmt=rs:768x289/rscb1/ng:webp/ngcb1

Temporal data and anti-aliasing

FSR 2.0 will offer better image quality at all presets and resolutions, but AMD does not confirm anything about performance benefits. The technology will be based on temporal data, and it will feature optimized anti-aliasing. In this regard, it will be a proper competitor for NVIDIA DLSS 2.0.

No ML cores required, but no confirmation on supported GPUs

AMD confirms FSR 2.0 will not require dedicated machine learning hardware. However, AMD does not mention which GPUs will be supported. Instead, they confirm it will boost frame rate in supported games across ‘a wide range of products and platforms, both AMD and competitors’. The footnotes attached to this sentence do not confirm support for non-AMD hardware.

So what is confirmed: FSR 2.0 uses temporal data, has built-in antialiasing, and will offer higher image quality than FSR. What we can’t confirm yet: open-source code and support for non-AMD hardware. The slides are only a teaser of the GDC 2022 session, so the vast majority of the news will be presented on March 23rd, it seems. With Intel XeSS going open source, it is hard to imagine the FSR 2.0 code staying locked, though.

AMD Radeon Super Resolution on March 17th

Furthermore, AMD confirms it will finally launch RSR on March 17th; this is the official date for the new driver to launch. RSR is basically FSR, but working at the driver level for all games. There are two things to consider, though: it only works on Radeon RX 5000+ GPUs, and it will offer lower quality than FSR because the algorithm upscales the whole frame (including the user interface and menus).
Even though I know better... "temporal data" reminds me of a two-part Star Trek: The Next Generation episode.
I wasn't expecting AMD to release FSR 2.0 so soon. They are basically doing the presentation and releasing it days after. I wonder which games they are going to showcase; Deathloop seems to be one of them.

It seems AMD's FSR 2.0 is going to be similar to Epic's TAAU. Here is a comparison I made in Supraland. It uses UE 4.26, so I enabled support for TAAU + TAA Gen5. I also updated DLSS to 2.3.7. I normalized screen percentage to frame rate. This means DLSS Quality is running at 66%, as usual, but TAAU + TAA Gen5 is running at 70%. So in these screenshots, with both solutions, the game was running at 132 fps.

Here is the comparison in Juxtapose: https://cdn.knightlab.com/libs/juxtapose/latest/embed/index.html?uid=59392fb2-a38b-11ec-b5bb-6595d9b17862

DLSS Quality: https://live.staticflickr.com/65535/51938182655_a128681903_o_d.png

TAAU+TAA Gen5: https://live.staticflickr.com/65535/51937885334_7091774712_o_d.png
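For anyone wanting to try a similar TAAU setup in a UE 4.26 game, this is a sketch of the console variables that generally control temporal AA upsampling and the render resolution, placed in Engine.ini. The exact settings used in the comparison above aren't confirmed; the values here just mirror the 70% screen percentage mentioned in the post:

```ini
[SystemSettings]
; Enable temporal AA upsampling (TAAU) instead of plain spatial upscaling
r.TemporalAA.Upsampling=1
; Select the Gen5 temporal AA algorithm (0 = Gen4, 1 = Gen5)
r.TemporalAA.Algorithm=1
; Render at 70% of output resolution, as in the TAAU test above
r.ScreenPercentage=70
```

Screen percentage can also be changed live from the in-game console, which makes it easy to match frame rates between upscalers before capturing screenshots.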
Lol @ the noobs who can't understand that ML can produce a better end result than native. Ever hear of something called supersampling? That's not native either.
Horus-Anhur:

I wasn't expecting AMD to release FSR 2.0 so soon. They are basically doing the presentation and releasing it days after. I wonder which games they are going to showcase; Deathloop seems to be one of them. It seems AMD's FSR 2.0 is going to be similar to Epic's TAAU. Here is a comparison I made in Supraland. It uses UE 4.26, so I enabled support for TAAU + TAA Gen5. I also updated DLSS to 2.3.7. I normalized screen percentage to frame rate. This means DLSS Quality is running at 66%, as usual, but TAAU + TAA Gen5 is running at 70%. So in these screenshots, with both solutions, the game was running at 132 fps. Here is the comparison in Juxtapose: https://cdn.knightlab.com/libs/juxtapose/latest/embed/index.html?uid=59392fb2-a38b-11ec-b5bb-6595d9b17862 DLSS Quality
https://screenshotcomparison.com/comparison/22212 Maybe it's me, but I prefer mouse-over comparisons so you can focus on one spot and see the difference more clearly. DLSS is doing a better job here imo, but it could just be a difference in sharpness (DLSS looks sharper). It's very, very close though - close enough that if it wasn't in a comparison tool I'd probably not notice.
Denial:

https://screenshotcomparison.com/comparison/22212 Maybe its me but I prefer mouse over comparisons so you can focus on one spot and see the difference more clearly. DLSS is doing a better job here imo but it could just be a difference in sharpness (DLSS looks more sharp). It's very very close though - close enough that if it wasn't in a comparison tool i'd probably not notice.
I agree, DLSS seems sharper. But it also has a bit more aliasing.