AMD FidelityFX Super Resolution (FSR) to launch this year (to fight off DLSS)
For PCs, that is. If you read yesterday's Radeon RX 6700 reviews, you will have noticed our comments on the fact that AMD still does not have a DLSS alternative available, and that's increasingly hard to defend.
Before the reviews were released, the media asked AMD about this, and two weeks ago AMD had no answer other than "we're working on it," with no timeline attached. It seems the pressure has built up, and AMD is now communicating more clearly on the topic. In an interview with PCWorld, AMD’s Scott Herkelman has now stated that AMD’s Super Resolution technology will be released this year. And yes, that's a rather wide window to name.
It’s progressing very well internally in our lab, but it’s our commitment to the gaming community that it needs to be open, that it needs to work across all things, and game developers need to adopt it. Even though it’s progressing well, we still have more work to do, not only internally but with our game developer partners. We want to launch it this year. We believe we can do that this year, but at the same time we have a lot more work ahead of us. We need to make sure the image quality is there. We need to make sure it can scale across different resolutions. And at the same time, that our game developers are happy with what we are producing.
It’s probably one of the biggest software initiatives we have internally, because we know how important it is: if you want to turn on ray tracing, you don’t want your GPU to take such a hard performance hit. FSR (as the acronym will be called) is something key for us to launch this year, but it’s gonna take a little bit more time. We are progressing well, but we still have some work to do.
— Scott Herkelman
AMD is to name the technology FidelityFX Super Resolution, aka FSR. Unfortunately, it is still unknown how this tech will work; logic suggests it would be a ported form of machine learning upscaling based on DirectML, as developed by Microsoft. AMD does not have Tensor cores or an equivalent to them, so the methodology would still need to run on the existing compute units (CUs), which will consume GPU resources, more so than DLSS does on its dedicated hardware. However, since there are no details, this might not even become a machine-learning-based solution at all. We state this because of what Mr. Herkelman mentions:
You don’t need machine learning to do it, you can do this many different ways and we are evaluating many different ways. What matters the most to us is what game developers want to use, because if at the end of the day it is just for us and we force people to do it, it is not a good outcome. We would rather say: gaming community, which one of these techniques would you rather see us implement, so that it can immediately spread across the industry and hopefully cross-platform.
— Scott Herkelman
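To illustrate the non-ML route Herkelman alludes to, here is a minimal, purely hypothetical sketch of a classic spatial upscaler in Python/NumPy: a bilinear resize followed by an unsharp-mask sharpening pass. To be clear, this is not AMD's algorithm (which remains undisclosed); it only shows that workable upscaling can be built from traditional image processing, with no machine learning involved.

```python
# Illustrative only: a naive non-ML spatial upscaler (bilinear resize +
# unsharp-mask sharpening). NOT AMD's algorithm; it merely demonstrates
# that upscaling can be done without machine learning.
import numpy as np

def bilinear_upscale(img: np.ndarray, scale: float) -> np.ndarray:
    """Upscale an HxWxC float image by `scale` using bilinear interpolation."""
    h, w, c = img.shape
    out_h, out_w = int(h * scale), int(w * scale)
    # Map each output pixel back to a fractional source coordinate.
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None, None]; wx = (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def sharpen(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Unsharp mask: boost the difference between the image and a blur of it."""
    blur = img.copy()
    blur[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                        img[1:-1, :-2] + img[1:-1, 2:]) / 4.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

# Render at 1440p internally, present at 4K:
low_res = np.random.rand(1440, 2560, 3).astype(np.float32)  # stand-in frame
high_res = sharpen(bilinear_upscale(low_res, 1.5))
print(high_res.shape)  # (2160, 3840, 3)
```

The trade-off with purely spatial filters like this is that they cannot reconstruct detail that was never rendered, which is where temporal and ML-based approaches such as DLSS 2.0 gain their edge.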
We can learn here that the primary focus is shifting toward the PC; previously, this same man mentioned they would release it only once all platforms (consoles) were compatible and ready.
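As for the machine-learning scenario speculated on above: if FSR were built on DirectML, inference would run on the GPU's regular compute units rather than on dedicated tensor hardware. A rough, speculative sketch of that idea follows, using ONNX Runtime's DirectML execution provider (which is real) with a hypothetical model file and tensor names (which are placeholders, not anything AMD ships).

```python
# Speculative sketch: IF the solution were DirectML-based, running an ML
# super-resolution model on ordinary DX12 compute hardware could look roughly
# like this. "super_resolution.onnx" and the "input" tensor name are
# hypothetical placeholders.
# pip install onnxruntime-directml
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "super_resolution.onnx",              # hypothetical upscaling model
    providers=["DmlExecutionProvider"],   # DirectML: any DX12 GPU, no tensor
)                                         # cores required

# A 1080p RGB frame in NCHW float32 layout (stand-in data).
frame = np.random.rand(1, 3, 1080, 1920).astype(np.float32)
(upscaled,) = session.run(None, {"input": frame})  # assumes a single output
print(upscaled.shape)  # e.g. (1, 3, 2160, 3840) for a 2x model
```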
Senior Member
Posts: 2866
Joined: 2016-08-01
It will be interesting when they release it. People compare it to DLSS and expect it to look like DLSS 2; if the first iteration comes out and the image quality is closer to DLSS 1.x, then AMD is in for a big roasting. We will see when and if it comes out.
Senior Member
Posts: 9729
Joined: 2008-01-06
They said consoles and a wide range of Radeon GPUs, whatever that means. It would be nice to have a toggle button inside the driver software instead of waiting and hoping developers will add it to your game, like with Nvidia.
The "toggle" switch won't happen, this is meant to be incorporated within the game it self. It won't be a "switch on for 2x performance".
It's been closely developed with Microsoft and partly Sony, but rumours point to this being a Microsoft and AMD collaboration. I expect all first-party Microsoft games to use this. Maybe something like offloading the AI processing to the Azure cloud servers? That would mean it requires an internet connection to function, and then you have latency penalties too.
Maybe it could even be open source like they did with FreeSync.
FreeSync wasn't really AMD's technology; it was branding for their support of VESA's "Adaptive Sync".
That makes even less sense. A half decade ago, AMD was nearing bankruptcy. Why would it be a good idea for them to dump money into what was effectively a rumor at a time when GPU-accelerated AI was still in its infancy? Seems rather unfair to say they should have known better in hindsight.
I can't emphasize this enough: DLSS wasn't worth using until 2020. So to put this another way, if PhysX were a success, would it be fair to say AMD screwed up and didn't invest in GPU-accelerated physics? The way I see it, the only reason people are irritated about AMD being behind is because Nvidia succeeded in making DLSS a legitimately good feature. If DLSS never got any better than v1.0, I feel fairly confident we would not be having this discussion.
So all that being said, AMD didn't have a compelling reason to dump resources into this until Q2 2020. AMD took a gamble on whether Nvidia would succeed in making something worthwhile, and they lost. Now they're playing catch-up, and it's not realistic for them to produce a worthwhile competitor in less than two years. Keep in mind, if this technology were so promising, I'm sure Sony and MS would have been pushing AMD much harder to get it done. Both of them promised 4K-capable consoles, and this is a way to get that done.
This is pretty much spot on. Nvidia took a risk and a gamble with DLSS. They needed another way to use their tensor cores, which were initially made for AI processing in the automotive industry (self-driving cars) and other AI-based workloads in cloud computing.
Personally I hate DLSS, even the current version. For some reason I am super sensitive to upscaling of any kind, and the image looks way less detailed to me than native resolution, with a softer, less sharp look. Don't get me wrong, it has improved a lot, but it's still not there yet.
People seem to forget both AMD's and Nvidia's past technology failures and seem to bow down to anything they shovel out the door, even though nine times out of ten those features disappear within five or so years. What interests me more is game engine improvements, GPU performance improvements (at native res), API improvements, etc. Things like mesh shaders and variable rate shading are what I think will truly push next-gen games, probably allowing them to run things like RT without tanking the fps at native resolution.
Senior Member
Posts: 9729
Joined: 2008-01-06
as always with RGT, a huge pinch of salt required
Member
Posts: 42
Joined: 2005-06-29
If they are targeting the whole year, that means they are pretty early with this and are probably having issues.