AMD FidelityFX Super Resolution (FSR) to launch this year (to fight off DLSS)
For PCs, that is. If you read yesterday's Radeon RX 6700 XT reviews, you will have noticed our comments on the fact that AMD still does not have a DLSS alternative available, something that is increasingly hard to defend.
Before the reviews were released, media asked AMD about this, and two weeks ago AMD did not have any answer other than "we're working on it," with no timeline attached. It seems the pressure has built up, and AMD is now communicating more clearly on the topic. In an interview with PCWorld, AMD's Scott Herkelman has now stated that AMD's Super Resolution technology will be released this year. And yes, that's a rather wide margin to name.
It’s progressing very well internally in our lab, but it’s our commitment to the gaming community that it needs to be open, that it needs to work across all things, and that game developers need to adopt it. Even though it’s progressing well, we still have more work to do, not only internally but with our game developer partners. We want to launch it this year. We believe we can do that this year, but at the same time we have a lot more work ahead of us. We need to make sure the image quality is there. We need to make sure it can scale across different resolutions. And, at the same time, that our game developers are happy with what we are producing.
It’s probably one of the biggest software initiatives we have internally, because we know how important it is if you want to turn on ray tracing that you don’t just wanna have that competitive hit, or your GPU getting hit so hard. FSR (that will be the acronym) is something key for us to launch this year, but it’s gonna take a little bit more time. We are progressing well, but we still have some work to do.
— Scott Herkelman
AMD will name the technology FidelityFX Super Resolution, aka FSR. Unfortunately, it is still unknown how the tech will work; logic suggests it could build on DirectML-based super resolution as developed by Microsoft. AMD does not have Tensor cores or an equivalent to them, so the methodology would need to run on the existing compute units, which will consume GPU resources, more so than DLSS does. However, since there are no details, this might not even become a machine-learning-based solution at all. We say this because of what Mr. Herkelman mentions:
You don’t need machine learning to do it; you can do this many different ways, and we are evaluating many different ways. What matters most to us is what game developers want to use, because if at the end of the day it is just for us, and we force people to do it, it is not a good outcome. We would rather say: gaming community, which one of these techniques would you rather see us implement, so that it can be immediately spread across the industry and hopefully cross-platform.
— Scott Herkelman
We can learn here that the primary focus is shifting to the PC; previously, the same man mentioned they'd release it only once all platforms (consoles) were compatible and ready.
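Since AMD has shared no technical details, any illustration can only be generic. Purely as a reference point for what the machine-learning route would involve, here is a minimal toy sketch in PyTorch (not DirectML, and in no way AMD's actual design; every layer size and number below is invented for illustration): a small convolutional network that maps a rendered low-resolution frame to a higher-resolution one.

```python
# Toy super-resolution sketch, PyTorch stand-in. Illustrative only:
# AMD has published nothing about FSR's internals.
import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    """Maps a low-res RGB frame to a frame `scale`x larger."""
    def __init__(self, scale: int = 2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            # Produce scale*scale output pixels per input pixel...
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            # ...and fold those channels into a larger image.
            nn.PixelShuffle(scale),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)

# Stand-in for a rendered 1080p frame: (batch, RGB, height, width).
frame = torch.rand(1, 3, 1080, 1920)
with torch.no_grad():
    upscaled = ToyUpscaler(scale=2)(frame)
print(upscaled.shape)  # torch.Size([1, 3, 2160, 3840])
```

An untrained toy like this would of course produce mush; the point is only the shape of the pipeline, and that every convolution in it costs GPU time per frame. That per-frame cost is exactly what dedicated matrix hardware like Tensor cores absorbs on Nvidia's side, and what would land on the CUs in AMD's case. A non-machine-learning path, as Herkelman hints, would replace the network with a hand-tuned spatial filter instead.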
Senior Member
Posts: 11808
Joined: 2012-07-20
I know it's easy to say, but this should have been ready for the RDNA 2.0 launch.
So now those owners still have to wait a long time to get something similar to DLSS.
It's the last thing I would be waiting for.
I doubt AMD will be able to do anything non-AI-accelerated on par with DLSS, which is hardly perfect itself and has lots of problems. And the last thing we need is more artifacts from bad upscaling mixed with a TAA hybrid and its artifacts, not to mention excusing poorly optimized games with "oh, just run it with DLSS/the AMD equivalent." That is already becoming a problem with current games supporting DLSS (FFXV, Watch Dogs, Cyberpunk, Control, the System Shock Remake, whose previously released demos already performed very badly for how basic the game looked, etc.).
Last I checked, AMD's DSR equivalent (VSR) is still highly restrictive, and you can't add custom resolutions like you can with DSR.
AMD has always been behind in AA (aside from the one DX11 demo they made with their own equivalent of SGSSAA, which looked fantastic, yet it was never used in any games, nor was it put into the drivers with any method to inject it into games). And I doubt that will ever change.
Yesterday, a few hours before you posted, AMD's new driver added "Anti-Lag" and "Radeon Boost (VRS version)" for DX12 titles.
I think they are shuffling things around a bit. They are clearly moving into low-level API space now, and that means they have people who are able to figure out how to do these things and get them implemented.
Senior Member
Posts: 1491
Joined: 2011-02-17
Why is this true? Why does it also need to work for consoles? I keep seeing people say this, but I don't get why their solution can't be PC-only.
Presumably because that's the only option that almost guarantees their solution will be widely supported. If it's PC-only, the risk is it won't be supported unless AMD pays for it.
Senior Member
Posts: 1439
Joined: 2014-07-22
People should remember that DLSS 1.x was so poor nobody wanted or used it. It took nVidia quite some time, over a year, IIRC, to come up with what they have now in DLSS 2.0. I would imagine that what AMD finally unveils will likely be better, because it is newer and based on newer hardware, which was not the case for nVidia with DLSS 1.x (upon which DLSS 2.0 is based).
If I can ever, at long last, get my 6800 XT (I've been trying to buy one since November last year!), I'll be able to run every game at 4K, maxed image quality, with a substantial frame rate, so I probably won't use what AMD comes up with either. We shall see. As it is, if I need substantially more frame rate right now with my 5700 XT at 4K, I can just drop down to 1440p, which completely solves that problem in the one or two games where I might occasionally do that.
I actually think AMD is way ahead of nVidia in FSAA, unlike Mr. Bonk...

Senior Member
Posts: 4869
Joined: 2009-08-29
Nvidia's tensor cores are generic matrix-multiplication units, and therefore suited to machine learning (and specifically neural network) purposes. They're not purpose-specific. The only challenge is the size of the neural network used for inference: predictions must run fast enough that the process doesn't end up taking longer than just rendering at the target resolution.
AMD would face a similar challenge. They won't necessarily have a better algorithm just because their equivalent of the tensor core is newer; it boils down to how those cores perform, not what they do in particular.
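To make that timing constraint concrete, here is a quick back-of-the-envelope sketch; every millisecond figure in it is hypothetical, not a measurement of any real GPU:

```python
# Hypothetical frame-budget arithmetic: upscaling only pays off when
# (low-res render + upscale) beats rendering natively at the target
# resolution. All numbers below are invented for illustration.
TARGET_FRAME_MS = 1000 / 60      # ~16.7 ms per frame for 60 fps
native_4k_render_ms = 22.0       # assumed: native 4K misses 60 fps
render_1080p_ms = 8.0            # assumed: low-res render cost

# Whatever form the upscaler takes (neural net or spatial filter),
# its per-frame cost has to fit in the leftover budget:
upscale_budget_ms = TARGET_FRAME_MS - render_1080p_ms
print(f"Upscale step must finish in under {upscale_budget_ms:.1f} ms")

# And the whole pipeline has to beat native rendering to be worth it:
for upscale_ms in (2.0, 5.0, 12.0):
    total_ms = render_1080p_ms + upscale_ms
    verdict = "wins" if total_ms < native_4k_render_ms else "loses"
    print(f"{upscale_ms:4.1f} ms upscale -> {total_ms:4.1f} ms/frame, "
          f"{verdict} vs native 4K")
```

With these assumed numbers, a 12 ms upscale still technically beats the 22 ms native render but blows the 60 fps budget, which is the point made above: the quality of the algorithm matters less than whether the hardware can run it fast enough.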
Senior Member
Posts: 1363
Joined: 2020-02-20
This comparison between Intel and Nvidia is the most illogical comparison I've ever seen.
While Intel has done basically nothing meaningful to its architecture for the past 5+ years, just releasing new CPUs with new sockets and minimal performance increases, Nvidia has been releasing new GPUs with adequate (especially compared to Intel) to very good performance increases, as well as providing new technologies such as ray tracing, DLSS, and so on. They put innovation into their cards again and again.
Does that mean I hope AMD doesn't come out with GPUs that far exceed expectations and dethrone Nvidia from the top? Absolutely not; that'd be great. Competition is, and always will be, good.
But Nvidia is not Intel, and there is no comparison there.
Monarchs, emperors, and royals are destined to be dethroned sooner or later; fate is inevitable. Change is needed throughout history. Championships are shareable. For the sake of consumers, I hope Nvidia loses something. Personally, I do not owe anything to JHH, unlike most PC users bowing to him.