AMD files possible patent for NVIDIA DLSS alternative called Gaming Super Resolution
mikeysg
Coming up on the end of May, I'm getting kinda anxious to see what FSR brings to the table. Honestly, I don't expect the framerate uplift to be that ridiculous 200% being bandied about, but an uplift of even 40% or higher would do, with image quality close to native. (I don't expect it to match Nvidia's DLSS 2.0 either; after all, Nvidia has a head start, and its tech is proprietary, guaranteed to work well within its walled/enclosed ecosystem, much like Apple's iOS.)
As long as I can get a decent performance uplift with as little loss of PQ as possible (especially with RT enabled), I'd be one happy gamer. And IF the tech works for GTX 1000 series card owners, all the better!
PrMinisterGR
All of this will basically be a tradeoff between the amount of processing power the shaders need to spend on upscaling versus what they would have needed to "really" render at full resolution. For even remotely equal results to DLSS, it will need to use some sort of ML.
BlindBison
If I had to guess, my bet would be that AMD's solution works akin to DLSS 1.9 (as Digital Foundry called it) in Control -- at the time, that version ran on the standard GPU cores (shader cores iirc? will have to rewatch their video) and did not run on the Tensor cores like the current version. A solution like that could theoretically be generalized, I would expect.
Nvidia's requires dedicated hardware if I understand right (Tensor cores) and uses their AI training model. From what I gather there are big differences between 2.1 (current) and 1.9 (the generic one in Control), but I'm just speculating.
To be honest, I really have no idea how AMD's will work -- will it require temporal anti-aliasing and motion vectors like DLSS does? Is it just a really good upscaling algorithm with temporal upsampling like Unreal's? Really curious to see how it ends up working, since Nvidia's solution requires the AI model.
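Neither DLSS nor FSR internals are public, so purely as a rough illustration of the "temporal upsampling like Unreal" idea mentioned above: reproject last frame's accumulated high-res image using motion vectors, then blend it with a naive upscale of the current low-res frame. All names and details below are hypothetical, not AMD's or Nvidia's actual method:

```python
import numpy as np

def temporal_upscale(low_res, history, motion, blend=0.9):
    """Toy temporal-accumulation upscaler (hypothetical sketch).

    low_res : (h, w) current frame at render resolution
    history : (H, W) previously accumulated high-res frame
    motion  : (H, W, 2) per-pixel motion vectors, in output pixels
    blend   : weight given to the reprojected history
    """
    H, W = history.shape
    # Naively upscale the current frame to output resolution
    # (nearest-neighbour here; real techniques use jittered samples).
    ys = np.arange(H) * low_res.shape[0] // H
    xs = np.arange(W) * low_res.shape[1] // W
    upscaled = low_res[np.ix_(ys, xs)]

    # Reproject history: fetch each output pixel from where it was
    # last frame, clamping lookups to the frame bounds.
    yy, xx = np.mgrid[0:H, 0:W]
    py = np.clip(yy - motion[..., 1].round().astype(int), 0, H - 1)
    px = np.clip(xx - motion[..., 0].round().astype(int), 0, W - 1)
    reprojected = history[py, px]

    # Blend: the history term carries detail accumulated over
    # prior frames, which is where the "free" resolution comes from.
    return blend * reprojected + (1 - blend) * upscaled
```

A real implementation would also reject stale history on disocclusion (e.g. neighbourhood clamping), which is one of the hard parts DLSS reportedly handles with its trained model.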
cucaulay malkin
even if it's half as good as DLSS 2.0, not releasing it is far worse.
oh wait, wait, i'm getting some new info from my sources
[SPOILER]https://media1.fdncms.com/riverfronttimes/imager/u/slideshow/2600028/butt_head.jpg[/SPOILER]