AMD Teases FidelityFX Super Resolution 3.0 at GDC 2023: What You Need to Know

https://forums.guru3d.com/data/avatars/m/278/278745.jpg
So they're making fake interpolated frames like Nvidia. Probably more blur on top of the upscaling blur. I am not interested in fake frames that give motion sickness and strange artifacts.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
reslore:

So they're making fake interpolated frames like Nvidia. Probably more blur on top of the upscaling blur. I am not interested in fake frames that give motion sickness and strange artifacts.
Yeah, I'm also not interested in seeing them, but in the bigger picture it could be an unavoidable development. Although MCM should help things a lot, the end of the rope could still be approaching. Process nodes won't keep shrinking much longer, due to physics itself. Power consumption can't keep rising. Prices are already up there, although I suppose they will still increase further. So, where does further performance development come from? Frame generation seems like the easiest answer, especially if the whole game scene doesn't change completely and escape into the cloud, which I'd dislike far more than frame generation. Before we have a real revolution, like optical computers or whatever, it's going to be all about squeezing the last drops out of the existing tech.
https://forums.guru3d.com/data/avatars/m/56/56004.jpg
I've not used FSR in any of the games I play on my main rig, but with my 2nd rig with the RX 6900 XT hooked up to a 4K TV, FSR would inevitably be needed. Played DSR with FSR set to 'Quality' and it does help, bringing the framerate to a playable 55 fps and higher (will have to check if I had RTAO enabled). For a game like DSR, even mid-40s fps is playable, and since I don't pixel-peep like Steve of DF, it looks pretty dang good! So, understandably, I'm curious as to what FSR3 brings to the table in terms of PQ and framerate, and I hope it runs well on the RX 6000/5000 series of AMD cards at the very least.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
reslore:

So they're making fake interpolated frames like Nvidia. Probably more blur on top of the upscaling blur. I am not interested in fake frames that give motion sickness and strange artifacts.
nVidia's latest implementation so far is... usable, I was pleasantly surprised. I expected it to be a dumpster fire in terms of input delay and artifacts, but surprisingly, as it is, I do give it a pass, at least in CP2077. Make no mistake, they're still fake frames and have all the issues you've heard of. The question is, will AMD's implementation match that minimum level of quality that makes it worth using in some scenarios?

Honestly I don't know why AMD are even bothering with consumer graphics at all; it's clear the shareholders don't see it as a good avenue for profit, so it's been absolutely dogshit for years now. They didn't make nVidia compete on price at any price point, they just basically matched Jensen's "eat shit and die" prices. It's pretty much the same result as an absolute monopoly. I was huffing down all the hopium and copium, just praying that Intel might make a dent, but Raja Raja'd the world again. Yet at this point, I still have more hope for Intel's borderline-nonexistent (and possibly dead) graphics division making a difference and resulting in less insane prices than I do for AMD.
https://forums.guru3d.com/data/avatars/m/196/196426.jpg
nVidia: You need to buy our fancy new ADA 4000 series GPUs to get fake frames 😎 AMD: We have fake frames too, but our fake frames work on Ryzen 3 2200G 😀
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
Option 1 - Buy a top-end card and play 4K max eye candy without this technology and without compromise.
Or
Option 2 - Buy a mid-range card and play 4K max eye candy with this technology and with some compromise.
Or sit in your mum's basement eating crisps and tell multi-billion-dollar tech companies they are doing it all wrong. #justsaying
https://forums.guru3d.com/data/avatars/m/40/40458.jpg
Nvidia DLSS 3 frame generation works really well by user reports I've read, so long as you start with 60 or so fps. AMD as usual copies innovation and is way late with much less adoption. They said H1 2023 when they launched their 7900 XT/7900 XTX, but here we are in late March and it's "too early to show"? Bleh. DLSS 3 is already confirmed for a ton of titles with more to be announced; is this going to be the same as with DLSS 2, where FSR adoption lags far behind Nvidia?
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
wavetrex:

nVidia: You need to buy our fancy new ADA 4000 series GPUs to get fake frames 😎 AMD: We have fake frames too, but our fake frames work on Ryzen 3 2200G 😀
Can't see them confirming anywhere how many real frames are needed to generate one interpolated frame, let alone that this will work on Vega.
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
Neo Cyrus:

Honestly I don't know why AMD are even bothering with consumer graphics at all; it's clear the shareholders don't see it as a good avenue for profit, so it's been absolutely dogshit for years now. They didn't make nVidia compete on price at any price point, they just basically matched Jensen's "eat crap and die" prices. It's pretty much the same result as an absolute monopoly. I was huffing down all the hopium and copium, just praying that Intel might make a dent, but Raja Raja'd the world again.
You don't? Is selling €900-1100 cards as "value alternatives" not profitable enough to keep doing that? And the RX 6800s were great cards from a consumer standpoint, and still are; just got a 6800 Fighter for €400 (not new, lightly used, though we'll see once it arrives and I take it apart). They just got greedy with RDNA3: renamed a cut N31 as the 7900 XT and put a €250 premium on a 6800 XT price tag, even though the +50% performance-per-watt targets they teased were clearly missed.
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
I've never used any of this tech, to be honest. The most graphically demanding game I currently play is Metro EE, and at 4K with full RT and max settings it really flies along. Even CP2077 does 60 fps according to benchmarks, so until something more demanding comes along I'll not be using it.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
pegasus1:

Or sit in your mum's basement eating crisps and tell multi-billion-dollar tech companies they are doing it all wrong. #justsaying
Haha, multi-billion tech companies are making mistakes all the time. That's because there's no positronic brain making the decisions but ordinary humans. Things can go especially bad when you don't have engineers making the decisions in tech companies, but businessmen instead. How many thousands of people were telling Intel they were making a mistake during the four-core-maximum decade of degeneration? Yet Intel did nothing until AMD yanked the rug from under their feet with the Zen MCM tech. Intel still hasn't recovered completely, because ever since the 7000 generation, Intel has been forced to compete by factory overclocking their CPUs. Only now is Intel getting its MCM pro CPUs out, but by the looks of it, they won't yet be in the same weight class as AMD's offerings. So, yeah, maybe those multi-billion companies actually should, occasionally, listen to the guy sitting in your mum's basement eating crisps. #justeatingcrisps
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
I think frame generation is fine in some cases, it just depends on the game. Same goes with supersampling, playing a game at 60FPS rather than 240FPS, playing a game at 1080p with AA vs 4K, and so on. When you have to make sacrifices, there is no one-size-fits-all solution for a better experience. Obviously if we all had our way, we'd be playing everything at native 4K+ with AA and 300FPS of genuine frames.
Kaarme:

Yeah, I'm also not interested in seeing them, but in the bigger picture it could be an unavoidable development. Although MCM should help things a lot, the end of the rope could still be approaching. Process nodes won't keep shrinking much longer, due to physics itself. Power consumption can't keep rising. Prices are already up there, although I suppose they will still increase further. So, where does further performance development come from? Frame generation seems like the easiest answer, especially if the whole game scene doesn't change completely and escape into the cloud, which I'd dislike far more than frame generation. Before we have a real revolution, like optical computers or whatever, it's going to be all about squeezing the last drops out of the existing tech.
I predict something like stackable GPU cores. The high frequencies we see today would have to be lowered for each additional layer, but being able to effectively double the compute power per square millimeter is a big deal. I presume this would be easier to implement than a chiplet design, and in fact it might work even better since data has less distance to travel. It would be more expensive to manufacture, but once facilities are equipped to do this regularly, costs would likely go down.

Otherwise, I predict things will turn out like the old days, where people just have to learn how to stop being so lazy about coding. While I have griped many times here in the past about optimizing software to have a smaller disk and memory footprint, devs have also been super lazy about writing code that uses fewer cycles. Sometimes performance losses come from things as simple as using a char variable/field to store an integer, or as complex as not taking advantage of available hardware instructions. Usually, though, it's just needlessly convoluted code, like doing 1+1+1+1=4 rather than just 1*4=4. Obviously that's a pretty stupid example, but it demonstrates that there are multiple ways to skin a cat and many of them are a lot worse than others. There have been times I've reduced code to 1/4 of its original size while retaining the exact same functionality.

You may ask "how do you know this is a prevalent issue?" and the answer speaks for itself: if software has a lot of bugs/glitches, hacky hotfixes, memory leaks, or any warnings while compiling, those developers could not have possibly spent enough time writing more efficient code. It's worth pointing out that you can have rock-solid software that is very inefficient, but it's not possible to have unstable software that is very efficient. And there's a lot of unstable software out there.
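To make the toy arithmetic example concrete, here's a trivial C sketch (the function names and the iteration count are invented purely for illustration, not taken from any real codebase): both functions return the same value, but one spends cycles in proportion to its input while the other does constant work.

```c
#include <stdio.h>

/* The "1+1+1+1" way: add 1, n times. */
static unsigned long sum_of_ones_loop(unsigned long n)
{
    unsigned long total = 0;
    for (unsigned long i = 0; i < n; ++i)
        total += 1;                 /* n additions for a result that is simply n */
    return total;
}

/* The "1*4" way: the same result as a single expression. */
static unsigned long sum_of_ones_direct(unsigned long n)
{
    return n * 1;                   /* identical output, no loop at all */
}

int main(void)
{
    unsigned long n = 100000000UL;  /* 100 million iterations vs. one expression */
    printf("%lu %lu\n", sum_of_ones_loop(n), sum_of_ones_direct(n));
    return 0;
}
```

To be fair, a modern compiler at -O2 will probably fold a loop this obvious away on its own; real-world cycle waste is rarely this easy for an optimizer to spot, which is rather the point about it being the developer's job.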
https://forums.guru3d.com/data/avatars/m/186/186805.jpg
reslore:

So they're making fake interpolated frames like Nvidia. Probably more blur on top of the upscaling blur. I am not interested in fake frames that give motion sickness and strange artifacts.
I hate FG, but it doesn't give motion sickness, to me at least. It's the latency that is the issue, and the artefacts, which are super noticeable.
GoldenTiger:

Nvidia DLSS 3 frame generation works really well by user reports I've read, so long as you start with 60 or so fps. AMD as usual copies innovation and is way late with much less adoption. They said H1 2023 when they launched their 7900 XT/7900 XTX, but here we are in late March and it's "too early to show"? Bleh. DLSS 3 is already confirmed for a ton of titles with more to be announced; is this going to be the same as with DLSS 2, where FSR adoption lags far behind Nvidia?
On the other hand, Nvidia takes existing technology to create their own branded version and then locks it behind their walled-off garden, and it costs a premium to enter that garden. Then after a few months/years they abandon it as it eventually becomes open source thanks to other companies. How well is G-Sync going now? PhysX? GameWorks? HairWorks?

Also:
AMD: number of employees 25,000 (2022)
Nvidia: number of employees 26,196 (2023)
This is the WHOLE company. Nvidia makes GPUs; AMD makes much more.

Then factor in this:
Nvidia: net income US$4.368 billion (2023)
AMD: net income US$1.32 billion (2022)

If anything, it's a miracle that AMD can compete at any level.
data/avatar/default/avatar36.webp
Extrapolation. If it were interpolation, the frame rate would double in every case, it would look like ****, and it would have been incorporated a long time ago.
https://forums.guru3d.com/data/avatars/m/196/196426.jpg
Just so everyone here knows, this has existed for a while: https://nmkd.itch.io/flowframes And in the most recent versions, it's really, really good, the glitches are very minimal and usually only occur when fixed text or logos are displayed over video. AMD might use a similar algorithm, but with game motion vectors instead of determining motion vectors by looking at several video frames, and they don't need to worry about sharp text glitching since the UI is added after the AI interpolation. Ah yeah, and it works with Vulkan, so it's GPU-agnostic.
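For anyone wondering what "interpolating along motion vectors" boils down to at its most basic, here's a rough C sketch. Everything in it is an assumption made for illustration (grayscale float buffers, per-pixel A-to-B vectors, a naive half-vector warp with a crude average) and it has nothing to do with how FSR 3, DLSS 3 or Flowframes actually do it; there's no occlusion handling, sub-pixel filtering or UI pass.

```c
#include <stdio.h>

typedef struct { float x, y; } Vec2;

/* Build the frame halfway between A and B, given per-pixel motion vectors
 * that describe how content moves from A to B (in pixels). */
static void interpolate_halfway(const float *frame_a, const float *frame_b,
                                const Vec2 *motion, float *out, int w, int h)
{
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            int i = y * w + x;

            /* Content at this output pixel sat half a vector "behind" in A
             * and half a vector "ahead" in B. */
            int ax = x - (int)(motion[i].x * 0.5f);
            int ay = y - (int)(motion[i].y * 0.5f);
            int bx = x + (int)(motion[i].x * 0.5f);
            int by = y + (int)(motion[i].y * 0.5f);

            /* Fall back to the unwarped pixel when a sample lands off-screen. */
            float sa = (ax >= 0 && ax < w && ay >= 0 && ay < h)
                           ? frame_a[ay * w + ax] : frame_a[i];
            float sb = (bx >= 0 && bx < w && by >= 0 && by < h)
                           ? frame_b[by * w + bx] : frame_b[i];

            out[i] = 0.5f * (sa + sb);   /* blend both reprojections */
        }
    }
}

int main(void)
{
    enum { W = 4, H = 1 };
    /* Frame A has a bright pixel at x=0; in frame B it has moved to x=2. */
    float a[W * H]  = { 1.0f, 0.0f, 0.0f, 0.0f };
    float b[W * H]  = { 0.0f, 0.0f, 1.0f, 0.0f };
    Vec2  mv[W * H] = { {2, 0}, {2, 0}, {2, 0}, {2, 0} };
    float out[W * H];

    interpolate_halfway(a, b, mv, out, W, H);
    for (int x = 0; x < W; ++x)
        printf("%.2f ", out[x]);   /* brightness lands around x=1, halfway */
    printf("\n");
    return 0;
}
```

The leftover 0.50 the sketch prints at x=0 (from the off-screen fallback) is exactly the kind of ghosting artefact people complain about; real implementations lean on far more data, such as depth and dedicated optical flow, to suppress it.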
data/avatar/default/avatar05.webp
wavetrex:

Just so everyone here knows, this has existed for a while: https://nmkd.itch.io/flowframes And in the most recent versions, it's really, really good, the glitches are very minimal and usually only occur when fixed text or logos are displayed over video.
These types of interpolation methods are not timing-critical, and thus not suited for latency-sensitive applications, i.e. games. DLSS's tight integration into the rendering pipeline also gives it access to depth buffer data, so it can suppress interpolation artefacts above and beyond what any pure post-process algorithm can achieve.
https://forums.guru3d.com/data/avatars/m/196/196426.jpg
FSR 2.x already has access to the rendering pipeline, and most likely 3.x will as well. I for one can't wait to try it on my "outdated" 6800 XT and see what this game frame generation fuss is all about.
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
wavetrex:

FSR 2.x already has access to the rendering pipeline, and most likely 3.x will as well. I for one can't wait to try it on my "outdated" 6800 XT and see what this game frame generation fuss is all about.
It will breathe new life into RDNA2 cards. Who needs an Nvidia 40 series? 😛
https://forums.guru3d.com/data/avatars/m/216/216349.jpg
It seems AMD and Intel have no option but to copy Nvidia's features, even if some of them don't make much sense... Everyone, kneel before the true power of marketing!!! :(
https://forums.guru3d.com/data/avatars/m/147/147322.jpg
Undying:

It will bring a new life into rdna2 cards. Who needs an nvidia 40 series 😛
Sooo ... Which future games have FSR3 support planned? 😀