AMD FidelityFX Super Resolution 2.0 - preview

AMD has introduced FidelityFX Super Resolution 2.0 as its answer to NVIDIA's DLSS technology. Will the revamped FSR 2.0 offer sufficient image quality? Let's test the waters so you know what to anticipate.
Read article
Tagged as:
amd
aufkrawall2
Senior Member
Posts: 1811
Posted on: 05/12/2022 05:36 PM
It's OK for 4K, but on a 1440p screen I still find it too rough vs. DLSS. And that is with DLSS Balanced (+ ReShade CAS) vs. FSR 2.0 Quality. FSR 2.0 Balanced already looks very bad with a 1440p target resolution.
Looks like DL isn't all about marketing BS.
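For context on those quality modes: FSR 2.0 uses fixed per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x), so the internal resolution each mode actually shades is easy to work out. Below is a minimal standalone sketch using those published scale factors; it is only an illustration, not the FidelityFX API itself.

```cpp
#include <cstdio>

// Per-axis scale factors AMD publishes for FSR 2.0's quality modes
// (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x).
// Standalone illustration, not part of the FidelityFX SDK.
struct QualityMode {
    const char* name;
    float scale;
};

int main() {
    const QualityMode modes[] = {
        { "Quality",           1.5f },
        { "Balanced",          1.7f },
        { "Performance",       2.0f },
        { "Ultra Performance", 3.0f },
    };
    // Target (display) resolutions discussed in the thread: 1440p and 4K.
    const int displays[][2] = { { 2560, 1440 }, { 3840, 2160 } };

    for (const auto& d : displays) {
        std::printf("Display %dx%d\n", d[0], d[1]);
        for (const auto& m : modes) {
            // Internal resolution the GPU actually shades before the
            // temporal upscale back to the display resolution.
            const int rw = static_cast<int>(d[0] / m.scale);
            const int rh = static_cast<int>(d[1] / m.scale);
            std::printf("  %-17s -> %d x %d internal\n", m.name, rw, rh);
        }
    }
    return 0;
}
```

At a 1440p target, Balanced works out to roughly 1506x847 internally, which goes a long way toward explaining why it falls apart there, while 4K Quality still shades a full 2560x1440.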
Denial
Senior Member
Posts: 13757
Posted on: 05/12/2022 05:38 PM
Quote:
This is what I have been saying for years. Nvidia could have easily done this themselves with their insane R&D budget and huge workforce. But no, they create needless custom hardware and then charge the customer for the privilege of using it, trying to create an even larger monopoly on the market. Much like they did with GSync and their custom hardware inside each monitor when they could have just supported Adaptive Sync, or buying PhysX off Ageia and locking it to their hardware only. Then community members were able to enable PhysX on AMD/ATi cards with modded drivers, proving again that Nvidia lied when they said it was only possible on their hardware. Now PhysX is used inside tons of game engines.
Nvidia makes exceptional GPUs, but their business practices are just ludicrous. AMD isn't innocent either; they have done a lot of shady crap too, just to a lesser extent, and their immense support of open-source software benefits the industry as a whole.
EDIT:
***grabs popcorn***.....
Idk lol, there's some rewriting of history here. GSync was out nearly a year before FreeSync and years before Adaptive Sync. It provided more features than FreeSync, even two years after FreeSync released, and it also provided a more standard platform: you knew what you were getting when buying a GSync monitor. FreeSync feature support was all over the place back then.
PhysX, AFAIK, was never enabled on AMD/ATi cards. You may be thinking of when Nvidia locked PhysX on its own cards whenever an AMD card was in the system - that was bypassed, but I don't think anyone ever got PhysX to actually run on the AMD card itself. I could be wrong. Either way, PhysX has been completely rewritten something like four times, so it now running on the CPU in various games is kind of meaningless.
Tensor cores are a product of HPC shoved down into consumer GPUs; most of that R&D was paid for by Nvidia's HPC business. Either way, I don't think there's a reality where Nvidia releases a non-tensor card that's cheaper. We would have just gotten the same cards, with a DLSS that doesn't use tensor cores, for the same price. Also, while the tensor cores aren't used for much right now (DLSS, denoising in RT, microphone cleanup, some picture enhancements, etc.), there's definitely a massive advantage to having them for potential future applications. Ignoring obvious improvements to current features (who's to say DLSS 3.0 won't massively increase performance?), DirectML is starting to take shape and it can leverage those cores. I think having the hardware creates a potential that AMD doesn't have. Whether or not we'll see something that utilizes that potential is a different story... but Nvidia is the king of value-add, so I'm sure it's coming.
Also, Nvidia does have a similar solution: NIS, which is also open source, but it's not a temporal solution - it may become one in response to this, though. Dunno.
TimmyP
Senior Member
Posts: 934
Posted on: 05/12/2022 05:56 PM
Has nobody noticed yet that FSR2 Quality (all FSR2 modes, actually) either kills particle effects, or they have been disabled? FSR2 Quality shows no sparks, faeries, or added flames at all at 4K. It's impressive for static geometry, I'll give it that.
Krizby
Senior Member
Posts: 1415
Posted on: 05/12/2022 06:45 PM
Quote:
As far as I'm concerned, FSR and DLSS aren't supposed to be on all the time. They're just another way of trading visual fidelity for more performance, just like any other graphics setting. It's weird to me how much fuss people make about this, as if they're supposed to use it, or as if it's supposed to have a 1:1 level of detail.
DLSS and FSR 2.0 allow for playable 4K gaming on mainstream GPUs, and I can assure you that 4K DLSS/FSR 2.0 looks way better than 1440p native.
Sure, if you plan to stick to 1080p forever, then DLSS/FSR 2.0 aren't for you.
Senior Member
Posts: 3238
May end up seeing this down the line.
FSR 2.0's quality mode definitely delivers here, much better than 1.0. Good job AMD.