AMD FidelityFX Super Resolution 2.0 - preview
AMD has introduced FidelityFX Super Resolution 2.0 as its answer to NVIDIA's DLSS technology. Does the revamped FSR 2.0 deliver sufficient image quality? We test the waters so you know what to expect.
=GGC=Phantomblu
Senior Member
Posts: 138
Posted on: 05/12/2022 07:25 PM
Idk lol, there's some rewriting of history here. G-Sync was out nearly a year before FreeSync and years before Adaptive-Sync. It provided more features than FreeSync, even two years after FreeSync released, and it also provided a more standard platform: you knew what you were getting when buying a G-Sync monitor, whereas FreeSync feature support was all over the place back then.
PhysX, AFAIK, was never enabled on AMD/ATi cards. You may be thinking of when Nvidia locked out PhysX on its own cards whenever an AMD card was in the system; that lock was bypassed, but I don't think anyone ever got PhysX to actually run on the AMD card itself. I could be wrong. Either way, PhysX has been completely rewritten something like four times, so the fact that it now runs on the CPU in various games is kind of meaningless.
Tensor cores are a product of HPC pushed down into consumer GPUs, and most of that R&D was paid for by Nvidia's HPC business. Either way, I don't think there's a reality where Nvidia releases a non-tensor card that's cheaper; we would have just gotten the same cards, with a DLSS that doesn't use tensors, for the same price. And while the tensor cores aren't used for much right now (DLSS, denoising in RT, microphone cleanup, some picture enhancements, etc.), there's definitely a massive advantage in having them for potential future applications. Ignoring obvious improvements to current features (who's to say DLSS 3.0 won't massively increase performance?), DirectML is starting to take shape and it can leverage those cores. I think having the hardware creates a potential that AMD doesn't have. Whether or not we'll see something that utilizes that potential is a different story... but Nvidia is the king of value-add, so I'm sure it's coming.
Also, Nvidia does have a similar solution: NIS, which is also open source, but it's not a temporal solution. It may become one in response to this, though. Dunno.
Do you ever wonder why AMD never took the path of NVIDIA? Do you really think AMD is that stupid? I remind you that AMD works hand in hand with Microsoft on the consoles, that Nvidia has worked on a console only once if I'm not mistaken, and that AMD is the only one that fully supports open source, with FreeSync, FSR, and so on. Explain to me why its so-called gaming range of cards doesn't have the deep learning that Nvidia has with DLSS... AMD simply wanted a friendlier approach.
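[Editor's note: the spatial-versus-temporal distinction drawn in the quoted post is the crux of the FSR 2.0 story. NIS and FSR 1.0 work from the current frame only, while FSR 2.0 and DLSS accumulate detail across frames. A toy NumPy sketch of the difference; the function names, blend factor, and static-camera assumption are all illustrative, and real implementations reproject the history buffer with motion vectors.]

```python
import numpy as np

def spatial_upscale(frame: np.ndarray, scale: int) -> np.ndarray:
    # Spatial-only (NIS / FSR 1.0 style): each output pixel is derived
    # from the current frame alone, here via nearest-neighbour sampling.
    ys = np.arange(frame.shape[0] * scale) // scale
    xs = np.arange(frame.shape[1] * scale) // scale
    return frame[np.ix_(ys, xs)]

def temporal_upscale(frame: np.ndarray, history: np.ndarray,
                     scale: int, alpha: float = 0.1) -> np.ndarray:
    # Temporal (FSR 2.0 / DLSS style): blend the newly upscaled frame with
    # an accumulated history buffer so detail from earlier frames survives.
    # Real implementations reproject the history with motion vectors and
    # reject stale samples; this toy assumes a perfectly static camera.
    return alpha * spatial_upscale(frame, scale) + (1.0 - alpha) * history
```

[In a real renderer the camera jitters sub-pixel each frame, so the accumulated history converges toward a supersampled image; that is why a temporal upscaler can recover detail that a purely spatial filter never sees.]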
Stormyandcold
Senior Member
Posts: 5784
Posted on: 05/12/2022 07:32 PM
Do you ever wonder why AMD never took the path of NVIDIA? Do you really think AMD is that stupid? I remind you that AMD works hand in hand with Microsoft on the consoles, that Nvidia has worked on a console only once if I'm not mistaken, and that AMD is the only one that fully supports open source, with FreeSync, FSR, and so on. Explain to me why its so-called gaming range of cards doesn't have the deep learning that Nvidia has with DLSS... AMD simply wanted a friendlier approach.
It's nothing to do with intelligence; it's called a lack of resources, otherwise known as a lack of money.
=GGC=Phantomblu
Senior Member
Posts: 138
Posted on: 05/12/2022 07:34 PM
Why does this PhysX BS still exist?
ATi chose not to work with Nvidia on PhysX, which is why Nvidia locked them out. ATi chose to partner with Havok for their physics implementation instead, which bore no fruit. It's a fact that ATi themselves were against PhysX and hardware-accelerated physics in the first place, whereas Nvidia saw the potential and made it mainstream; mainstream enough that even consoles used PhysX. Your version of PhysX history is just wrong.
What we really see with Nvidia is that they set the standard and ATi/AMD are always playing catch-up. DX12 is a testament to what happens when you let AMD push their agenda: you get BS like async compute, which AMD were more than willing to peddle as the next big thing, when in reality it was a last-minute change in the DX12 spec that Nvidia weren't prepared for.
The fact that we now have ray tracing, even in a hybrid form, is down to Nvidia. Now AMD has RT. DLSS is also Nvidia; now AMD has FSR. If it wasn't for Nvidia, those features wouldn't even exist.
You believe this too? If you go to Khronos you will see that the vendors implement the standards together, so the knowledge is more or less the same... AMD's OpenCL support is stuck at 2.0 while Nvidia's drivers are at 3.0, and not because AMD is chasing Nvidia; it probably just believes that certain areas still have room left to explore...
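[Editor's note: the OpenCL version claim is easy to check on your own machine, since drivers expose the version they support as a query-able string. A minimal sketch using the third-party pyopencl bindings, assumed installed:]

```python
import pyopencl as cl  # pip install pyopencl

# Each platform entry corresponds to a vendor driver; its version string
# reports the highest OpenCL version that driver implements,
# e.g. "OpenCL 3.0 CUDA ..." or "OpenCL 2.0 AMD-APP ...".
for platform in cl.get_platforms():
    print(platform.name, "->", platform.version)
    for device in platform.get_devices():
        print("   ", device.name, "->", device.version)
```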
Denial
Senior Member
Posts: 13755
Posted on: 05/12/2022 07:35 PM
Do you ever wonder why AMD never took the path of NVIDIA?
Because it didn't have the money to build a multi-billion-dollar AI software ecosystem. Pretty much all the technology behind RT and DLSS consists of consumer variants of HPC/professional technology. If you go and look at any of the white papers for RT, DLSS (a convolutional autoencoder), etc., it was pretty much all developed while looking for solutions to datacenter problems or, in RT's case, self-driving cars.
Do you really think AMD is that stupid?
I don't recall stating AMD is stupid? In fact, I praised them in my first post in this thread.
I remind you that AMD works hand in hand with Microsoft on the consoles
I'm not sure what this has to do with anything. Microsoft isn't really known for being an innovative company. Most of these features are developed by AMD/Nvidia, and those companies then work with Microsoft on implementing broader support through APIs.
that Nvidia has worked on a console only once if I'm not mistaken, and that AMD is the only one that fully supports open source, with FreeSync, FSR, and so on
Okay?
Explain to me why its so-called gaming range of cards doesn't have the deep learning that Nvidia has with DLSS... AMD simply wanted a friendlier approach.
AMD does have acceleration for tensor operations on its consumer cards; it just doesn't have discrete hardware for it, and for all we know that changes next gen: CDNA 1 and now CDNA 2 have Matrix Cores, it's just not known whether they will reach the consumer variants. They've also been working with Microsoft on developing AI upscalers; those efforts just haven't borne any real products.
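[Editor's note: the "convolutional autoencoder" mentioned in the reply above is easy to picture in code. Here is a minimal PyTorch sketch of that general shape only; it is a toy illustration, not Nvidia's actual network, and the layer widths and fixed 2x factor are invented for the example.]

```python
import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    # Convolutional autoencoder: an encoder squeezes the low-res frame into
    # a compact feature map, a decoder reconstructs it at 2x resolution.
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

net = ToyUpscaler()
frame = torch.randn(1, 3, 540, 960)  # a 960x540 RGB render
print(net(frame).shape)              # torch.Size([1, 3, 1080, 1920])
```

[The production network is far larger, is trained against very high-resolution ground truth, and since DLSS 2.0 also consumes motion vectors; the tensor cores discussed above exist to make inference on this kind of model cheap enough to run every frame.]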
Stormyandcold
Senior Member
Posts: 5784
This is what I have been saying for years. Nvidia could have easily done this themselves with their insane R&D budget and huge workforce. But no, they create needless custom hardware and then charge the customer for the privilege of using it, trying to create an even larger monopoly on the market. Much like they did with G-Sync and its custom hardware inside each monitor, when they could have just supported Adaptive-Sync, or buying PhysX off Ageia and locking it to their hardware only. Then community members were able to enable PhysX on AMD/ATi cards with modded drivers, proving again that Nvidia lied when they said it was only possible on their hardware. Now PhysX is used inside tons of game engines.
Why does this PhysX BS still exist?
ATi chose not to work with Nvidia on PhysX, which is why Nvidia locked them out. ATi chose to partner with Havok for their physics implementation instead, which bore no fruit. It's a fact that ATi themselves were against PhysX and hardware-accelerated physics in the first place, whereas Nvidia saw the potential and made it mainstream; mainstream enough that even consoles used PhysX. Your version of PhysX history is just wrong.
What we really see with Nvidia is that they set the standard and ATi/AMD are always playing catch-up. DX12 is a testament to what happens when you let AMD push their agenda: you get BS like async compute, which AMD were more than willing to peddle as the next big thing, when in reality it was a last-minute change in the DX12 spec that Nvidia weren't prepared for.
The fact that we now have ray tracing, even in a hybrid form, is down to Nvidia. Now AMD has RT. DLSS is also Nvidia; now AMD has FSR. If it wasn't for Nvidia, those features wouldn't even exist.