AMD FidelityFX Super Resolution in Ultra Quality shows visible quality loss
cucaulay malkin
rl66
On the other hand, DLSS and FFX are both things meant to compensate for the real problem: GPUs can't handle RT the way those companies promised consumers.
Because even with the latest DLSS, an experienced eye can see that the speed comes at a dramatic cost.
beedoo
Personally, I think that if you're going to take 100+ consecutive images from a 100 FPS game and look at them individually for 'razor sharpness' then you're a bloody idiot.
ViperAnaf
of course it does - without AI HW to "predict" the missing pixels this is nothing but a gimmick by AMD - a fake alternative to DLSS... it's not the first time from AMD... kind of like how they branded a standard PCIe spec feature (resizable BAR) as "Smart Access" (lol, always makes me laugh)... but you know what? the fanboys gobble it up and it's working, so why not?
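(For what it's worth, spatial upscaling without AI hardware is a real, long-standing technique. Below is a minimal Python/Pillow sketch of the general idea, resample with a Lanczos filter and then sharpen; AMD describes FSR as an edge-adaptive upscale followed by contrast-adaptive sharpening, so this is only a crude analogue, and the file name and scale factor here are hypothetical.)

```python
# Crude stand-in for a spatial upscaler: windowed-sinc (Lanczos) resample
# to the target resolution, then a sharpening pass to restore apparent
# detail. No AI hardware involved. This is NOT FSR's actual algorithm,
# just the general family of techniques it belongs to.
from PIL import Image, ImageFilter

def spatial_upscale(frame: Image.Image, scale: float) -> Image.Image:
    w, h = frame.size
    up = frame.resize((round(w * scale), round(h * scale)), Image.LANCZOS)
    # Unsharp mask as a placeholder for a contrast-adaptive sharpen pass.
    return up.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=2))

# Hypothetical input: a frame rendered at roughly 77% of 4K per axis,
# about what FSR's "Ultra Quality" mode is reported to use.
low = Image.open("frame_2954x1662.png")
spatial_upscale(low, 3840 / low.size[0]).save("frame_upscaled_4k.png")
```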
beedoo
AlmondMan
AsiJu
I'm sorry but:
"...a Reddit user grab video where the FSR performance advantages were shown at 4K resolution capturing of a few frames of the technology comparing 'Ultra Quality' (the one with the least loss of graphic quality. ) in the form of a BMP file so that it does not lose quality..."
We are still talking about taking a snapshot from a compressed video, no matter what image format you use. It's never going to match what's actually on screen.
Especially as YT videos are effectively compressed twice, once during recording and again during YT processing, so a YT video is never as good as the original.
Besides, upscaling has worse quality than native; why is this news? DLSS is no different, even if the latest iterations are very good.
The question is, is the resulting image good enough for one to make that performance-quality tradeoff.
I won't even look at any "video grabs" but will judge for myself once the tech is released.
Trying to assess image quality with lossy sources is about as pointless as it gets.
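(To put a number on the double-compression point: a minimal Python sketch, assuming a hypothetical lossless capture named native_4k_frame.png and arbitrary JPEG quality settings standing in for the recording and upload encodes. Each lossy generation moves the frame further from the original, which is exactly why a grab from a YouTube video can't speak for what's on screen.)

```python
# Measure generational loss: encode once (the "recording"), re-encode the
# decoded result (the "upload" pass), and compare each stage against the
# pristine original with PSNR. Quality values are arbitrary stand-ins,
# not YouTube's real encoder parameters.
import io
import numpy as np
from PIL import Image

def psnr(a: Image.Image, b: Image.Image) -> float:
    x = np.asarray(a, dtype=np.float64)
    y = np.asarray(b, dtype=np.float64)
    mse = np.mean((x - y) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

def lossy_roundtrip(img: Image.Image, quality: int) -> Image.Image:
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert("RGB")

original = Image.open("native_4k_frame.png").convert("RGB")  # hypothetical
once = lossy_roundtrip(original, quality=80)   # recording encode
twice = lossy_roundtrip(once, quality=70)      # re-encode on upload
print(f"after 1 encode:  {psnr(original, once):.2f} dB")
print(f"after 2 encodes: {psnr(original, twice):.2f} dB")  # typically lower
```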
"...The user enlarge the image x4 in order to see the visual differences between the native quality and the FSR in Ultra..."
If you need to zoom in on an image to see a quality loss, I'd say the upscaling works very well.
AMD would do well, though, to release a few uncompressed screenshots from actual gameplay, and from a game that is not readily blurry.
knightriot
I guess I should hold on to my old 1080 Ti then 😛
WhiteLightning
Nvidia has been into deep learning for so long, it would be hard for AMD to catch up. I guess this is the next best thing as an alternative.
And while the capture is from a YouTube video, which has tons of compression, I honestly do not think it will ever be as good as Nvidia's method right now.
Francesco
Not that impressive, but I think judging this kind of stuff with still images is a little misleading.
No one is going to catch minor details while playing a high-FPS game.
To check if FidelityFX is doing any good, we should have 3 videos of the same game scene: one with everything maxed at 4K, one with the same settings and FidelityFX ON, and one with no FidelityFX but lowered settings to achieve the same FPS as with FidelityFX.
This way you could see if FidelityFX is actually NOTICEABLY better than just lowering some settings here and there.
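(A sketch of how that three-way comparison could be scored instead of eyeballed, using SSIM from scikit-image; the file names are hypothetical, and all three captures would need to be lossless dumps of the exact same frame, not grabs from a compressed video.)

```python
# Score the FSR capture and the lowered-settings capture against the
# native maxed-out reference with SSIM (1.0 = identical). All inputs are
# assumed to be lossless dumps of the same frame at the same resolution.
import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity as ssim

def load_gray(path: str) -> np.ndarray:
    return np.asarray(Image.open(path).convert("L"))

reference = load_gray("native_maxed_4k.png")
candidates = {
    "FidelityFX ON": "fsr_ultra_quality_4k.png",
    "lowered settings": "reduced_settings_4k.png",
}
for label, path in candidates.items():
    print(f"{label}: SSIM {ssim(reference, load_gray(path)):.4f} vs native")
```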
Ryu5uzaku
AsiJu
AsiJu
beedoo
rl66
Both are like a fancy pink ribbon on a wooden leg...
rl66
Astyanax
WhiteLightning
The better the AI hardware is at predicting things, the more data it can process and the faster the results will be.
beedoo
cucaulay malkin
you don't need RT cores to do RT either
see? a GTX 1080 Ti runs it too
https://i.imgur.com/enGoy0t.jpeg
and no hardware will make it......?
to an identical final result and performance? interesting, cause I've never seen any publication comparing hardware and software DLSS, but if you claim the results are the same... I mean, you've got to have data confirming it, you're just not willing to share it.