Faked NVIDIA GeForce RTX 3090 Performance Slides are making Rounds on the WWW

sbacchetta:

Sorry, but I have seen jpg images with that kind of artifact in the past. The one on the upper part of the zoomed picture looks a tad strange, but not impossible either. PS: I am a professional retoucher/photographer; you don't have to trust me, but I am not speaking out of my a##.
Well, there's also the fact that Coreteks came out and said it was a troll pic.
Astyanax:

Well, there's also the fact that Coreteks came out and said it was a troll pic.
OK, sorry, I didn't read that part. Anyway, in a few dozen hours we will get actual performance numbers AND prices.
Those graphs probably represent the price, not the performance, hahahaha.
ViperAnaf:

Those graphs probably represent the price, not the performance, hahahaha.
Muahahahahah 😀
Yeah, I don't see what's funny. I'm expecting close to 2x performance, especially in ray tracing; it was terrible on the 2080 Ti.
kieron fleming:

Yeah, I don't see what's funny. I'm expecting close to 2x performance, especially in ray tracing; it was terrible on the 2080 Ti.
Yup, they got us by the cojones with all this RT core BS; the 2080 Ti should have had at least 100. No doubt CP2077 will release with so much RT that only the 3090, with its 200-odd, can cope. A dedicated RT card, now there's a thought! :D
DannyD:

Yup, they got us by the cojones with all this RT core BS; the 2080 Ti should have had at least 100. No doubt CP2077 will release with so much RT that only the 3090, with its 200-odd, can cope. A dedicated RT card, now there's a thought! :D
Hope not; this is what killed PhysX, and for RT it would likely create crazy latencies.
Fox2232:

I love it when people make jokes... like "measured at 4K" while DLSS is enabled. (That means 1440p with DLSS upscaling to 4K.)
It doesn't matter what they say or how they bench, as long as it's an all-else-being-equal comparison; then a percent-increase type of bench is fine (they don't even need to indicate the resolution if it's exactly the same on both). The main thing this is a test of is ray tracing performance, and it indicates a doubling on average; that's all that needs to be compared here. Why does DLSS being an option bother people so badly? They could call it 1440p upscaled and inserted into a 4K output signal, but it would mean the same thing. Heck, every signal that goes into my TV is automatically upscaled to 4K by my display; should that matter as well?
Exodite:

Uh, maybe I'm an outlier here, but a 2x performance increase at the top end using DLSS + RTX is hardly a high expectation. I'd probably be disappointed if it isn't more than that, given that RTX with Turing has been a huge drag on performance and DLSS being improved with Ampere is definitely the expectation. Now if we're talking 2x in pure raster performance, without DLSS, then yes - that would be a pretty high expectation!
Getting RTX to not be as much of a hit to performance is all I really want, combined with any reasonable amount of increased raster. If they do 35-50% higher raster along with RTX that has a 50% smaller hit, that would seem pretty good to me; it would give most games a big bump while allowing RTX games to deliver higher fps than before when compared to the old non-RTX numbers. If we end up in a situation where you see 2x the fps with RTX on versus the same on the old card, that's a huge improvement. Remember, RTX used to take 60 fps and turn it into 30, so if I can now have 90 without it and 75 with it, that's a very nice increase.
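Back-of-the-envelope, that works out like this (purely hypothetical numbers, the same ones as above, not benchmarks):

```python
# Purely hypothetical fps figures from the scenario above -- not measurements.

def rt_hit_pct(fps_rt_off: float, fps_rt_on: float) -> float:
    """Performance lost when enabling ray tracing, as a percentage."""
    return (1 - fps_rt_on / fps_rt_off) * 100

old_off, old_on = 60.0, 30.0   # old card: RT turns 60 fps into 30 fps
new_off, new_on = 90.0, 75.0   # hoped-for card: ~50% more raster, smaller RT penalty

print(f"Old card RT hit: {rt_hit_pct(old_off, old_on):.0f}%")   # 50%
print(f"New card RT hit: {rt_hit_pct(new_off, new_on):.0f}%")   # ~17%
print(f"RT-on fps vs old RT-on fps: {new_on / old_on:.1f}x")    # 2.5x
```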
Exodite:

I suppose it's a bit silly to discuss made-up slides, but my reasoning is that RT is by far the most significant limiting factor in certain situations, like Control. And we're all expecting significantly higher RT performance with Ampere. Hence, Ampere wouldn't need much more raster performance to significantly outdo Turing in the examples used - because what performance it does have won't be constrained by RT (or at least not to the same extent). The DLSS aspect is less important; generally speaking, it just means the comparison is made at a lower resolution. I would imagine any performance hit from the technology to be lower with Ampere, though.
Right, the comments on DLSS were basically useless, as there was no comparison between it on and off, which is the only time its inclusion matters either way. They both used it, and their actual performance/fps isn't even listed; it's all just relative numbers, which DLSS would have no impact on. You do a bench with DLSS and without it at what the output resolution would be, to get a clear idea of its benefit. It's that, or you run DLSS at its "before AI upscaling" resolution and compare it to the same resolution without it, at which point you compare image quality. Outside of those two comparisons, its inclusion or exclusion is irrelevant.
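In other words, the two comparisons that actually tell you something look like this (made-up fps numbers, purely for illustration):

```python
# Made-up numbers purely for illustration -- not measurements.

def relative_benefit(with_feature: float, without_feature: float) -> str:
    return f"{with_feature / without_feature - 1:+.0%}"

# Comparison 1 -- performance at the output resolution:
# native 4K vs. 1440p rendered internally and DLSS-upscaled to 4K.
native_4k_fps = 40.0   # hypothetical
dlss_4k_fps = 62.0     # hypothetical
print("DLSS fps benefit at 4K output:", relative_benefit(dlss_4k_fps, native_4k_fps))

# Comparison 2 -- image quality at the internal resolution:
# render both at 1440p (DLSS's pre-upscale resolution), one with DLSS
# reconstruction and one without, and judge the picture rather than the fps.
```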
Fox2232:

It does not look equal, nor better, even as the newest implementations stand. For now, it is always a downgrade in fidelity. In some cases the visual downgrade is so big that when it is applied at 1440p and then down-scaled to 1080p, you can still see that it is much worse. But hey, you can always blame the game developers... all of them. I would not touch DLSS for now, unless my HW was so poor that I could not play the game without it.
FPS over perfect visuals or high res any day. I'm playing a game to play it, not to take screenshots, and when I'm in the heat of battle I'm not focused on tiny, imperceptible differences I have to pause and stare at to see.
Thank you for posting this article, Hilbert. And I definitely appreciate your cautious approach to their validity.
rdmetz:

FPS over perfect visuals or high res any day. I'm playing a game to play it, not to take screenshots, and when I'm in the heat of battle I'm not focused on tiny, imperceptible differences I have to pause and stare at to see.
While I completely agree with you, you understand that the same argument can be used against ray tracing....
rdmetz:

FPS over perfect visuals or high res any day. I'm playing a game to play it, not to take screenshots, and when I'm in the heat of battle I'm not focused on tiny, imperceptible differences I have to pause and stare at to see.
Are you really telling that to someone with a 1080p, 240 Hz screen? You have quite an option with a 4K TV, and that's to render at 1080p and use integer upscaling done by the graphics card instead of outputting a lower resolution and then letting the TV mess it up. Left: FXAA, right: TAA.
[Attached screenshots: Vermintide2_FXAA.jpg, Vermintide2_TAA.jpg]
The files are compressed to JPEG, as the forum does not take files as large as a 1080p PNG requires, but the staggering difference is demonstrated yet again. Both JPEGs have quality set to 95%. To preserve 95% of the detail:
- the FXAA image has a size of 1660 KB;
- the TAA image has a size of 1152 KB.
That's how much less information there is once TAA is applied.

So my baseline for image quality comparison has always been the image without TAA applied. Sure, DLSS can match TAA in some instances, and often enough it is even better. But TAA should never be the baseline for a self-respecting gamer; TAA is in most cases a sorry excuse for an image quality downgrade.

And then comes that total absurdity with VRAM. People are like: 8 GB of VRAM is not enough, we need more for insane image quality. Sure, use 4K textures by all means. But then do not degrade the final image to the point where it looks like it has been rendered with textures two LoDs worse, because that in itself is a waste of performance on top of wasting VRAM.
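For anyone who wants to repeat that check on their own screenshots, a rough Pillow sketch (the filenames here are placeholders, and JPEG size is only a crude proxy for how much high-frequency detail survives):

```python
# Rough sketch: re-encode two screenshots at JPEG quality 95 and compare sizes.
import io
from PIL import Image

def jpeg_size_kb(path: str, quality: int = 95) -> float:
    """Encode the image at the given JPEG quality and return the result size in KB."""
    buf = io.BytesIO()
    Image.open(path).convert("RGB").save(buf, format="JPEG", quality=quality)
    return buf.tell() / 1024

fxaa_kb = jpeg_size_kb("Vermintide2_FXAA.png")  # placeholder filename
taa_kb = jpeg_size_kb("Vermintide2_TAA.png")    # placeholder filename
print(f"FXAA: {fxaa_kb:.0f} KB, TAA: {taa_kb:.0f} KB "
      f"({taa_kb / fxaa_kb:.0%} of the FXAA size)")
```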
Fox2232:

It does not look equal, nor better, even as the newest implementations stand. For now, it is always a downgrade in fidelity. In some cases the visual downgrade is so big that when it is applied at 1440p and then down-scaled to 1080p, you can still see that it is much worse. But hey, you can always blame the game developers... all of them. I would not touch DLSS for now, unless my HW was so poor that I could not play the game without it.
You clearly haven't seen Death Stranding at 1440p upscaled to 4K using DLSS 2.0, compared to the game rendered at native 4K. I never thought it possible, but DLSS 2.0 looked better than native 4K. Makes no sense, but it is what it is.
Andrew LB:

You clearly haven't seen Death Stranding at 1440p upscaled to 4K using DLSS 2.0, compared to the game rendered at native 4K. I never thought it possible, but DLSS 2.0 looked better than native 4K. Makes no sense, but it is what it is.
Short of proper super-sampling, DLSS is the best-looking AA tech on the planet. And it makes sense, because it's derived from super-sampled images. The fact that it's super cheap to execute on a Tensor-equipped GPU is a reason why Nvidia took a somewhat different approach: instead of representing DLSS as a purely image-enhancing technique, like all AA is, they chose to represent it as a performance-optimizing technique. The reason being that DLSS looks so good that Nvidia was bold enough to call it a much higher resolution than what it's actually being rendered at. I wouldn't waste my breath trying to argue that with @Fox2232. He has spent more time writing about how bad DLSS is than playing games on his 5700 XT, and I am not sure I am even joking 🙂
Andrew LB:

You clearly haven't seen Death Stranding at 1440p upscaled to 4K using DLSS 2.0, compared to the game rendered at native 4K. I never thought it possible, but DLSS 2.0 looked better than native 4K. Makes no sense, but it is what it is.
I did. And I laughed at it. The textures for terrain/rocks do not have sufficient resolution to actually scale image quality above 1080p; they are more suited to 720p. What improves with higher resolution (while using DLSS) is the perceived amount of blur around edges, and the perceived reduction in IQ loss (not per pixel, but per screen area on denser, higher-resolution screens). In other words, someone who plays at 1080p without TAA or similar IQ-downgrading technologies, with AA applied one way or another, has IQ comparable to 4K with DLSS on almost all pixels.

DLSS does a good job at clean edges, but fine detail suffers. And the only reason why Death Stranding does not look as atrocious at 4K as it does at 1080p with DLSS 2.0 is that 4K has 4 times as many pixels per the same viewing angle, which results in 4 times as many pixels per detail element. Therefore the blurred area no longer swallows pixel-wide detail, as the same detail now covers 4 pixels. But Death Stranding's image quality is a joke no matter how one looks at it; it should be much better at each resolution.

On top of that, it is yet again a comparison of TAA vs DLSS. And guess what? That TAA still wins in water reflections, enough for it to be visible in a 4K video downsampled to 1080p. IQ loss that is visible when downsampled from 4K to 1080p must be really lovely at native 4K. The PCMR used to laugh at consoles' bad IQ due to low rendering resolutions upscaled to 1440p/4K. A PC downgrade race it is, as long as this is considered the way to go.

@Noisiv: The problem with DLSS is not AA; it is in areas where AA has no business touching anything. AA is supposed to be applied on the tiny fractions of pixels that sit at polygon edges, but DLSS messes up fine detail on textures because it cannot provide information that was not there when the frame was rendered at a lower resolution. It is not worth downgrading image quality for 90% of pixels just so one has a few finely AA'ed pixels. And with Death Stranding it is a big problem, because if the game were rendered at 1080p and DLSS took care of just the AA on edges, it would provide wonderful IQ at native 1080p that would beat the IQ of those higher resolutions as they are now.
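The pixel-density point is just this bit of arithmetic (resolutions only, nothing game-specific):

```python
# Same screen, same viewing angle -- only the pixel grid changes.
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160

ratio = pixels_4k / pixels_1080p   # 4.0
print(f"4K packs {ratio:.0f}x the pixels into the same viewing angle,")
print(f"so a detail that covered 1 pixel at 1080p covers ~{ratio:.0f} pixels at 4K,")
print("which makes a fixed-size blur proportionally less destructive.")
```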
Well, I wasn't in the market for a new card...but a 3080 that doubles the 2080 in performance for the same $700 price? That sounds pretty sweet...
rm082e:

Well, I wasn't in the market for a new card...but a 3080 that doubles the 2080 in performance for the same $700 price? That sounds pretty sweet...
Good luck actually finding one for that suggested price.