NVIDIA releases some RTX 2080 performance numbers and some info on DLSS

-Tj-:

So a tiny bit faster than the 1080 Ti... that said, the 2070 will be between the 1080 and 1080 Ti. Maybe near the 1080 Ti if OCed to the max.
The 2070 will be on par with the 1080 Ti, give or take 5-10%. The 2080 will crush it and the 2080 Ti will run circles around it.
nevcairiel:

In the presentation it was said that the algorithm needs to be trained for specific games, so developers can submit them to NVIDIA for training on their supercomputer, and then basically just enable it once the data is available. Sounds to me like it sort of is game-specific, but perhaps easy to use.
Hmmm, great boobs jiggling in GTA is gonna be nice. NV will have a great porn library after training that game.
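To make the per-game training flow nevcairiel describes a bit more concrete, here is a minimal, purely illustrative sketch: fit a model so that an aliased frame maps toward a high-quality ground-truth render. Nothing here is NVIDIA's actual pipeline; the frames, the one-parameter "model", and the step count are all stand-ins.

```python
import numpy as np

# Illustrative stand-in for "train against ground truth": fit a single
# weight so a noisy/aliased frame maps toward a clean reference render.
rng = np.random.default_rng(0)
ground_truth = rng.random((64, 64)).astype(np.float32)  # stand-in for a high-quality frame
noise = rng.normal(0.0, 0.1, ground_truth.shape).astype(np.float32)
aliased = ground_truth + noise                          # stand-in for the in-game render

w, lr = 0.0, 0.1              # one learnable weight, for brevity
for _ in range(200):          # the "supercomputer training", tiny edition
    prediction = w * aliased
    grad = 2.0 * np.mean((prediction - ground_truth) * aliased)  # d(MSE)/dw
    w -= lr * grad            # gradient descent step
print(f"learned weight ~ {w:.3f}")  # the trained result is what ships per game
```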
That's a nice graph 🙂 But you have to know how to read it... fully. Yes, it suggests around 1080 Ti performance, and even >= 2x the 1080 with DLSS on. But it also says that DLSS tanks the GTX 1080, with at least a 40% performance hit compared to DLSS off. With DLSS on, the GTX 1080 exhibits a ~30% bigger perf hit compared to the 2080. Furthermore, with DLSS on the 2080 drops by n% ---> GTX 1080 DLSS perf hit = 30% + n >= 40%.
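To spell that arithmetic out, here is the same reasoning with explicit numbers; the 10% value for n is a pure guess, and the ~30-point gap is this poster's reading of NVIDIA's chart, not a measurement.

```python
# Hypothetical numbers, only to make the reasoning above explicit.
n = 0.10           # assume the 2080 drops 10% with DLSS on (the unknown "n%")
gap = 0.30         # ~30-point bigger hit on the GTX 1080, per the chart reading
gtx1080_hit = gap + n
print(f"implied GTX 1080 DLSS perf hit ~ {gtx1080_hit:.0%}")  # 40%, i.e. ">= 40%"
```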
Nvidia keeps throwing more marketing BS. How about some apples-to-apples comparisons, or supplying cards to reviewers?
I certainly can't afford one of these new cards, but they look more impressive with everything you read about them.
BetA:

Well, he was right the last time, and SPOT on if I may say...
His previous rumour had the RTX 2070 with 7GB of VRAM, and had nothing about a RTX 2080 Ti.
Noisiv:

bla bla 🙂
Assuming they run DLSS on top of normal AA, which very well might not be the case, because: "We've seen it in motion and it's really impressive. And the best part is that the AI hardware inside the Turing GPU actually means it can also boost performance, sometimes by over 50%. That's a hell of a double win." https://forums.guru3d.com/threads/the-rtx-2080ti-thread.422587/page-10#post-5576818
Meh, I want one, but not for the price. By the time GPUs can do 4K @ 60 comfortably, they will move to 8K, which is already popping up, and most cards can't do 4K @ 60 without costing over $500. And most streaming/TV services can't even do 1080p properly, let alone 4K feeds.
So roughly 30%~50% over the last 1080, probably under best conditions too... at least there's a performance jump.
Fox2232:

So, there is a certain heavy level of TAA in this scenario to guarantee a performance gain for the 2080 cards, which do TAA faster. And then they replaced TAA with DLSS for the 2080, which delivers close to the same results as not running AA at all. So 2 questions remain: How does that 1080 run without TAA? How does that 1080 run with DLSS? Because on one of those comparison images floating around the web, DLSS runs quite a bit faster on Pascal than TAA. So theoretically a 1080 with DLSS does around the same as a 2080 with TAA.
Apples to oranges. DLSS is more like AI-assisted upsampling, which allows you to run a higher resolution at much higher speed, because they use AI to generate a higher-resolution image from a lower one.
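As a rough picture of what "AI-assisted upsampling" means here: render at a lower internal resolution, then let a model reconstruct the target resolution. The sketch below is conceptual only; the "model" is a trivial pixel-repeat stand-in, nothing like a real trained network.

```python
import numpy as np

def fake_model(frame: np.ndarray, scale: int = 2) -> np.ndarray:
    """Stand-in for a trained upscaler: a real model would infer plausible
    detail; this one just repeats pixels (nearest-neighbour upscaling)."""
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)

# Render cheaply at a lower internal resolution...
low_res = np.zeros((1080, 1920, 3), dtype=np.float32)  # a 1920x1080 frame
# ...then reconstruct the display resolution with the model.
high_res = fake_model(low_res)
print(low_res.shape[:2], "->", high_res.shape[:2])     # (1080, 1920) -> (2160, 3840)
```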
Assuming this is legit, these tensor cores are proving to be a lot more valuable for consumer-grade software than I ever would've expected. However, seeing as these results came straight from Nvidia, I'll take them with a grain of salt. I never take any charts that come directly from the manufacturer seriously. I'm glad this series isn't as boring as everyone predicted. I know AMD is planning on some pretty hefty changes but I'm not quite sure how they're supposed to compete with this. I hope they do though, because the price point of these new GPUs is unattractive to me. I'm not upgrading until 4K-capable GPUs don't cost more than the rest of my PC combined (at MSRP), including the GPU that's already in there...
ezodagrom:

His previous rumour had the RTX 2070 with 7GB of VRAM, and had nothing about a RTX 2080 Ti.
Yeah, he likes to contradict himself.
Every AI graphics solution I know of has artifacts... Am I the only one thinking DLSS will introduce artifacts? I mean, compromises have to be made, as I see it.
So if we take the average of 20 and 50, it's 35%. The 1080 Ti was 34-35% faster than the 1080, so that would mean the 2080 is on par with the 1080 Ti. After the last two NV generations, this would be a disappointing result (the 970 nearly beat the 780 Ti, and the 1070 easily stepped over the 980 Ti).
Administrator
OrdinaryOregano:

Was it enabled outside of the game's own settings? I mean to ask: how have you determined that it's not game-specific? It's AI-based, so I imagine it needs to be trained on ground truth?
It is an algorithm; the setting, in the end, will be available in the NV driver properties with a slider. Currently 2x DLSS equals (roughly) TAA. It's not a 100% perfect supersampling AA technology, but it's pretty good, I must say. Considering you run it on the tensor cores, the shader engine is offloaded. So you're rendering a game with the performance of no AA, as DLSS runs on the tensor cores.
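A back-of-the-envelope way to picture that offload, assuming (as described above) the DLSS pass runs on the tensor cores and can overlap with shader work instead of adding to it; all timings below are invented for illustration.

```python
# Invented per-frame timings in milliseconds, purely illustrative.
shader_no_aa = 10.0   # shader engine renders the frame without AA
taa_cost     = 2.5    # TAA adds shader-engine time on top
dlss_cost    = 2.0    # DLSS runs on the tensor cores instead

frame_taa  = shader_no_aa + taa_cost       # TAA: serial on the shader engine
frame_dlss = max(shader_no_aa, dlss_cost)  # DLSS: hidden if it fully overlaps
print(f"TAA frame: {frame_taa} ms, DLSS frame: {frame_dlss} ms")
```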
Administrator
SmootyPoody:

Every AI graphics solution I know of has artifacts
So does every AA methodology. Remember, DLSS is AA at pretty much no additional cost in the rendering engine.
Administrator
asturur:

So tensor cores are not a Quadro exclusive? We get them too?
That I can confirm, yes. You get the shader cores, RT cores and Tensor cores enabled.
Anarion:

Apples to oranges. DLSS is more like AI-assisted upsampling, which allows you to run a higher resolution at much higher speed, because they use AI to generate a higher-resolution image from a lower one.
But that's what they are comparing there.
Still waiting on third parties, but by the looks of it, without DLSS it's similar performance to a 1080 Ti, which isn't that impressive... Though with DLSS it seems to be a lot stronger. The big question, though, is whether DLSS is worth it, especially at 4K, where I believe AA isn't as important; how much should it be valued? Still, I look forward to seeing how a 2080 Ti performs, but for the price tag I sure hope it doubles the 1080 Ti's performance in at least the majority of games... It would be nice if HH (if you have the time) could run a test using different AA methods and their performance on the 1080 vs the 2080, to see how much AA matters and what the performance gap is between the two. Would be rather interesting to see.
Hilbert Hagedoorn:

So does every AA methodology. Remember, DLSS is AA at pretty much no additional cost in the rendering engine.
Assume this is because the tensor cores, which aren't being used for rendering, are being used for the DLSS AA instead... Can you then have DLSS and ray tracing at the same time, or will a hit happen then?