PNY GeForce RTX 2080 and 2080 Ti Product Data Sheet Slips Out

buhehe:

nice chart!
The big drop after 2008 happens to coincide with the launch of Radeon 4000 series from ATI (RV770 chip) https://en.wikipedia.org/wiki/Radeon_HD_4000_series That architecture and the following GCN continued to be a big thorn in nVidia's back all the way to GTX 980, meaning they couldn't just push the price as high as they wanted (780 Ti being an exception, as it was much faster than previous halo product Titan, due to being a full chip with all it's glorious 2880 cuda cores) But with AMD being so slow to release Vega, and being underwhelming, they can do the same thing they did before 2008, just slowly cook our wallets into higher and higher prices. ~~ Imagine that RV770 wasn't a good chip and ATI/AMD was always far behind, providing only low-end products. Look at that chart, and imagine the line that started with GeForce 4 and continued with 6800 and 8800... if it would still go on that slope. The prices on "top" chips today would be over 1300$: http://dl.wavetrex.eu/2018/nv1300.png
wavetrex:

Now the x80 is $800 and the Ti is $1000. Really? In just three generations we've moved from $500 for the high-end mainstream card and $700 for the elite "Ti" to these new amazing prices. What's next? $1000 for the 3080 and $1500 for the Ti version? And I thought I paid way too much for my 1080... before the mining frenzy. I'm keeping it until it breaks.
The Ti letters are sprinkled with magic fairy dust that makes your PC run 10% faster, at least! On top of that it also improves your health and your sexual performance, so as you can see the price increase is totally justified...
I'm loving it. $1000+ is just a crazy number for a gaming GPU. They take the piss out of you and you let them. It will be 2021 before I upgrade from my Vega 64: rock-solid drivers and amazing performance, so long as you avoid overclocking.
alanm:

What other rumors? The 2080 will NOT be weaker than a 1080 Ti; there is no historical precedent. AdoredTV (who seems to have a solid inside source), and the first on the internet to let us know of the RTX 2080, mentioned that the 2080 should be around 8% stronger than the 1080 Ti. That in itself is very disappointing. It may not be accurate, who knows, but it will NOT be weaker than last gen's Ti. That would be a monumental failure on Nvidia's part, and they would get eaten alive by review sites. Secondly, they would have an uphill battle trying to peddle this 'junk hardware' to gamers at the 'rumored' high prices. Please, people, use some common sense when tossing out these so-called 'rumors'. I don't even think they are rumors, just silly forum chatter here and there.
https://www.guru3d.com/articles-pages/geforce-rtx-2080-and-2080-ti-an-overview-thus-far,1.html I mean, if you actually use this forum and read the pages, scroll down a bit and you'll see the 2080 has a rumoured compute limit of 11 TFLOPS, whilst the 1080 Ti has a maximum compute limit of 11.5 TFLOPS; as also stated, it's a bit of speculation on a few data points. Please actually read the articles posted by HH before you run your mouth. Like you, I hope it isn't true, but it is possible. Yes, historically it's never happened... but there is a first time for everything, and if what we know is true, then in theory it might be a tad weaker.
Well, it seems like Nvidia decided to play "tick-tock": architecture now, process shrink later. With the efficiency of their design that is a viable play, more for shareholders than consumers. This buys them time to make sure all is hunky-dory before a 7nm process. But... this leaves an 18-month window for AMD. All they have to do is make sure at least one new GPU scores as well as a 1080 Ti (for less money); with FreeSync, that's a slam dunk for sales. We will see in January.
If this was an AMD product, everyone would be laughing and ridiculing the die size. But it's Nvidia, so you all turn a blind eye. Why is that?
And to be honest, how they are doing this ray tracing is flawed. It will have a shimmer effect, and you can see that in their demo. It will be annoying and people will just turn it off. It takes up die space, so unless devs use it, it's no different from the iGPU on my CPU: wasted silicon. The game engine Nvidia keeps pushing, Unreal, should never become the mainstream game engine, because that's bad for us.
Ricepudding:

2080 has a rumoured compute limit of 11 TFLOPS, whilst the 1080TI has a maximum compute limit of 11.5 TFLOPS
That doesn't mean much. Vega 64 (liquid-cooled) has 13.7 TFLOPS, and we already know that it can barely compete with the 1080 (non-Ti), that is, the Founders Edition, non-overclocked, which has only 8.9 TFLOPS. Comparing the gaming performance of two different architectures (and Turing is a completely new one, according to Nvidia) based solely on TFLOPS is just... silly.

Also, the 11: where is that coming from? The shader count difference is exactly 15%. Since the clocks are similar, if the architecture isn't changed (much), that would result in 10.2 TFLOPS, not 11!

Hey, I'm an Nvidia fan, typing right now on a 1080, looking at a 100 Hz G-SYNC 34" monitor, with a GameWorks-enabled game minimized in the taskbar... but that doesn't mean I should blindly believe that the new gen is going to be a miracle, soundly beating everything before it. Actually, the numbers that have already popped up suggest it will be just a small incremental upgrade, no more than 10-20%, compared to the chips being replaced. I guess we shall find out in 2-3 days what the truth is.
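For reference, the 10.2 TFLOPS figure above is simple peak-throughput arithmetic: one fused multiply-add (2 FLOPs) per shader per clock. A quick sketch, assuming the rumoured 2944-shader count for the 2080 and a Pascal-like ~1733 MHz boost clock:

```python
# Peak FP32 throughput: 2 FLOPs (one FMA) per shader per clock cycle.
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    return 2 * shaders * clock_mhz * 1e6 / 1e12

# GTX 1080 Founders Edition: 2560 shaders at 1733 MHz boost.
print(round(fp32_tflops(2560, 1733), 1))  # -> 8.9

# Rumoured RTX 2080: 2944 shaders (15% more than the 1080),
# assuming the same ~1733 MHz boost clock.
print(round(fp32_tflops(2944, 1733), 1))  # -> 10.2
```

As the thread notes, this is only a peak-rate comparison within one architecture; it says nothing about real gaming performance across different architectures.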
wavetrex:

That doesn't mean much. Vega 64 (Water) has 13.7 TFlops and we already know that it can -barely- compete with 1080 (Non-TI)...
It's HH's estimate, from the page I linked in his information round-up; I'm hoping it's wrong, like I said. Also, don't compare TFLOPS between AMD and Nvidia; it doesn't work like that. Comparing just Nvidia to Nvidia makes more sense, especially when the architecture is very similar (I personally don't think it's totally new), although I agree it isn't a totally apples-to-apples comparison. I'm just going off the chart that HH posted on here.

Though I agree it shouldn't take a miracle card: it's been 28 months since they released the 1080, and we should see more of a difference between the two. A small incremental upgrade would have made more sense 12 months after that release, not 28 and counting. Personally I don't mind too much; a small increase means my 1080 Ti will last longer, so eh... It's just a shame for the industry not to get its normal big leaps in performance. Hopefully we will soon see more of what these cards can do and whether the tensor cores will play a role or not.
Ricepudding:

https://www.guru3d.com/articles-pages/geforce-rtx-2080-and-2080-ti-an-overview-thus-far,1.html I mean if you actually use this forum and read the pages, scroll down a bit and you'll see the 2080 has a rumoured compute limit of 11 TFLOPS, whilst the 1080TI has a maximum compute limit of 11.5 TFLOPS...
Whoops... missed wavetrex's post. TFLOPS is a rough indicator, but it doesn't equate to absolute GPU gaming performance, especially when comparing different architectures. E.g. Vega 64 has 12.6 TFLOPS vs the 1080 Ti's 11.5, but we know where things stand. Similarly, the GTX 780 Ti with 5.3 TFLOPS vs the 980 Ti at 5.63 TFLOPS, yet the performance spread is much larger. Other examples of TFLOPS-vs-performance disparity across architectures exist as well.

Sorry if I had to 'run my mouth', but I cannot believe for a minute that Nvidia would upend its tradition of new card releases that ALWAYS had the no. 2 card beating last gen's flagship. It's like a 'sacrosanct' Nvidia formula. And the kicker is doing it at higher price points? They would not release such a product; in fact, it would make much more sense to have waited for 7nm and continued to milk Pascal.
alanm:

woops... missed wavetrex post... TFLOPs is a rough indicator, but doesnt equate to absolute GPU gaming performance, especially if comparing different arches...
Oh no, I totally agree; maybe I came off as a bit too disgruntled. What I meant was, although it's not apples to apples, it can be a good place to try to gauge performance changes, especially when the architecture seems quite similar to Pascal, though as I mentioned, the tensor cores might play a bigger role in the difference in performance we end up seeing.

My big worry, or issue, is that the 2080 should be more like a 2070 (normally the xx70 equals or beats the previous Ti card, give or take), but that doesn't seem to be the case, at least by what we know. And yeah, the prices of GPUs are a big kicker for almost anyone at the moment; I feel bad for first-time buyers, honestly. And with the 2080 Ti not being a massive jump but still asking, price-wise, for that jump... yeah, that's a kicker.

But if the performance jump isn't that great, then I will happily skip this generation, though I'm quite shocked that they might bring out the 2080 Ti already; normally that's around a year after the xx80 cards come out.
RTX on Pascal vs Turing runs a factor of 6 faster on the latter('s tensor cores), so expect Nvidia not only to coax game developers into implementing ray tracing to promote Turing, but also to push loyal Pascal users to buy new cards they otherwise definitely wouldn't have needed yet. The impact on AMD GPUs remains to be seen; currently they're 'dreadfully' slow in comparison to the 1080 Ti, but Vega has Rapid Packed Math, so who knows what's going to happen. Maybe the number of tensor cores on Turing is overkill for actual RTX/DXR implementations, so a Vega 56 suddenly ends up twice as fast as a Titan Xp in next year's ray-tracing games because the ray tracing won't hit a bottleneck, without the Vega 56 necessarily being fast, LOL... Interesting times ahead. Ray tracing in Vulkan should apparently work on both Nvidia and AMD, so who knows... maybe the 7nm Vega really is a threat to the RTX Titan, as AdoredTV's source was insinuating.

edit: was it a factor of 6, or was it worse... now I have to check the news again...
NDAs apparently don't mean anything anymore; the leaks are like Swiss cheese.
Two more days; hopefully things will be much clearer.
This misconception that Vega 64 is slower than the 1080 needs to go away. Clearly the propaganda guys managed to plant that lie. Go and play Frostpunk, GTA V, Sniper Elite 4, Project CARS 2, or Total War: Warhammer. Or keep cultivating the lie if you wish; I'm happy for your ignorance.
Nvidia should bring 512-bit and 448-bit buses back in the future. My old GTX 560 1GB had a 256-bit bus. Now my new GTX 1050 Ti 4GB has a 128-bit bus, lol. An RTX 2080 Ti with a 352-bit bus is a joke.
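Worth noting that bus width alone doesn't determine memory bandwidth; the per-pin data rate matters just as much. A quick sketch, assuming reference memory speeds (roughly 4.0 GT/s GDDR5 on the GTX 560, 7.0 GT/s GDDR5 on the 1050 Ti, and 14 GT/s GDDR6 on the 2080 Ti):

```python
# Bandwidth (GB/s) = bus width in bytes x effective data rate (GT/s).
def bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    return bus_bits / 8 * data_rate_gtps

print(bandwidth_gbs(256, 4.008))  # GTX 560, 256-bit GDDR5     -> ~128 GB/s
print(bandwidth_gbs(128, 7.008))  # GTX 1050 Ti, 128-bit GDDR5 -> ~112 GB/s
print(bandwidth_gbs(352, 14.0))   # RTX 2080 Ti, 352-bit GDDR6 -> 616 GB/s
```

So the faster GDDR5 largely (though not quite) makes up for the 1050 Ti's halved bus, while the "narrow" 352-bit 2080 Ti still ends up with nearly five times the GTX 560's bandwidth, and newer architectures add memory compression on top.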
The RTX 2080 has fewer cores than the GTX 1080 Ti? So the 1080 Ti will still win on performance? Nah???
What are you guys smoking? The 670 is faster than the 1050 non-Ti! And the 770 trades blows with the 1050 Ti, winning slightly until you factor in power consumption, where the 1050s destroy them!