Nvidia Turing GeForce 2080 (Ti) architecture review
PrMinisterGR
It all depends on the quality of the model training. Nvidia always exaggerates, so keep your expectations in check.
It is still impressive and the only logical way forward, both the machine learning and the partial ray tracing. If AMD were in better financial shape, we would have had these a bit earlier.
People also forget that we will have an Intel GPU in a couple of years. Things are bound to be more interesting then.
chispy
That's a lot of new tech packed into these new video cards. Great review @Hilbert Hagedoorn. It looks good, but that price is what turns off many people. Nevertheless, it's great to see new technology moving forward.
pharma
Any new RTX tech/process adopted by competitors will need to be different so as not to infringe on Nvidia's hardware and software IP. In that respect it may take longer to develop than it would to license.
Reddoguk
I personally think these first RTX cards will be a flop. They will sell, but people have already realized that these cards with RT on won't deliver the performance they expected.
Real-Time Ray Tracing might be a cool new feature but the hardware to run that type of tech is not here yet.
I don't care too much about Ray Tracing if you can only hit 30-60fps at 1080p. 🙁
Solfaur
Strictly aesthetically speaking, I really dislike the MSI Gaming this time around; the design is a mess. I wonder how much better the cooling is compared to the Duke, which I think looks a lot better. And of course I'm looking forward to all the Turing reviews like everybody else, despite the fact that I'll most likely skip this generation. Nice preview, boss.
Caesar
https://image.ibb.co/b2QU0U/caesar2.jpg
"The lines up top marked “1 Turing frame” are the key. For Nvidia’s other GPUs, it would consist of just one portion—the yellow FP32 line. But it’s more complicated in Turing.
While that standard FP32 shader processing takes place, the dedicated RT cores and integer pipeline are executing their own specialized tasks at the same time. Once all that’s done, everything’s handed off to the tensor cores for the final 20 percent of a frame, which perform their machine learning-enhanced magic—such as denoising ray traced images or applying Deep Learning Super Sampling—in games that utilize Nvidia’s RTX technology stack."
Source: PC WORLD
https://image.ibb.co/euKuZp/caesar.jpg
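If it helps to picture the scheduling described above, here is a rough, purely illustrative Python sketch. The stage timings are made up for the example; only the structure (FP32, INT and RT work overlapping, then a tensor-core pass at the end of the frame) follows the quoted description.

```python
# Illustrative sketch only: hypothetical per-stage timings for one frame.
# The numbers are invented; only the overlap structure follows the quote above.

def turing_frame_time(fp32_ms, int_ms, rt_ms, tensor_ms):
    """Concurrent FP32/INT/RT work, followed by a tensor-core pass (denoise/DLSS)."""
    shading_phase = max(fp32_ms, int_ms, rt_ms)   # these stages run side by side
    return shading_phase + tensor_ms              # tensor work closes out the frame

def serial_frame_time(fp32_ms, int_ms, rt_ms, tensor_ms):
    """Without separate INT/RT/tensor units, everything would serialize onto FP32."""
    return fp32_ms + int_ms + rt_ms + tensor_ms

# Hypothetical milliseconds, chosen so the tensor pass is ~20% of the frame:
fp32, integer, rt, tensor = 10.0, 4.0, 8.0, 2.5
print(f"Turing-style frame: {turing_frame_time(fp32, integer, rt, tensor):.1f} ms")  # 12.5 ms
print(f"Fully serial frame: {serial_frame_time(fp32, integer, rt, tensor):.1f} ms")  # 24.5 ms
```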
StewieTech
I can't afford this. I'm poor 🙁 *sad, crying face*
edit: also, good stuff Hilbert, thanks a lot :´)
Caesar
https://media1.tenor.com/images/c1847062cd9d4fc0ffe9e7f51936ff15/tenor.gif?itemid=6140316
😳 ...when RTX is out, I will try to mod the BIOS for the Tensor cores [Joke]
wavetrex
Somehow I feel that all the effort will be thrown on the backs of game developers, who will need to recode their engines specifically for Turing to take advantage of the new stuff.
And because the market for these will be so tiny in the first year or so... they won't bother, unless NV pays them big to do it.
Without all the new fancy mumbo-jumbo in the game code, the mighty RTX cards will probably be, as all estimates show so far, only 15-25% faster than their Pascal counterparts, judging by the raw number of FP32 cores, and that only at 4K. Lower resolutions will see even less gain.
Less than a week left, we'll see.
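For what it's worth, that "raw number of FP32 cores" estimate lines up with simple arithmetic on the commonly cited CUDA core counts. A quick back-of-envelope sketch (ignoring clocks, IPC and memory bandwidth, so take it loosely):

```python
# Back-of-envelope comparison of CUDA core counts, nothing more.
cards = {
    "RTX 2080 vs GTX 1080":       (2944, 2560),
    "RTX 2080 Ti vs GTX 1080 Ti": (4352, 3584),
}

for name, (turing_cores, pascal_cores) in cards.items():
    gain = (turing_cores / pascal_cores - 1) * 100
    print(f"{name}: {gain:+.0f}% more FP32 cores")

# RTX 2080 vs GTX 1080:       +15% more FP32 cores
# RTX 2080 Ti vs GTX 1080 Ti: +21% more FP32 cores
```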
Hilbert Hagedoorn
Administrator
StewieTech
https://static1.fjcdn.com/comments/I+feel+you+bro+_c348de8ec3c4bee27f1379f2061a9035.jpg
DSLore
I've ordered the Duke 2080 Ti; after seeing it in the case on here... wow, I'm really happy I did 😀
PrMinisterGR
Fox2232
wavetrex
varkkon
Awesome! Thanks for the write-up and pics. I really love the look of the stock cards, the best they have ever been by a long shot. Myself, I have my eye on the MSI RTX 2080 Ti Gaming TRIO, it looks really nice.
I'm dying to read your mega review and see how the Ti does; I really hope it will be 50% over a 1080 Ti. We'll see soon what the 411 is, happy testing/reviewing! 🙂
[ Edit ] What the heck, I just noticed the MSI RTX 2080 Ti Gaming TRIO has 3 power inputs!? Is that a first? It has 2x 8-pin and 1x 6-pin, that is crazy!
PrMinisterGR
Their approach makes sense guys.
tunejunky
PrMinisterGR
This doesn't feel like a rush job at all. Possibly doing this at "12nm" was, but this could have been the silicon after Maxwell.
tunejunky