GeForce RTX 2080 and 2080 Ti - An Overview Thus Far

This is a great summary of everything we know so far. Should also point out that Nvidia hinted at the tensor cores being used for more than just ray-tracing acceleration - at SIGGRAPH they mentioned a few new AA methods, ATAA and DLAA, and they also mentioned AI upscaling. No idea if any of that is coming to consumer cards - but I figure if they have the hardware they may as well do value-add features with it. Especially because new rumors are pointing to the GTX 2060 being as fast as a 1080. I feel like they are going to need some nifty features for the RTX series to drive sales.
Great job keeping this concise. AFAIK, tensor cores will be disabled on the consumer line - at least, that's what Nvidia hinted at during the Q2 2017 earnings call - mainly so the AI and deep learning crowd have to buy Quadros and Titans. As it is, AI, deep learning, and big data outfits have been cheating with 1080 Tis on a massive scale (e.g. Google has more of them than the nations of Denmark, Belgium, and the Netherlands combined).
tunejunky:

Great job keeping this concise. AFAIK, tensor cores will be disabled on the consumer line - at least, that's what Nvidia hinted at during the Q2 2017 earnings call - mainly so the AI and deep learning crowd have to buy Quadros and Titans. As it is, AI, deep learning, and big data outfits have been cheating with 1080 Tis on a massive scale (e.g. Google has more of them than the nations of Denmark, Belgium, and the Netherlands combined).
The RTX acceleration is based on the Tensor cores. They will probably be crippled for training ops but definitely not for inferencing. Also, I don't know why you think they are "cheating" with 1080 Tis... the 1080 Ti doesn't support TCC mode.
Well done!! 🙂 I wonder if we will actually see those approximate prices.
Denial:

The RTX acceleration is based on the Tensor cores. They will probably be crippled for training ops but definitely not for inferencing. Also, I don't know why you think they are "cheating" with 1080 Tis... the 1080 Ti doesn't support TCC mode.
Only because I've seen them in use at server farms.
tunejunky:

Only because I've seen them in use at server farms.
Idk - it doesn't really make much sense to use 1080 Tis. TCC is completely disabled on them, and even if you could find a way to enable it, the SLI connector isn't going to give you any kind of scalability. Now if you said Titans, that would be a different story - those have TCC enabled - but even there, on the newer Titans, the cluster bandwidth is gimped (NVLink disabled) on purpose, so you're not going to get any kind of good scaling out of them. That's not to mention that GP100/GV100 have other features more geared towards compute workloads: http://images.anandtech.com/doci/10325/PascalVenn.png?_ga=1.83280852.372099904.1468967622 GV100 adds independent thread scheduling and more. I definitely think Nvidia is going to be gimping some features to keep the data center guys on their beefier, more expensive hardware... but AFAIK the Tensor cores power RTX - the denoising runs on them - so the Tensors should 100% be there and at least somewhat functional in that regard.
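For anyone curious how to check this on their own machine, here's a minimal, purely illustrative sketch in Python (the driver_models() helper is hypothetical, not an Nvidia tool). It assumes nvidia-smi is on the PATH; the TCC/WDDM split only exists on Windows, and a GeForce card like the 1080 Ti should report WDDM rather than TCC:

```python
# Sketch: ask nvidia-smi which driver model each GPU is currently using.
# Assumes Windows with nvidia-smi available on the PATH; on Linux the field
# simply comes back as "N/A" since the TCC/WDDM distinction doesn't apply.
import subprocess

def driver_models():
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,driver_model.current", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    rows = []
    for line in result.stdout.splitlines():
        if line.strip():
            name, model = (part.strip() for part in line.split(","))
            rows.append((name, model))
    return rows

if __name__ == "__main__":
    for name, model in driver_models():
        print(f"{name}: {model}")  # e.g. "GeForce GTX 1080 Ti: WDDM"
```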
Nice preview. Although I'm a little surprised by Nvidia releasing a new card with tensor and ray tracing cores at the same time... Seems expensive... Anyway, now all I have to do is wait for the next Christmas giveaway...
Interesting - seems like the 2000 series is definitely more than just a refresh. The 2080Ti is more impressive than I was expecting.
Thanks for the article - an informative overview of what we can expect. I can see myself skipping this new architecture and going with the next one. With the inclusion of the new Ray Tracing Cores as well as the Tensor Cores, I'm thinking these may need time to mature by the release after this one, rather than jumping on the first release, and we also don't know how important or unimportant these new cores will be, so it's more of a risk to jump on them when games aren't developed for them yet. I'm likely to extract a bit more value from my GTX 1070, and only upgrade it if I can't maintain 1080p/144Hz at decent settings in upcoming online shooters (like BF V). I'm excited to read the reviews for these cards though, learn about them, and see the performance breakdowns!
Good stuff boss. Technology is moving so fast, my wallet can't take any more hits. Please make it stop :O
schmidtbag:

Interesting - seems like the 2000 series is definitely more than just a refresh. The 2080Ti is more impressive than I was expecting.
It'll be the 20 series (their current line is called the 10 series). Assuming that the products are real, I might just pick up a 2080 to play around with. I'm guessing it'll be sold out for months though (like with the 1080) so I might have to wait until next year to pick one up.
Yah, I kind of hope that this release is just the 2080/Titan and not the Ti. I'll buy Titans at this point just because I know that the price is prohibitive and there probably won't be a big supply shortage, whereas you can be pretty much guaranteed that the 2080 is going to be very hard to come by, especially if it has any discernible gains where mining is concerned.
Robbo9999:

Thanks for the article - an informative overview of what we can expect. I can see myself skipping this new architecture and going with the next one. With the inclusion of the new Ray Tracing Cores as well as the Tensor Cores, I'm thinking these may need time to mature by the release after this one, rather than jumping on the first release, and we also don't know how important or unimportant these new cores will be, so it's more of a risk to jump on them when games aren't developed for them yet. I'm likely to extract a bit more value from my GTX 1070, and only upgrade it if I can't maintain 1080p/144Hz at decent settings in upcoming online shooters (like BF V). I'm excited to read the reviews for these cards though, learn about them, and see the performance breakdowns!
I agree somewhat - it'll be a few generations until we see what these new features bring us and how well they're implemented. The problem I see with the ray-tracing processing being leveraged (I would argue it's simplified ray tracing - a low sample count with AI processing on top) is that it's exclusive to Nvidia through their GameWorks platform, and if the rumours are true there will be further segmentation within Nvidia's own hardware as well (GTX vs RTX), so only the very top-end cards will have this capability. The problem this brings, as we've seen with anything that has GameWorks integration, is that only a few games get the enhanced graphics, and fairly often - particularly when games are ported from consoles (which really shouldn't be difficult anymore, given the current-gen consoles are x86) - with crippling results on hardware from both of the main manufacturers, and with only high-end Nvidia cards just about able to show the results.

Let's also be honest here: the majority of games available today are made for the consoles; the only difference is that PC gaming allows for various configurations of quality (usually for the better, of course). But what will be the point of this new technology if it's going to be inaccessible to the majority? Perhaps I'm being cynical, but I can't help fearing we're going to see more of the same - PhysX and GameWorks being locked away - and with the ray-tracing package once again being part of GameWorks, stifling the innovation that Nvidia are actually providing. It makes me think of the quote "the left hand doesn't know what the right hand is doing" - in this case the left hand of Nvidia is the innovation, making great hardware and technology, but the right hand is the greed and monopoly, keeping it locked away and not giving the innovation what it deserves.

So what do I mean by all this? While I can appreciate that people will be expected to pay high prices for the new features - that's understandable - the problem is we still won't see the innovations for many, many years, not because of the prices themselves but because of the exclusivity of the features, the resulting platform availability and ecosystems, and the rather small market share in the grand scheme of computer gaming. Unless Nvidia have a hand during game development, we're just not going to see the ray tracing unless the next-gen consoles also leverage this hardware.

Tl;dr I just really, really hope Nvidia's innovation in ray tracing won't follow the same history as PhysX.
Great editorial Hilbert, much appreciated. But what about the elephant in the room? Straight to the "2080 Ti" and no word about a 2070?! Guess we'll have to wait till the official release to know more. Someone promised more unboxing videos a while ago 😉 - hope we'll be seeing more of our Don when the time comes :D
BangTail:

I'll buy Titans at this point just because I know that the price is prohibitive and there probably won't be a big supply shortage, whereas you can be pretty much guaranteed that the 2080 is going to be very hard to come by, especially if it has any discernible gains where mining is concerned.
Mining is dead. Ethereum has plunged back down to $300 (this was the price that initially triggered the mining boom so it's come full circle) and there's no sign of a sustained comeback. A few miners may pick up a 2080 out of curiosity, but most will be sold to gamers.
D3M1G0D:

Mining is dead. Ethereum has plunged back down to $300 (this was the price that initially triggered the mining boom so it's come full circle) and there's no sign of a sustained comeback. A few miners may pick up a 2080 out of curiosity, but most will be sold to gamers.
I genuinely hope so.
a lot of "humans" on the internet still cannot convince me that they are humans.
2080 Ti for $800 bucks sounds like a damn good deal! I still couldn't tell what that ray tracing is if it slapped me in the face haha. The graphics looked really nice in the video of course, but again, in plain English, what am I looking for lol?
gx-x:

a lot of "humans" on the internet still cannot convince me that they are humans.
The RTX 2070 will easily be able to simulate AI that is more nuanced than 3/4 of internet humans. PS: if coins that utilize RT cores take off, we're fked. Good thing is they won't appear right away.
nz3777:

2080 Ti for $800 bucks sounds like a damn good deal! I still couldn't tell what that ray tracing is if it slapped me in the face haha. The graphics looked really nice in the video of course, but again, in plain English, what am I looking for lol?
Ray tracing is a simulation of physically correct light paths, with the resulting image being the solution of that simulation. That's unlike traditional rendering, which either starts with a known image and just draws it, or calculates certain parts of the scene, but nowhere near as strictly or physically correctly as ray tracing. Think of RT as a theoretical physicist's answer to the question: what does this scene look like?
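To make that concrete, here's a tiny toy sketch of the idea in Python (purely illustrative, nothing to do with Nvidia's actual RT core implementation): for every pixel, trace a ray from the camera into a scene containing one sphere, find what it hits, and shade the hit point from the light direction, printing the result as ASCII art. Real-time RTX layers hardware BVH traversal, many bounces, and AI denoising on top, but the per-pixel question is the same: what does the light path through this pixel see?

```python
# Toy ray tracer: one sphere, one light, ASCII output. The scene layout and the
# shading ramp below are arbitrary choices for illustration only.
import math

WIDTH, HEIGHT = 40, 20
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, 3.0), 1.0
LIGHT_DIR = (-0.5, 0.7, -0.5)  # direction pointing from the scene toward the light

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance along the ray, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is normalized, so the quadratic's "a" term is 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

light = normalize(LIGHT_DIR)
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Map the pixel to a ray through a simple pinhole camera sitting at the origin.
        px = (x / WIDTH - 0.5) * 2.0
        py = (0.5 - y / HEIGHT) * 2.0
        d = normalize((px, py, 1.0))
        t = ray_sphere((0.0, 0.0, 0.0), d, SPHERE_CENTER, SPHERE_RADIUS)
        if t is None:
            row += " "  # ray escaped the scene: background
        else:
            hit = tuple(t * di for di in d)  # hit point (camera is at the origin)
            n = normalize(tuple(h - c for h, c in zip(hit, SPHERE_CENTER)))
            shade = max(0.0, sum(ni * li for ni, li in zip(n, light)))
            row += ".:-=+*#%@"[min(8, int(shade * 9))]  # brighter where the normal faces the light
    print(row)
```

Running it prints a shaded sphere; swap in more objects, bounce the rays on reflection, and average many noisy samples per pixel and you have the (much, much slower) path-traced version of the same question that RTX hardware is built to accelerate.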