GeForce RTX 2080 and 2080 Ti - An Overview Thus far





It has been a long time coming, but NVIDIA is ready to announce its new consumer graphics cards next week. Going by past launches, the company will start at the high end with what would traditionally be a GeForce GTX card, though I believe it will be called the GeForce RTX 2080. In this quick write-up I wanted to take a peek at what we're bound to expect, as the underlying GPU was in fact already announced this week at SIGGRAPH.
Read article
tunejunky
Senior Member
Posts: 2360
Posted on: 08/17/2018 05:07 PM
Great job keeping this concise.
AFAIK, tensor cores will be disabled on the consumer line; at least, that's what Nvidia hinted at during its Q2 2017 earnings call, mainly so AI and deep learning customers have to buy Quadros and Titans. As it is, AI, deep learning, and big data outfits have been cheating with 1080 Tis on a massive scale (e.g. Google has more of them than Denmark, Belgium, and the Netherlands combined).
Denial
Senior Member
Posts: 13757
Posted on: 08/17/2018 05:09 PM
tunejunky wrote:
Great job keeping this concise.
AFAIK, tensor cores will be disabled on the consumer line; at least, that's what Nvidia hinted at during its Q2 2017 earnings call, mainly so AI and deep learning customers have to buy Quadros and Titans. As it is, AI, deep learning, and big data outfits have been cheating with 1080 Tis on a massive scale (e.g. Google has more of them than Denmark, Belgium, and the Netherlands combined).
The RTX acceleration is based on the tensor cores. They will probably be crippled for training ops, but definitely not for inferencing. Also, I don't know why you think they are "cheating" with 1080 Tis; the 1080 Ti doesn't support TCC mode.
Embra
Senior Member
Posts: 1300
Posted on: 08/17/2018 05:09 PM
Well done!
I wonder if we will actually see prices anywhere near those.
tunejunky
Senior Member
Posts: 2360
Posted on: 08/17/2018 05:10 PM
Denial wrote:
The RTX acceleration is based on the tensor cores. They will probably be crippled for training ops, but definitely not for inferencing. Also, I don't know why you think they are "cheating" with 1080 Tis; the 1080 Ti doesn't support TCC mode.
Only because I've seen them in use at server farms.
Denial
Senior Member
Posts: 13757
This is a great summary of everything we know so far. I should also point out that Nvidia hinted at the tensor cores being used for more than just ray-tracing acceleration: at SIGGRAPH they mentioned a few new AA methods, ATAA and DLAA, and they also mentioned AI upscaling. No idea if any of that is coming to consumer cards, but I figure if they have the hardware they may as well do value-add features with it. Especially since new rumors point to the GTX 2060 being as fast as a 1080, I feel like they are going to need some nifty features for the RTX series to drive sales.