New NVIDIA GPU named in Reuters report, called Turing

@Tj and how many times in the past 10 years did we see the "we will beat Nvidia" GPUs that ATI/AMD promised? Not saying it won't happen at all, but the past shows this is unlikely. They did pretty well getting their act together when it comes to drivers/software, but as long as their cards need more power and run hotter (and hence noisier) to deliver the same performance, while costing the same (or a bit less), I don't see any appeal in switching.
fry178:

@Tj and how many times in the past 10 years did we see the "we will beat Nvidia" GPUs that ATI/AMD promised? Not saying it won't happen at all, but the past shows this is unlikely. They did pretty well getting their act together when it comes to drivers/software, but as long as their cards need more power and run hotter (and hence noisier) to deliver the same performance, while costing the same (or a bit less), I don't see any appeal in switching.
Well, the R9 290X was actually a very good card, easily beating the 780 and even trouncing the Titan in some cases. The problem was the inefficient reference cooler, resulting in a very loud card (frankly, I have no idea why blower cards are still used by both companies). Also, AMD's not the only one with driver problems - in the past, Nvidia released drivers which actually burned out or destroyed the GPU and/or other components. IMO, AMD GPUs have always gotten less respect from gamers like us - we often complain about AMD giving up on gamers, but we gave up on them first. I too am guilty of this, preferring cards from team green of late (although I bought a couple of AMD cards last year, it wasn't for gaming). This is the only positive I see from the mining craze, as miners are buying out cards from both companies and making Polaris and Vega into something of a financial success.
D3M1G0D:

I have no idea why blower cards are still used by both companies
Because non-blower cards suck in small form factors
fry178:

and how many times in the past 10 years did we see the "we will beat Nvidia" GPUs that ATI/AMD promised? Not saying it won't happen at all, but the past shows this is unlikely.
But seriously - how many times was this ACTUALLY promised? I don't remember this ever happening, beyond belligerent fanboyism. AMD on the CPU end has had a lot of broken promises and cherry-picked results, but on the GPU end I don't recall their claims ever getting too ambitious. Anyway, there have been times when AMD had an objectively better product than Nvidia. It's just that those moments weren't common, and they were usually short-lived.
Can it pass the Turing test?
Agent-A01:

Because non-blower cards suck in small form factors
That might be so, but bigger cards that aren't in small form factor cases suck with a blower.
schmidtbag:

Not quite - by that analogy, the GPU would have to be named something like "fritz" or "electromagnetic pulse". Besides, they already used Volta, so Ampere, Watt, or Joule would make sense as successors, since they're all related to electricity or power.
Think we will ever see a GPU architecture named Oppenheimer?
schmidtbag:

But seriously - how many times was this ACTUALLY promised? I don't remember this ever happening, beyond belligerent fanboyism. AMD on the CPU end has had a lot of broken promises and cherry-picked results, but on the GPU end I don't recall their claims ever getting too ambitious.
"several months ahead of competition" - Polaris "Overclocking Champion" - Fury X "poor Volta" - Vega It's not like the crazy need too much encouragement - usually a tweet is enough.
Noisiv:

"several months ahead of competition" - Polaris "Overclocking Champion" - Fury X "poor Volta" - Vega It's not like the crazy need too much encouragement - usually a tweet is enough.
Mind citing who said those things? I never heard anyone say any of those things. Also, unless it's someone very high up in AMD (like Lisa Su) or an engineer, I'd take anything anyone says on Twitter with a grain of salt.
First quote is from Raja. The 3rd is from AMD's countdown site for the Vega launch. The 2nd is also from official channels - either Lisa or promo material for the Fury X, can't remember which tbh. You already made me google the first two, you find this one 🙂
D3M1G0D:

Well, the R9 290X was actually a very good card, easily beating the 780 and even trouncing the Titan in some cases. The problem was the inefficient reference cooler, resulting in a very loud card (frankly, I have no idea why blower cards are still used by both companies). Also, AMD's not the only one with driver problems - in the past, Nvidia released drivers which actually burned out or destroyed the GPU and/or other components.
Mmm... if I remember correctly, the 290 did not beat the 780 until maybe 12 or 18 months down the line. Awesome card if you are a cheapskate like me LOL. It repaid itself many, many times over. "Loud" comes close to describing it, but... not really. The first time its fan jumped to 80%, I fell off my chair!! 😀 😀
Noisiv:

First quote is from Raja. 3nd is from AMD's countdown site for Vega launch 2nd is also from official channels. Either Lisa, or promo material for Fury X, cant remember which tbh. You already made me google for first two, you find this one 🙂
1st quote said "transition", not "competition". What Raja was implying in this context was that AMD had (in his opinion - keep in mind he said "we believe") better priorities than Nvidia with Polaris. To paraphrase, he's basically saying "Polaris is more forward-thinking than Nvidia's self-driving car chips", which I don't really agree with, but it's a lot more realistic of him to say that than "Polaris is more competitive than what Nvidia will launch".

I tried googling the 2nd quote and couldn't find anything useful; mostly just forum posts.

I'll give you the 3rd one - that was a bit braggy, and I confirmed it is officially from AMD's marketing team. So, shame on them for that. On the other hand... Vega can be a modest competitor against Volta when accounting for price and compute performance, but to say "poor Volta" is a bit cringy, considering Volta tends to win by a wide margin in most tests.
The cringiest thing I've heard was said during a live conference, by Raja. Something like "no one knows what they're doing haha", implying that Nvidia was bumbling with AI. Which, coming from a smart and likable guy, was kinda weird. Then he basically apologized for not recognizing AI earlier. So what NV is doing with AI is raking in hundreds of millions, while he's apologizing. It was so weird.
This might just be the time to turn in my trusty RX 580... Or should I wait for something midrange from AMD again?
Fine, they can call it Turding for all I care, as long as it costs euros and not kidneys.
Noisiv:

Mmm... if I remember correctly, the 290 did not beat the 780 until maybe 12 or 18 months down the line. Awesome card if you are a cheapskate like me LOL. It repaid itself many, many times over. "Loud" comes close to describing it, but... not really. The first time its fan jumped to 80%, I fell off my chair!! 😀 😀
Well, if the benchmarks at AnandTech are anything to go by, it beat the 780 from the get-go. It ran twice as loud though, so AMD developed a reputation for making loud and hot cards. It then got caught up in the initial Bitcoin mining craze and gamers weren't able to buy them anymore (followed by the mining crash, which burned AMD badly). Bad cooling decisions and an unfortunate series of events mean that it's not remembered too fondly by enthusiasts (or by AMD). At any rate, I'm just pointing out that AMD has made competitive products in the not-so-distant past. Polaris could have been competitive as well, but for some unknown reason RTG decided to abandon the high-end market by limiting it to 150 watts (IMO, this will go down in history as one of their biggest mistakes).
Agent-A01:

Because non-blower cards suck in small form factors
Just to add to that, non-blower cards are not good for SLI either.
D3M1G0D:

Well, if the benchmarks at AnandTech are anything to go by, it beat the 780 from the get-go. It ran twice as loud though, so AMD developed a reputation for making loud and hot cards. It then got caught up in the initial Bitcoin mining craze and gamers weren't able to buy them anymore (followed by the mining crash, which burned AMD badly). Bad cooling decisions and an unfortunate series of events mean that it's not remembered too fondly by enthusiasts (or by AMD). At any rate, I'm just pointing out that AMD has made competitive products in the not-so-distant past. Polaris could have been competitive as well, but for some unknown reason RTG decided to abandon the high-end market by limiting it to 150 watts (IMO, this will go down in history as one of their biggest mistakes).
The reason was probably money - I've seen cost estimates of roughly $150M per chip SKU for manufacturing/bring-up/marketing/the entire package. AMD was ramping up Ryzen at the time, so they probably had to make cuts and figured that if Polaris could serve 85% of the market, it wouldn't be too much of a loss to bow out of the high end for a year.
-Tj-:

and the Vega refresh should be just enough to rival the full GP102, and that, OC'ed, can almost reach Volta perf. So it's kind of logical NV will make something faster than that.
Do you still believe in fairy tales (AKA marketing BS / bro / fanboy posts in forums)? When was the last time an AMD GPU had a good OC margin? The 290X? On AMD OC:
- AMD is stubborn about HBM/HBM2 for gaming GPUs (now mining GPUs). Memory factory clocks and OC clocks are almost the same.
- The chip clock is already near its max from the factory, to try to reach something near the competition's performance (completely losing its power efficiency in the process).
At this point (and for some generations now), the only competition Nvidia has in gaming GPUs is... Nvidia itself.
I don't see Nvidia calling their new GPU "Ampere". There is a scooter company called Ampere, and there's also a server chip company called Ampere, so that name is well used up. I think Volta is still viable since it has been on the Nvidia roadmap since 2013. So why name it after some obscure scientist who was only relevant for a couple of years during the Second World War? And why make all these roadmap slides with Volta on them, only to change the name at the last minute and confuse everyone? You also expose the company to ridicule - when one of the Turing cards fails, it will be said that it committed suicide. What are they going to do, put a sticker on it that says Turing, or dumbass? Here is a new rumour: I hear AMD's new GPU will be based on their newly named "Lamprey" platform.
Next month? God dammit! I've waited too long to upgrade my GPU - the itch is so bad that my body is just one big rash by now! I've saved enough money to get the top card they release. I'M READY.