NVIDIA Congratulates Turing - But Might Delay New Upcoming Turing Graphics Cards


Actually, Turing cracked the Enigma cipher, which was used by the German military during WW2 😉
Could be that's why they named the next cards Turing. The release date is as much a cipher as the Enigma code.
Administrator
AlmondMan:

Actually, Turing cracked the Enigma cipher, which was used by the German military during WW2 😉
Subtle difference that slipped past me before my morning coffee; adding that 😉
alanm:

Could be that's why they named the next cards Turing. The release date is as much a cipher as the Enigma code.
...which it seems they still haven't cracked yet... Ergo the delaying/postponing. :P
I assume only a few of you can't enjoy the 1080s sitting in your cases. If Nvidia were going to release a new architecture design based on the 12nm fab process, how much bigger could the gain really be? The math is simple here. Try to stay calm and enjoy one of the most durable architectures Nvidia has ever made. Personally, this month I started my third year with the 1080 FE and I am still impressed by what this card can do... try to do the same 🙂 Peace!
Michal Turlik 21:

I assume only a few of you can't enjoy the 1080s sitting in your cases. If Nvidia were going to release a new architecture design based on the 12nm fab process, how much bigger could the gain really be? The math is simple here. Try to stay calm and enjoy one of the most durable architectures Nvidia has ever made. Personally, this month I started my third year with the 1080 FE and I am still impressed by what this card can do... try to do the same 🙂 Peace!
It's still far from satisfying my X34 @ 3440x1440 100 Hz, not to mention Nvidia is pushing stuff like 4K 120 Hz monitors.
Michal Turlik 21:

I assume only a few of you can't enjoy the 1080s sitting in your cases. If Nvidia were going to release a new architecture design based on the 12nm fab process, how much bigger could the gain really be? The math is simple here. Try to stay calm and enjoy one of the most durable architectures Nvidia has ever made. Personally, this month I started my third year with the 1080 FE and I am still impressed by what this card can do... try to do the same 🙂 Peace!
Has it been that long? 😛 I'm also entering my 3rd year with Pascal in general, but only now entering (next month) my 2nd year with my Ti. And I couldn't be any happier with it. And it's ready for its block, hopefully this week too! 🙂
D3Master:

It's still far from satisfying my X34 @ 3440x1440 100 Hz, not to mention Nvidia is pushing stuff like 4K 120 Hz monitors.
I understand the enthusiast point of view, but I am also sure that at Nvidia headquarters they try to stay within the 250-watt TDP limit for their gaming cards, so how much power would the new theoretical 12nm architecture need to be capable of 120 Hz at 4K? At the moment it seems that only the Titan V is capable of doing what you would like to see in the mainstream segment.
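For rough context on the 4K 120 Hz ask above, a back-of-the-envelope sketch in Python; it assumes only the resolutions and refresh rates mentioned in this thread, and of course real game performance does not scale linearly with pixel count:

# Rough pixel-throughput comparison (a sketch, not a benchmark).
def pixel_rate(width, height, hz):
    """Pixels the GPU must deliver per second at a given resolution/refresh."""
    return width * height * hz

x34 = pixel_rate(3440, 1440, 100)      # 3440x1440 ultrawide @ 100 Hz, as in the post above
uhd120 = pixel_rate(3840, 2160, 120)   # 4K @ 120 Hz

print(f"3440x1440 @ 100 Hz: {x34 / 1e6:.0f} Mpix/s")
print(f"3840x2160 @ 120 Hz: {uhd120 / 1e6:.0f} Mpix/s")
print(f"ratio: {uhd120 / x34:.2f}x")   # roughly 2x the raw pixel throughput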
AlmondMan:

Actually, Turing cracked the Enigma cipher, which was used by the German military during WW2 😉
Eeerghh, no he didn't, someone else did that. Stop learning history from movies, mate 😀 Check here for a start 🙂: https://en.wikipedia.org/wiki/Enigma_machine
I wonder if Nvidia will scrap 12nm and go straight to 7nm then?
If Nvidia delays these GPUs in order to refine their performance, I'd be fine with that.
JamesSneed:

I wonder if Nvidia will scrap 12nm and go straight to 7nm then?
I don't think Nvidia was ever intending to ship 12nm gaming GPUs. The only 12nm variant that brings anything is 6.5T, with its roughly 10% increase in switching speed/power. Their FFN variant was for reticle sizes above 600 mm², and that was only available in 7.5T, which brings no other benefits over 16nm. I never understood why people seemed so convinced that next gen would be 12nm when 7nm has been slated for mass production in 1H of this year for over two years now.
JamesSneed:

He led the team, Hut 8, that cracked the cipher. https://en.wikipedia.org/wiki/Alan_Turing
It states right after the HUT 8 part that he improved on what the Polish had already done:
Here he devised a number of techniques for speeding the breaking of German ciphers, including improvements to the pre-war Polish bombe method, an electromechanical machine that could find settings for the Enigma machine.
The Polish originally cracked specific implementations of the Enigma machines - the Germans improved them over time, and the method the Polish were using became ineffective as complexity increased. Turing stepped in and came up with a more "generalized" method that sped up the process and made it effective again.
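For anyone curious what the crib-based approach discussed here exploited in practice: an Enigma machine never encrypts a letter to itself, so a suspected plaintext fragment (a "crib") can only line up with the ciphertext at offsets where no letter coincides. Below is a minimal illustrative sketch of just that positioning trick; the ciphertext and crib are made up, and the Bombe itself did far more than this:

# Illustrative crib positioning (hypothetical ciphertext and crib, not real traffic).
def possible_crib_positions(ciphertext: str, crib: str):
    # Enigma never maps a letter to itself, so a crib cannot sit at any offset
    # where one of its letters equals the ciphertext letter in the same slot.
    positions = []
    for offset in range(len(ciphertext) - len(crib) + 1):
        window = ciphertext[offset:offset + len(crib)]
        if all(c != p for c, p in zip(window, crib)):
            positions.append(offset)
    return positions

ciphertext = "QFZWRWIVTYRESXBFOGKUHQBAISE"   # made-up intercept
crib = "WETTERBERICHT"                        # "weather report", a classic suspected word
print(possible_crib_positions(ciphertext, crib))  # offsets worth testing further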
I know - how about actually lowering GPU prices now that there is a glut of GPUs on the market! As an added bonus, the stock gets cleared and the new cards get launched!
Pascal cards are still wiping the floor with all the games, so I'm in no rush with my Titan Xps.
Denial:

I don't think Nvidia was ever intending to ship 12nm gaming GPUs. The only 12nm variant that brings anything is 6.5T, with its roughly 10% increase in switching speed/power. Their FFN variant was for reticle sizes above 600 mm², and that was only available in 7.5T, which brings no other benefits over 16nm. I never understood why people seemed so convinced that next gen would be 12nm when 7nm has been slated for mass production in 1H of this year for over two years now.

It states right after the Hut 8 part that he improved on what the Polish had already done: The Polish originally cracked specific implementations of the Enigma machines - the Germans improved them over time, and the method the Polish were using became ineffective as complexity increased. Turing stepped in and came up with a more "generalized" method that sped up the process and made it effective again.
Semantics a bit here. The Polish were "first" and did hand over what they had. The Polish made a method to essentially brute-force the message to determine the keys, whereas Pascal improved that by essentially making a "crib" attack. Also, the Polish never cracked the latest Enigma machines; they were cracking the commercial ones that the Germans used as a starting point (not that they couldn't have, had they not been overrun by the Germans), but the Polish never did crack the improved Enigma machines. So to say Pascal is first isn't wrong, and to say the Polish were first isn't wrong either, because they both had firsts at cracking certain versions of the Enigma machines. From the CIA link included: "When Poland was overrun by Germany in September 1939, the Polish as well as French cryptanalysts shared everything they knew about ENIGMA with the UK, which allowed the cryptanalysts at Bletchley Park, including the famous Alan Turing, to finally crack the ENIGMA ciphers." https://www.cia.gov/news-information/blog/2016/who-first-cracked-the-enigma-cipher.html
JamesSneed:

So to say Pascal is first isn't wrong
No. Tesla was first 😀 😀
I'm seeing some disturbing news re Nvidia's new NDAs. https://videocardz.com/76645/nvidias-new-non-disclosure-agreement-leaked German site Heise.de has refused to sign it, as it's a long-term (5-year) contract which could limit even speculation about technical aspects of Nvidia products unless it's "beneficial" to Nvidia. https://www.heise.de/newsticker/meldung/In-eigener-Sache-Nvidia-NDA-als-Maulkorb-fuer-Journalisten-4091751.html I wonder why now? Could the timing also be related to how Turing may turn out? That it may not live up to expectations?
schmidtbag:

If Nvidia delays these GPUs in order to refine their performance, I'd be fine with that.
They would be stupid not to experiment while waiting for inventory to move. Surprised no one has speculated they could be waiting for better GDDR6 before launching to the masses.
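On the "better GDDR6" point, a quick arithmetic sketch of what faster memory buys in peak bandwidth; the per-pin data rates are the commonly quoted GDDR5X/GDDR6 figures, and the 256-bit bus width is only an assumption for illustration:

# Peak memory bandwidth: bus width (bits) / 8 bits-per-byte * per-pin data rate (Gbps) -> GB/s.
def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 10))  # GTX 1080-class GDDR5X at 10 Gbps -> 320.0 GB/s
print(bandwidth_gb_s(256, 14))  # first-wave GDDR6 at 14 Gbps      -> 448.0 GB/s
print(bandwidth_gb_s(256, 16))  # faster GDDR6 bins at 16 Gbps     -> 512.0 GB/s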