Rumor: NVIDIA GeForce Ampere to be fabbed at 10nm, all cards RTX?
We'll probably go back and forth a bit on the topic of Ampere until NVIDIA lifts the veil of mystery. The expectation was that NVIDIA's upcoming GPUs would be fabbed on a 7nm node. However, that fabrication node is heavily over-utilized, and NVIDIA could be reverting to a 10nm process baked by Samsung.
It's once again Twitter user KittyCorgi (who has a bit of a questionable reputation for these things, as he never explains how he obtained the info) who claims to have information about Ampere GPUs. Remember, this is just some guy posting unvalidated stuff from a Twitter account created in January. Anyhow, all cards would support ray tracing and would thus be fabricated on a different process than expected, 10nm. We do hope to see some announcements during an NVIDIA GDC live stream presentation from mister leather jacket himself, as yeah... it's about time, eh? We kind of expect an announcement on the GA102 GPU; let's call it GeForce RTX 3080 Ti for now.
Rumored for the consumer products would be a GPU with 84 shader units, resulting in 5376 shader cores, paired with 12 GB of graphics memory on a 384-bit bus. Based on that bus width we can deduce the memory type: GDDR6. KittyCorgi also mentioned that the GA102 is the only multi-GPU compatible product and that, at best, a 40% uplift in performance can be expected.
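For what it's worth, the rumored numbers are at least internally consistent: 84 shader units at NVIDIA's usual 64 FP32 cores per SM works out to exactly 5376 cores. A quick sanity check of the math (note the 14 Gbps GDDR6 data rate below is our assumption based on typical Turing-era memory, not part of the leak):

```python
# Sanity-check the rumored GA102 figures.
sm_count = 84            # rumored shader units (SMs)
cores_per_sm = 64        # NVIDIA's usual FP32 cores per SM since Turing
cuda_cores = sm_count * cores_per_sm
print(cuda_cores)        # 5376, matching the rumor

bus_width_bits = 384     # rumored memory bus width
gddr6_gbps = 14          # ASSUMPTION: typical GDDR6 per-pin data rate, not from the leak
bandwidth_gb_s = bus_width_bits / 8 * gddr6_gbps
print(bandwidth_gb_s)    # 672.0 GB/s theoretical peak at that data rate
```

At 14 Gbps that 384-bit bus would deliver the same 672 GB/s as the RTX 2080 Ti, so any memory bandwidth gains would have to come from faster GDDR6 chips.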
It is suggested that the biggest improvement would be ray tracing performance, which admittedly was the Achilles' heel of the current RTX 2000 generation cards. As far as validity and credibility go, all this needs to be taken with a grain of salt and then some, with huge disclaimers in mind, but here it is:
Senior Member
Posts: 14948
Joined: 2018-03-21
3. Ignores the reality of how RTX works, you can't scale RTX up without scaling up raster too.
Senior Member
Posts: 3289
Joined: 2013-03-10
According to this 3060 would have the exact same amount of CUDA cores as 2060 (non-super) and even the same 6GB of memory. I hope these particular rumours aren't the truth.
Senior Member
Posts: 13996
Joined: 2004-05-16
I'm not sure that's entirely accurate. Surely you can scale it that way but you can't tell me there is zero way for them to improve the implementation without increasing the number of RT/SMs. RT seems to touch multiple parts of the chip - there has to be room for optimizations outside of just increasing the number of cores.
Don Vito Corleone
Posts: 45505
Joined: 2000-02-22
Actually I was still editing this story and included precisely that.
Senior Member
Posts: 13996
Joined: 2004-05-16
I don't get it. This guy comes out of nowhere and just starts posting this stuff on Twitter, and we believe him. Why?
You write:
I'm not saying he's wrong or right, I just don't get why every single tech news site is looking at this guy like he's god's gift to leaks. And while I think it's fine to post it as a rumor because others are, and I understand the need to want to capture that audience, I think at least part of the post should explain why this is just a rumor and that his credibility can't be verified, since as far as I can tell he has none.