140 billion transistors for Nvidia Hopper GH100 GPU?
Denial
In the datacenter it's all about performance density, so this doesn't surprise me. I'm really interested to see a deep dive into the architecture, though; we typically get those around March, so hopefully that's the same this year. It seems like Nvidia's datacenter parts are going to be very different from the gaming ones from here on out.
JamesSneed
Maybe, if it's multiple dies, even if they're directly connected on silicon like Cerebras does it. It's not like Nvidia makes its own chips, so we'd all know if TSMC or Samsung had figured out how to increase the reticle limit (they haven't).
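The multi-die argument can be sanity-checked with rough numbers. This is a back-of-envelope sketch, not anything official: GA100's published figures are about 54.2 billion transistors on roughly 826 mm², the standard lithography reticle field is about 26 mm × 33 mm, and the doubled density assumed below is purely hypothetical.

```python
# Back-of-envelope check. 140B is the rumor; GA100 numbers are Nvidia's
# published specs; the reticle limit is the standard ~26 mm x 33 mm field.
ga100_transistors = 54.2e9
ga100_area_mm2 = 826
ga100_density = ga100_transistors / ga100_area_mm2  # ~65.6 MTr per mm^2

reticle_limit_mm2 = 26 * 33  # ~858 mm^2

# Even if a newer node doubled GA100's density (an assumption, not a spec),
# a monolithic 140B-transistor die would still blow past the reticle:
assumed_density = 2 * ga100_density
needed_area_mm2 = 140e9 / assumed_density
print(f"{needed_area_mm2:.0f} mm^2 needed vs {reticle_limit_mm2} mm^2 reticle")
```

Under those assumptions the die would need well over 1000 mm², which is why a single monolithic chip doesn't add up without multiple dies.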
cucaulay malkin
Same as GV100 to GA100; that was 2.56x too, with the same two years between them. Another 60% performance increase for +100W of power? Seems likely.
I'm not complaining. I'll settle for a "4070" that's roughly a 3080/Ti equivalent with a 250W TDP, and I'll be happy if it's available at €650. I could really use that +50% performance for ray tracing.
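The 2.56x comparison checks out arithmetically, taking the commonly cited transistor counts (GV100 ≈ 21.1B, GA100 ≈ 54.2B) against the 140B rumor:

```python
# Generation-over-generation transistor-count ratios.
# GV100 and GA100 figures are Nvidia's published counts; 140B is the rumor.
gv100, ga100, gh100_rumor = 21.1e9, 54.2e9, 140e9

print(f"GV100 -> GA100:      {ga100 / gv100:.2f}x")       # 2.57x
print(f"GA100 -> 140B rumor: {gh100_rumor / ga100:.2f}x")  # 2.58x
```

So the rumored count would be almost exactly the same generational jump as Volta to Ampere.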
Silva
The 3090 is 628 mm².
Who the f will afford a 1000 mm² GPU? (Don't answer "miners", I mean real people.)
Just stop making larger and larger chips that only 1% of the world can buy!
The Goose
I feel for the poor guy that counted them
NCC1701D
A dollar per transistor?
alanm
What will make it extra expensive is that the larger the die, the poorer the yields: the bigger the area, the more likely a defect lands somewhere inside any given die.
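The yield point can be illustrated with the simplest textbook model, a Poisson defect model, where yield falls exponentially with die area. The defect density used here (0.1 defects/cm²) is a made-up illustrative number, not a real figure for any TSMC or Samsung process:

```python
import math

def poisson_yield(area_mm2, defect_density_per_cm2):
    """Fraction of fully working dies under a simple Poisson defect model:
    Y = exp(-D0 * A), with D0 in defects/cm^2 and A in cm^2."""
    area_cm2 = area_mm2 / 100
    return math.exp(-defect_density_per_cm2 * area_cm2)

d0 = 0.1  # hypothetical defect density, defects per cm^2
print(f"628 mm^2 die:  {poisson_yield(628, d0):.0%}")   # 53%
print(f"1000 mm^2 die: {poisson_yield(1000, d0):.0%}")  # 37%
```

Even with the same process quality, going from a 3090-sized die to a ~1000 mm² one cuts the share of fully working chips substantially, which is why huge dies usually ship with parts disabled for salvage.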