NVIDIA Tesla A100 with GA100 Ampere GPU based on 7nm, 54b transistors and 6912 CUDA Cores

6 stacks instead of just 4?? Wow, this will be insanely powerful. 1.5 TB/s of bandwidth, or more.
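For rough context on why extra stacks matter: HBM2 bandwidth scales linearly with stack count, since each stack brings its own 1024-bit bus. A back-of-the-envelope sketch (the 1024-bit-per-stack width is standard HBM2; the 2.4 Gbit/s per-pin rate is an assumed illustrative value, as actual parts vary):

```python
# Rough HBM2 bandwidth estimate: stacks x per-stack bus width x per-pin rate.
def hbm2_bandwidth_gbs(stacks, gbps_per_pin=2.4, bus_width_per_stack=1024):
    gbits_per_second = stacks * bus_width_per_stack * gbps_per_pin
    return gbits_per_second / 8  # convert Gbit/s to GB/s

print(hbm2_bandwidth_gbs(4))  # ~1228.8 GB/s with 4 stacks
print(hbm2_bandwidth_gbs(6))  # ~1843.2 GB/s with 6 stacks
```

So at plausible HBM2 speeds, 6 stacks lands comfortably in the 1.5+ TB/s range the comment guesses at.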
20x the AI training :D
Whatever card uses that chip will be $3000. I'm still using my 1080 Ti on a 1440p 165 Hz G-Sync monitor. I'll sell it and buy a good 1080 before I pay 3 grand for a video card. I was a regular buyer, at release, of every Ti released, FOREVER. I skipped the 2000 series, and I'll move to console before I pay even $1500 for a card. What was the best 1080 Ti, maybe the MSI Trio X, and it was $800-850 new? They doubled the price with the 2080 Ti with the worst generation-to-generation perf gain ever. Now I bet you they want even more. I already dumped Intel for this greedy, lazy behavior. Nvidia, you are next. I've not seen this price craziness ever, and I've owned PCs back to the TRS-80 and Apple II Plus.
Rx4speed:

Whatever card uses that chip will be $3000. [...] I've not see this price craziness ever, and I've owned PC's back to the TRS-80 and Apple II Plus.
The Amp Extreme and Gigabyte Xtreme were the best non-modded cards, as they had 360 W+ power limit capability.
So much power.. 😱
As far as the news has indicated, all new supercomputers are a combination of CPUs and GPUs (like Epyc + Tesla). I don't see how you could do without CPUs with these Ampere Teslas. Aren't they controlled and fed data by CPUs? Does Nvidia give you free ARM chips along with appropriate mobos to run them or something (seeing how Nvidia can't make x86-64)? In any case, it does sound like a real monster. AMD would need to work like Conan at the wheel of pain to produce something to match it.
I expected the first comment here to be "..but can it run crysis remastered" 😉 -andy-
Rx4speed:

Whatever card uses that chip will be $3000. [...] I've not see this price craziness ever, and I've owned PC's back to the TRS-80 and Apple II Plus.
Yes, I completely agree, Nvidia has gotten out of hand with their pricing. First the excuse was the "mining craze" during the 1000 series launch, and now it's RTX. I don't buy their PR waffle anymore. I am buying an AMD card next time. AMD's new generation will have ray tracing in any case and will probably be anywhere from $200-$400 cheaper than the green team. I wish AMD the best of luck; Nvidia needs a hard kick in their private parts, just like Intel got with AMD's CPUs.
Legacy-ZA:

I am buying an AMD card next time.
Shame AMD's drivers are a marketing program for RTX 2070 Supers 😡
GeniusPr0:

The amp extreme and gigabyte xtreme were the best non modded cards as they had 360W+ limit capability
Didn't the MSI Lightning series also have that?
Kaarme:

As far as the news have indicated, all new supercomputers are a combination of CPUs and GPUs (Like Epyc+Tesla). I don't see how you could do without CPUs with these Ampere Teslas. Aren't they controlled and fed data by CPUs? Does Nvidia give you free ARM things along with appropriate mobos to run them or something (seeing how Nvidia can't make x86-64)?
Well, I'm not 100% sure, but I think the DGX-2 had 2 Xeons, as well as the capability to communicate with an external CPU/system as if it were a single GPU (Jensen even said in this recent teaser video "the world's largest graphics card" and not "graphics card cluster", so I don't think my memory is failing here). That being said, those supercomputers might as well just get clustered individual Teslas and not the DGX. NVLink would still work and it would take less space, so I think it's more plausible.

On another note, 5 PETAFLOPS?! WITH ONLY 8 GPUs?!? That's freaking mental. I don't know which data type or precision they're referring to, but that's around 625 TFLOPS each... Looks like we're in for a major change in the landscape this generation; I can barely imagine what gaming / ("indie") rendering / scientific computing is gonna look like in 2022. Let the good times roll 😀
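The per-GPU arithmetic in that comment is easy to check, though note the precision and data type behind the 5 PFLOPS figure are not stated, so this is just the raw division:

```python
# 5 PFLOPS spread evenly over 8 GPUs, expressed in TFLOPS per GPU.
total_tflops = 5 * 1000   # 5 PFLOPS = 5000 TFLOPS
gpus = 8
per_gpu = total_tflops / gpus
print(per_gpu)  # 625.0 TFLOPS per GPU
```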
AndyMutz:

I expected the first comment here to be "..but can it run crysis remastered" 😉
Sorry to disappoint.. Let me fix that: But can it run Crysis Remastered?
@Rx4speed/Legacy-ZA And? Do you also post "oh, I won't buy a Taycan because it's at least double of any other e-car..."? I buy based on performance "need" and my budget, and until I make +100K/year that won't change, but I won't whine about something being outside what I'm willing to spend. And ignoring price for a second, what does AMD have that has at least 2070S (or better) performance at the same temps/power consumption? Right.

@GeniusPr0 And still temp limited unless it's under water. I've seen people getting better fps out of a water-cooled 2080S than an air-cooled Ti. Cards now start dropping boost after passing 54°C (at least that's what I can confirm on the two cards I had, 2080/S).

@wavetrex Yes, and finally at a constant 60 fps 😀
fry178:

@wavetrex yes, and finally at a constant 60 fps 😀
Pfff, 60 is so outdated. My TV does 120 Hz @ 2160p over HDMI 2.1, why would I want to play at 60? I hope the 3080 Ti or whatever can actually play Crysis Remastered at that refresh rate with everything cranked to the max, otherwise NO BUY.
54 billion transistors, 826 mm², and they said Nvidia couldn't go big-die first on a brand new process.
Skull cracking power. Can't wait to see consumer cards.
Astyanax:

shame amd's drivers are a marketing program for RTX 2070 supers, 😡
Ahm, no? In MOST cases it's user error: bad XMP, bad PSU, using PCIe 4.0, Anti-Lag enabled, etc. Yeah, it should work even with these, but ffs, people are so dumb nowadays, they can't troubleshoot literally anything. It SHOULD BE plug and play, but most just buy whatever random XMP kit with a 400 W PSU and then cry that X and Y are crashing. Idk, was the DX9 downclock fixed? AMD should stop with these useless power-saving features ffs.
D1stRU3T0R:

ahm no? In MOST cases it's user error. Bad XMP, bad PSU, using PCIe4, Anti Lag enabled and etc. [...]
Oh, please stop it. Team red fanboys are so toxic... You can't report an issue with an AMD product without getting a "urh durh, I don't have any issue with mine, you're too stupid to set your product up correctly". It took months for AMD to acknowledge the downclocking issues, and they are still listed in the known issues log of their drivers months later. It's sad really, because they do have a really interesting line-up versus Nvidia. Anyway, regarding the topic at hand, it looks very promising. Really hoping Nvidia won't go crazy with the pricing of consumer products...
Noisiv:

54 billion transistors, 826 mm^2 and they said Nvidia can't go big die first because of brand new process.
It matches the leaks too 😀

The full implementation of the GA100 GPU includes the following units:
- 8 GPCs, 8 TPCs/GPC, 2 SMs/TPC, 16 SMs/GPC, 128 SMs per full GPU
- 64 FP32 CUDA Cores/SM, 8192 FP32 CUDA Cores per full GPU
- 4 third-generation Tensor Cores/SM, 512 third-generation Tensor Cores per full GPU
- 6 HBM2 stacks, 12 512-bit memory controllers

The A100 Tensor Core GPU implementation of the GA100 GPU includes the following units:
- 7 GPCs, 7 or 8 TPCs/GPC, 2 SMs/TPC, up to 16 SMs/GPC, 108 SMs
- 64 FP32 CUDA Cores/SM, 6912 FP32 CUDA Cores per GPU
- 4 third-generation Tensor Cores/SM, 432 third-generation Tensor Cores per GPU
- 5 HBM2 stacks, 10 512-bit memory controllers

Leaks: TSMC 7 EUV, GA100, 8 GPC * 8 TPC * 2 SM, 6144-bit
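The unit counts above are internally consistent; a quick sanity check of the multiplication:

```python
# Cross-checking the GA100 unit counts quoted above.
full_sms = 8 * 8 * 2      # 8 GPCs x 8 TPCs/GPC x 2 SMs/TPC on the full die
print(full_sms)           # 128 SMs per full GPU
print(full_sms * 64)      # 8192 FP32 CUDA cores (64 per SM)
print(full_sms * 4)       # 512 third-gen tensor cores (4 per SM)

a100_sms = 108            # SMs enabled on the A100 product
print(a100_sms * 64)      # 6912 FP32 CUDA cores
print(a100_sms * 4)       # 432 third-gen tensor cores
```

The title's 6912 CUDA core figure falls straight out of the 108 enabled SMs, matching the leaked 8 GPC × 8 TPC × 2 SM full-die configuration with some units disabled.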