NVIDIA Next-Gen GPU Hopper could be offered in chiplet design
Next year's Nvidia Ampere GPUs have not even been announced yet, and already information about the supposedly following chip generation, Hopper, is spreading around the web. Hopper could be based on a multi-die design; you know, chiplets.
Hopper (after Grace Hopper) is the name of yet another mathematician being applied to an Nvidia architecture. She was involved in groundbreaking projects, among them the Harvard Mark I, a super-computer for its time, and she helped revolutionize computer science: she wrote one of the first compilers, and her English-like programming language FLOW-MATIC laid the groundwork for COBOL. Now her name is supposed to adorn Nvidia's next-generation GPU: the Hopper architecture is to follow Ampere (presumably RTX 3000) and be manufactured in a chiplet (MCM) design.
MCM designs have historically scaled poorly for GPUs; think of SLI, for example, and the problems that come with it. However, MCM also offers significant advantages in production. The larger a chip is, the higher the risk that a production defect renders it completely unusable, which hurts yields. An MCM package can combine far more silicon than the limitations of a single chip allow, so such designs are very scalable, and they might become a trend for graphics chips. For the time being, however, these are all unconfirmed rumors.
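The yield argument can be made concrete with a back-of-the-envelope calculation. The sketch below uses a simple Poisson yield model (yield = exp(-defect density × die area)); the defect density and die sizes are illustrative assumptions, not real foundry or Nvidia figures:

```python
import math

# Illustrative Poisson yield model: yield = exp(-D0 * area).
# D0 (defects per mm^2) is an assumed figure, not real foundry data.
D0 = 0.002  # defects per mm^2

def die_yield(area_mm2, d0=D0):
    """Probability that a die of the given area has zero defects."""
    return math.exp(-d0 * area_mm2)

# Silicon spent per *good* GPU: with a monolithic die, one defect
# scraps the whole 800 mm^2; with 4x 200 mm^2 chiplets, only the
# one bad chiplet is thrown away.
mono_cost = 800 / die_yield(800)
chiplet_cost = 4 * (200 / die_yield(200))

print(f"monolithic 800 mm^2 yield: {die_yield(800):.1%}")
print(f"chiplet 200 mm^2 yield:    {die_yield(200):.1%}")
print(f"silicon per good GPU, monolithic: {mono_cost:.0f} mm^2")
print(f"silicon per good GPU, chiplets:   {chiplet_cost:.0f} mm^2")
```

Under these assumed numbers, the monolithic die yields roughly 20% while each small chiplet yields roughly 67%, so the chiplet approach burns only about a third of the wafer area per good GPU, ignoring packaging and interconnect overhead.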
Photo NVIDIA: 36-chip MCM prototype
Senior Member
Posts: 13753
Joined: 2004-05-16
I could've sworn there was another user here saying that the chiplet design for GPUs wasn't possible. The only thing that surprises me is that AMD didn't beat Nvidia to the punch.
Anyway, I'm assuming Hopper's per-core performance isn't going to be much better than Turing's, but, I imagine we should meet/exceed 2080Ti performance for hundreds of dollars cheaper. Remember - the main benefit of chiplets is to add more transistors at a lower cost.
I don't think Hopper is going to have consumer-facing chiplet GPUs; it will probably only be used in HPC. Per Nvidia's whitepapers, compute/HPC workloads appear to be significantly easier to scale properly across chiplets than gaming workloads. Nvidia actually announced a working inference chiplet design a few months ago, and Intel basically announced theirs two days ago. AMD is probably going in a similar direction with their Vega HPC split.
Senior Member
Posts: 613
Joined: 2007-09-24
I could've sworn there was another user here saying that the chiplet design for GPUs wasn't possible. The only thing that surprises me is that AMD didn't beat Nvidia to the punch.
Anyway, I'm assuming Hopper's per-core performance isn't going to be much better than Turing's, but, I imagine we should meet/exceed 2080Ti performance for hundreds of dollars cheaper. Remember - the main benefit of chiplets is to add more transistors at a lower cost.
Why did it surprise you?
AMD has not invented anything in the GPU industry for a decade already... in fact, this is their main problem. Nvidia has always pushed new technologies; good or bad, they did it even while holding a near-complete monopoly of the market, instead of sitting on their asses like Intel with countless refreshes.
Moderator
Posts: 15139
Joined: 2006-07-04
Also, on a funny note Nvidia’s CEO, Jen-Hsun Huang - the leather jacket king - claims that Nvidia has just “created a brand-new game platform: notebook PC gaming.”
So, Nvidia just invented notebook PC gaming; these are historic times!
I mean, considering how "reliable" mobile GPUs were before Max-Q, I wouldn't say he's wrong. Maybe not invented, but Nvidia definitely paved the way for more mobile products and better pricing. AMD is still struggling in the mobile market, so not much on that front.
Senior Member
Posts: 6564
Joined: 2012-11-10
For now, yeah. Compute workloads don't require cross-communication between cores as much as rendering a 3D scene does, so that makes sense.
Because AMD is the one that popularized chiplets in complex processors. So to me, it makes sense how they'd take the same approach to their GPUs too.
Senior Member
Posts: 6564
Joined: 2012-11-10
I could've sworn there was another user here saying that the chiplet design for GPUs wasn't possible. The only thing that surprises me is that AMD didn't beat Nvidia to the punch.
Anyway, I'm assuming Hopper's per-core performance isn't going to be much better than Turing's, but, I imagine we should meet/exceed 2080Ti performance for hundreds of dollars cheaper. Remember - the main benefit of chiplets is to add more transistors at a lower cost.