Well, it looks like it's made of 8 chiplets to me, if you look at the back side of the package. The article calls it "monolithic" in the title, which I thought meant the GPU would be one big piece of silicon rather than chiplets, but maybe I'm misunderstanding "monolithic" (the article then goes on to suggest chiplets too)? In my experience, "monolithic" has usually described a chip made from a single piece of silicon.
That or it's Temujin / Genghis Khan. 😀
Well, it's nice to see Intel possibly showing AMD and NVIDIA who's the big daddy in the GPU department. I guess we'll see how it goes and what their move into graphics cards does for the current market situation. 🙂
EDIT: Or they're paying NVIDIA back for those little comics from back then that painted them in a not-so-positive light...
https://komplettie.files.wordpress.com/2009/12/intel-nvidia.jpg
(There's a fair few of these; no idea what it was originally about, but it's the internet, so the comics are preserved forever.)
Still, it'll be interesting to see where this leads: the server market, desktop, further developments on the mobile end, or perhaps a bit of everything. It's going to take a while, though, as usual with new processes, from planning through engineering to whenever these are actually on the market, but it changes things up a bit, and that can't be too bad.
So many pins, though... I bet motherboard manufacturers aren't looking forward to all the RMAs.
*I remember Intel getting us to glue plastic guides onto Ivy Bridge sockets after bent pins and contaminated pads sent RMA requests through the roof. (Found a CPU with a mount like the one we attached.) https://images.app.goo.gl/w8Njmtzo3LLwF3hv7
AMD did a 3-slot GPU which *could have worked* as 1 slot, just to have better cooling.
Intel is doing a 3-slot GPU that really uses every inch of that space to get some performance.
Oh, look: Larrabee 2.0! ahahah
Intel, you crazy! This is doomed to fail as a mass-market product.
If they're targeting servers first, it means it will be years until a consumer-grade product.
At the time I thought Larrabee was an amazing idea, but it was just too early for that tech to even work.
These days they might have cracked the multi-GPU system.
Oh neat, this reminds me of what 3DFX attempted with the Voodoo5, I think it was, but now on a single package, and I presume multi-GPU tech has advanced even if AMD and NVIDIA have scaled back their multi-GPU support.
For gaming, I'd imagine it must be kind of problematic to get working right unless the devs build the game engine from the start to scale properly and minimize issues and drawbacks, even though Vulkan and D3D12 support it better than before.
For workstation programs or server environments, though, this could be really neat: instead of a whole array of GPUs, you can do it like this, or possibly scale it up even further in the future depending on how many chips can be placed on each package. The shrinking fabrication process already has drawbacks like yields, but that could improve, or other methods or materials might come along that improve things. 🙂
EDIT: So instead of 3-4 cards with one GPU each, this would make it 12-16 GPU units total, and depending on scaling, if they could work together, that would also greatly improve speeds, but I suppose we'll see how this all works.
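That back-of-the-envelope math can be sketched out; note the 70% scaling-efficiency figure below is purely a hypothetical assumption for illustration, since real multi-GPU scaling varies wildly by workload:

```python
# Hypothetical back-of-the-envelope numbers, not real product specs:
# a 3-4 card setup with one GPU die each, vs. 4 dies per card.

def total_dies(cards, dies_per_card):
    """Total GPU dies across all cards."""
    return cards * dies_per_card

def effective_speedup(dies, efficiency=0.7):
    """Idealized speedup over a single die, assuming each extra die
    contributes a fixed fraction of its throughput (1.0 = perfect
    scaling, which real multi-GPU setups never achieve)."""
    return 1 + (dies - 1) * efficiency

for cards in (3, 4):
    print(cards, "cards x 4 dies =", total_dies(cards, 4), "dies total")

# With the assumed 70% efficiency, 4 dies behave like ~3.1 dies:
print(round(effective_speedup(4), 2))  # prints 3.1
```

This is just the same multiplication as in the comment above (3-4 cards × 4 dies = 12-16 dies), plus a reminder that the speed improvement depends entirely on how well the dies scale together.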
Nice potential though from what I'm thinking of how it could work or would work even without multi-GPU support on that level.
AMD, I think, had some patents or plans, but for later on, or just patents for now; not sure about NVIDIA, and now Intel is actually doing it. We'll see how it all works when it's showcased. It's going to be interesting to see what this solution can do, but it must be pretty complicated stuff too, having multiple GPU chips like this.
Oh well, the same thing happened with CPUs and scaling in the 2000s, and GPUs already have a ton of cores, so why not take it to multiple chips? It'll work out and improve, whether for workstations, servers, or possibly even desktop environments in the future. 🙂
(A bit optimistic perhaps, and besides, since it's like 4x the GPUs, what would that do to pricing, heh.)
EDIT: Well, there's also the part where 4x the cores on one package means quite a bit more heat. You can't exactly fit big-tower cooling on a slot-in GPU card, so AIO water cooling or something else would be needed to dissipate it, ha ha.
Still interested in seeing how this will be used, though. It's work- and program-oriented, of course, but it's a nice little change-up, and if it could bring back multi-GPU support, that'd be neat. As far as gaming goes, though, I'd expect that's pretty far from being a thing for regular desktop systems.
(HBM and what can be done with memory stacking is already a hurdle; multi-GPU dies add even more complications and would at the very least result in a very costly card, and initial scaling would be a problem too.)
Well, it should shake up the server market a bit, and other workstation-type environments; can't be bad. Not sure about AMD and their position, but it might shake things up a bit for NVIDIA, although CUDA is still a strong incentive to use their hardware, for one thing.
What would I know? It's just fun to see something new, well, sort of: a neat little multi-GPU implementation built on new tech and years of research and general improvements, both hardware- and software-wise.
Nvidia already has inference chips utilizing chiplets in testing. They have also published a number of papers arguing that chiplets are the only way to keep scaling going forward. In a recent interview, Bill Dally (Nvidia) said they consider chiplets on GPUs "de-risked" and ready for actual implementation. There have already been rumors that Hopper (the next gen after Ampere) is going to utilize chiplets.
That being said, it's completely different from multi-GPU, and it doesn't seem like it's going to be good for gaming. It's highly likely we'll only see it in HPC GPUs for some time, across all three GPU designers.
Do people really get excited about the back of a chip? AMD's HPC CPU/GPU packages (1 CPU + 8 GPUs) are large too. Not sure I care much about any of these HPC chips from either camp.
Even if Intel makes a physical product, they have never proven they can deliver game-day drivers for our market. So it really doesn't matter if they make a product, as it will be ignored given its lack of driver support.
They said from the start that the architecture should be viable from the smallest of iGPUs through consumer-grade graphics to exascale. Considering this one is socket-based, it's unlikely to be used for desktops.
Some other solution will be.