Intel Larrabee GPU designer rejoins Intel GPU Team

https://forums.guru3d.com/data/avatars/m/190/190660.jpg
I really hope they release something interesting. The GPU market has been boring as hell for almost 10 years now.
https://forums.guru3d.com/data/avatars/m/216/216490.jpg
Would love to see them using the Iris name: "Hey, just got an Iris xxx" or something.. :P
https://forums.guru3d.com/data/avatars/m/270/270008.jpg
Intel is gearing up to get serious about the GPU market for the first damn time. I don't know why it took this long, but I couldn't be more thrilled to have a third competitor. I read Tom's blog; he only called Larrabee a success relative to what Intel asked his team to build. They built it, and it nailed what Intel wanted. That said, what Intel wanted was way off from what the market wanted, which has been the story of Intel GPUs.
https://forums.guru3d.com/data/avatars/m/115/115462.jpg
I'm actually looking forward to this, since they seem to be serious about it. The GPU market sure needs the competition.
https://forums.guru3d.com/data/avatars/m/267/267641.jpg
Same mistakes again and again..
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Kind of weird how he accepted a job and doesn't know what he'll be doing.
ruthan:

Same mistakes again and again..
Larrabee actually would've been a success if Intel hadn't kept shafting it. If they had funded it properly, Nvidia would be nowhere near as successful as they are in the server market.
https://forums.guru3d.com/data/avatars/m/267/267787.jpg
Well, for once it looks like Intel is actually pushing hard to enter the GPU market. Maybe it's because they can see AMD catching up quicker than they thought on the CPU side. In the end it will be great for all of us to have a third player around. Now Nvidia just needs to make a CPU for the PC sector.
data/avatar/default/avatar27.webp
From what I remember it performed badly, around 30% of the competition. It was a completely different architecture.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
RooiKreef:

Well, for once it looks like Intel is actually pushing hard to enter the GPU market. Maybe it's because they can see AMD catching up quicker than they thought on the CPU side. In the end it will be great for all of us to have a third player around.
I was thinking something similar. Intel knows they won't compete against AMD or Nvidia in the gaming market, but the server market is a whole different beast. Intel knows that AMD isn't doing so great in the server GPU market, and unlike Nvidia, they're willing to create a new architecture from the ground up specifically for server workloads (Nvidia's hardware still revolves heavily around the needs of gamers and workstations).
Now Nvidia just needs to make a CPU for the PC sector.
That will never happen. To my recollection, they were explicitly denied a license to manufacture x86. That's why they went with ARM instead and made the Tegra series.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
asturur:

From what I remember it performed badly, around 30% of the competition. It was a completely different architecture.
Yeah, they originally intended it to compete in the consumer space, but the performance wasn't up to the competition, so they kind of reshuffled it as a compute card with the added bonus of being easy to program for compared to CUDA at the time. Most of the Larrabee work ended up going into Knights Landing, AVX-512, and so on, so it wasn't all lost.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
JamesSneed:

Intel is gearing up to get serious about the GPU market for the first damn time. I don't know why it took this long, but I couldn't be more thrilled to have a third competitor. I read Tom's blog; he only called Larrabee a success relative to what Intel asked his team to build. They built it, and it nailed what Intel wanted. That said, what Intel wanted was way off from what the market wanted, which has been the story of Intel GPUs.
They were serious before, burned quite a bit of cash on it, and everyone knows what came of it.
https://forums.guru3d.com/data/avatars/m/202/202673.jpg
I'm sure they'll just throw together a bunch of AMD iGPU dies this time around...and then proclaim it'll become thrice as powerful by 2023...
https://forums.guru3d.com/data/avatars/m/270/270008.jpg
schmidtbag:

Kind of weird how he accepted a job and doesn't know what he'll be doing. Larrabee actually would've been a success if Intel hadn't kept shafting it. If they had funded it properly, Nvidia would be nowhere near as successful as they are in the server market.
Totally agree. Google the Tianhe-2 (TH-2) supercomputer. It was the fastest supercomputer at the time and was based on a ton of Xeon Phis, aka Larrabee. One of the coolest ideas with Larrabee was that it was x86-based, with software sitting on top of those tiny cores, versus having dedicated fixed-function hardware like AMD and Nvidia. You could do a driver update and go from, say, DirectX 9 compatible to 100% DirectX 12 compatible. In fact, Intel did just this with its prototypes and made them DirectX 11 compatible even though the hardware predated the DirectX 11 specification. The downside of the software sitting on top of x86 cores was that Larrabee just wasn't that efficient compared to AMD and Nvidia GPUs, but as with any software, I suspect that could have been vastly improved.
https://forums.guru3d.com/data/avatars/m/270/270233.jpg
The hiring of Forsyth reinforces my suspicion that Intel is making a compute product, as opposed to a gaming product. This might be a Larrabee 2 of sorts, to compete with Nvidia and AMD in the AI, machine learning, and cloud computing markets.
data/avatar/default/avatar03.webp
I think Intel wants to be a part of the AI future. It's going to mean everything moving forward, and GPUs are what's used for that. I honestly think gaming will be very low on Intel's list of stuff to consider or care about. They might have a GPU in 2020, but it won't mean much for gamers, and it will likely be way underpowered and weak anyway. I expect nothing but garbage from Intel, although I have to admit it's odd to see them assembling such a significant team... hmm... maybe we'll get lucky... or NAH
data/avatar/default/avatar07.webp
Would be cool if they could come up with something at least mid-range, if not at the high end of the gaming spectrum. Nvidia has things tied down and I do not see that changing anytime soon unless Intel teams up with AMD and comes out with something kick-ass. If that works, then they can improve their crappy iGPU side of things too. All in all, they are making the right steps, in my opinion. Nvidia is the giant in the room, hahaha.
https://forums.guru3d.com/data/avatars/m/242/242471.jpg
Well, he was/is an "expert" at compute, and I guess Raja needed some help there; I don't know why that would automatically make it a compute-only card. I still think Intel will deliver and stir up the GPU market a lot. 2020 is not far away, a good year and a half to go.
https://forums.guru3d.com/data/avatars/m/132/132389.jpg
UnrealGaming:

I really hope they release something interesting. The GPU market has been boring as hell for almost 10 years now.
And daylight robbery. Don't forget daylight robbery.
data/avatar/default/avatar38.webp
angelgraves13:

If Intel could create some new hardware extensions to speed up graphics processing, so that we didn’t need to hit 5 GHz for increased frame rates it would be better than them building a GPU.
Well, at some point you need the frequency too. 150 fps is about 6.7 ms per frame. Give yourself some clock.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
angelgraves13:

If Intel could create some new hardware extensions to speed up graphics processing, so that we didn’t need to hit 5 GHz for increased frame rates it would be better than them building a GPU.
Hardware extensions for what? GPUs don't run at 5 GHz, and CPUs don't have that much to do with graphics performance; it's pretty hard to find a game that is bottlenecked on a modern, affordable CPU. Regardless, DX12 and Vulkan are the "hardware extensions" you're looking for, and I would argue new GPU technologies have been appearing at a decent rate. As for CPUs, no amount of new extensions or instruction sets will improve performance if software isn't compiled to use them (which includes all software released before them). This is the crux of CPU-bound software: you must either leave something behind (whether that's outdated software or users with outdated hardware) or improve performance via brute force.