Intel Introduces the Max Series Product Family

This is why the rumor about Intel abandoning GPUs won't come true. There is way too much money in the data center for GPU tech.
JamesSneed:

This is why the rumor about Intel abandoning GPUs won't come true. There is way too much money in the data center for GPU tech.
Well, there's still a possibility they'll abandon desktop GPUs, but yeah, there's way too much money in the server market to abandon that. EDIT: Their desktop GPUs are not doing Intel's reputation any favors, the R&D costs are a lot higher (compared to the server market), and Intel doesn't have as high profit margins on desktop parts regardless of R&D costs. So, they would have a compelling reason to drop out of it.
schmidtbag:

Their desktop GPUs are not doing Intel's reputation any favors, the R&D costs are a lot higher (compared to the server market), and Intel doesn't have as high profit margins on desktop parts regardless of R&D costs. So, they would have a compelling reason to drop out of it.
The gaming and server GPUs aren't totally separate from an R&D point of view. The same basic research will advance both. Of course the gaming side might be more troublesome in some ways, but on the other hand, it's not like you'd want to sell a hundred-million-euro supercomputer and then have the customer call back because it doesn't work. It's also good to remember that when you have both sides, it means more volume. Not to mention there are also the professional cards between the gaming and server GPUs, and that's a significant market as well.
Kaarme:

The gaming and server GPUs aren't totally separate from an R&D point of view. The same basic research will advance both. Of course the gaming side might be more troublesome in some ways, but on the other hand, it's not like you'd want to sell a hundred-million-euro supercomputer and then have the customer call back because it doesn't work. It's also good to remember that when you have both sides, it means more volume. Not to mention there are also the professional cards between the gaming and server GPUs, and that's a significant market as well.
You're right, though I guess I should have elaborated that it's the drivers where the two segments diverge and where the desktop market gets a bit expensive on the R&D side. When it comes to the hardware, I think the two segments can share the work a bit more.
I watched the stream while also watching some polling stream. Pretty cool that Intel got so many P-cores in at 350 W. When those high-efficiency P-cores hit the market, these Xeons are going to be the right choice.
schmidtbag:

Well, there's still a possibility they'll abandon desktop GPUs, but yeah, there's way too much money in the server market to abandon that. EDIT: Their desktop GPUs are not doing Intel's reputation any favors, the R&D costs are a lot higher (compared to the server market), and Intel doesn't have as high profit margins on desktop parts regardless of R&D costs. So, they would have a compelling reason to drop out of it.
All that R&D has to be done for the data center GPUs as well. If you take the data center side as a given, most of the extra cost on the consumer side is drivers, not chip design. So as long as they don't lose money on graphics cards, they will keep going. I am pretty sure they will be doing very well once Battlemage lands. Also, I think both AMD and Intel see where things are headed for SoCs and APUs by 2026, and you must have a good GPU to compete.
JamesSneed:

All that R&D has to be done for the data center GPUs as well. If you take the data center side as a given, most of the extra cost on the consumer side is drivers, not chip design.
Agreed, but the amount of effort needed for datacenter drivers is minuscule compared to games. I think people greatly underestimate how complex the software is for games, which is only made more complicated by devs who don't follow standards or only optimize for a specific platform (typically a console). It just seems relatively simple to many people because modern engines take 99% of the hard work out of developing a game, which also includes much of the driver/hardware-level code. In my experience, robots are a lot simpler to code, and robots are typically more complex than what the average datacenter runs. Keep in mind too that a lot of server software is optimized for the hardware, so the vendor doesn't usually have to do any of that kind of work in their drivers. Granted, AI has definitely increased server GPU complexity, and I'm not about to pretend I know what goes on with any of that.
So as long as they don't lose money on graphics cards, they will keep going. I am pretty sure they will be doing very well once Battlemage lands. Also, I think both AMD and Intel see where things are headed for SoCs and APUs by 2026, and you must have a good GPU to compete.
I don't think simply profiting is enough to incentivize Intel to continue. Seems to me the profit margin needs to be rather significant for Intel to bother. I agree that Battlemage ought to be much better, primarily because of driver stability.
Oh boy, HBM2 memory. I want to see HBM2 memory from the competition too. AMD once did it with Radeon Vega, and it was so good, and then they abandoned it.
schmidtbag:

Agreed, but the amount of effort needed for datacenter drivers is minuscule compared to games. I think people greatly underestimate how complex the software is for games, which is only made more complicated by devs who don't follow standards or only optimize for a specific platform (typically a console). It just seems relatively simple to many people because modern engines take 99% of the hard work out of developing a game, which also includes much of the driver/hardware-level code. In my experience, robots are a lot simpler to code, and robots are typically more complex than what the average datacenter runs. Keep in mind too that a lot of server software is optimized for the hardware, so the vendor doesn't usually have to do any of that kind of work in their drivers. Granted, AI has definitely increased server GPU complexity, and I'm not about to pretend I know what goes on with any of that. I don't think simply profiting is enough to incentivize Intel to continue. Seems to me the profit margin needs to be rather significant for Intel to bother. I agree that Battlemage ought to be much better, primarily because of driver stability.
I agree long term about profits, but short term I think Intel is just happy breaking even. They knew this first gen would have a ton of teething issues, so I am sure they didn't plan to make much of a profit. If Battlemage flops, that is when the warning lights need to start flashing, as they could very well pull out of the consumer space. That would be a terrible idea, because AMD will be going nuts with APUs on Zen 5, and neither Intel nor Nvidia could compete with those higher-end APUs. APUs are going to take over low-end and some midrange gaming next generation. I say that because Navi 33 is going to be a 203 mm² monolithic die, which is already small enough on 6 nm to be included in an 8-core APU design; put that on a better node like 5 nm and you land at around 250 mm² for both the CPU (the Zen 4 CCD is roughly 71 mm²) and the GPU. That is a 6800 XT-class or better APU. They will need to run everything at lower frequencies, but we are talking very powerful APUs. This will sell like crazy for budget gaming. Intel very much needs to be a part of this shift.
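As a rough sanity check of the die-area arithmetic in that last comment, here is a minimal Python sketch. The ~203 mm² Navi 33 and ~71 mm² Zen 4 CCD figures come straight from the comment above; the N6-to-N5 area-scaling factor is an assumed, illustrative value, not a published number.

```python
# Back-of-the-envelope die-area estimate for a hypothetical "big APU"
# combining a Navi 33-class GPU with a Zen 4-class CPU chiplet.
# The 203 mm^2 (Navi 33 on N6) and 71 mm^2 (Zen 4 CCD on N5) figures are
# taken from the forum comment above; the shrink factor below is an
# assumed, illustrative value for moving the GPU from N6 to N5.

NAVI33_AREA_N6_MM2 = 203.0        # monolithic Navi 33 die on N6 (per the comment)
ZEN4_CCD_AREA_MM2 = 71.0          # Zen 4 CPU chiplet on N5 (per the comment)
ASSUMED_N6_TO_N5_AREA_SCALING = 0.88  # assumed blended logic/SRAM area shrink


def estimated_apu_area_mm2(gpu_area_mm2: float,
                           cpu_area_mm2: float,
                           gpu_area_scaling: float) -> float:
    """Estimate combined die area if the GPU block is ported to the denser node."""
    return gpu_area_mm2 * gpu_area_scaling + cpu_area_mm2


if __name__ == "__main__":
    total = estimated_apu_area_mm2(NAVI33_AREA_N6_MM2,
                                   ZEN4_CCD_AREA_MM2,
                                   ASSUMED_N6_TO_N5_AREA_SCALING)
    print(f"Estimated combined area: {total:.0f} mm^2")  # prints ~250 mm^2
```

With that assumed shrink the total lands right around the ~250 mm² mentioned in the comment; real scaling varies heavily by block (SRAM and analog shrink far less than logic), so treat this strictly as napkin math.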