Next-gen AMD EPYC (Genoa) would get 50% larger SP5 socket, 96 cores and 400 W TDP

nosirrahx:

They are not going to reduce the number of channels per core, even from just a marketing standpoint. Software becomes more and more hungry for memory over time (both capacity and throughput); this is not a static metric. You are also operating under the assumption that DDR4 100% meets the needs of the existing processors and that adding frequency would have zero impact, which clearly isn't true.
RED: I have not written anything that would even remotely hint at such a thing. BOLD: You are the one making assumptions, making an ass of you and me in the process. But take AM4, TR4, and SP3, then look at the maximum core count available on each, the memory channel count, and the clock under full load, and you can notice a certain trend.
AM4: 16 cores, base clock ~3.4 GHz, 2 channels
TR4: 64 cores, base clock ~2.9 GHz, 4 channels
SP3: 64 cores, base clock ~2.6 GHz, 8 channels
SP3 has the highest available bandwidth per core, or if you like, the highest bandwidth per core FLOP.
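The trend in that list can be worked out with a few lines of arithmetic. This is a rough sketch that assumes DDR4-3200 (25.6 GB/s per channel) on all three platforms purely for illustration; real supported speeds differ per platform. Note that by raw channels-per-core, AM4 and SP3 tie, which is why normalizing by clock ("per core FLOP") matters for the claim:

```python
# Rough bandwidth-per-core comparison for the three sockets listed above.
# Assumes DDR4-3200 (25.6 GB/s per channel) everywhere; illustrative only.
PER_CHANNEL_GBPS = 25.6  # 3200 MT/s * 8 bytes per transfer

sockets = [
    # (name, cores, base_ghz, channels)
    ("AM4", 16, 3.4, 2),
    ("TR4", 64, 2.9, 4),
    ("SP3", 64, 2.6, 8),
]

for name, cores, ghz, channels in sockets:
    per_core = channels * PER_CHANNEL_GBPS / cores
    per_core_ghz = per_core / ghz  # crude "bandwidth per core FLOP" proxy
    print(f"{name}: {per_core:.2f} GB/s per core, "
          f"{per_core_ghz:.2f} GB/s per core-GHz")
```

AM4 and SP3 both land at 3.2 GB/s per core, but dividing by base clock puts SP3 on top (~1.23 vs. ~0.94 GB/s per core-GHz), matching the "per FLOP" framing.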
Fox2232:

A single DDR5 channel has much higher bandwidth by itself than a single DDR4 channel. When you increase the number of memory channels by 50%, and the bandwidth each channel provides by 40%, how much more bandwidth do you get? And you are getting this increase while adding only 50% more cores.
I think it would be more work for AMD to change the 12-channel memory into 8-channel, and some of the cores could be starved because they would share memory access or the traffic would be somehow redirected. Motherboard manufacturers are free to change the board to 6 channels instead if the DIMMs do not physically fit the board, and if good DDR5 modules are used the bandwidth could be good enough compared to DDR4. I cannot find the DDR5 specs, but I do believe DDR4-4000 modules would be around the same speed as DDR5-5000 modules, so expect at least half a year before DDR5 modules really make a difference, just like with the older generations of DDR memory. Also remember that this socket looks to have physical room for something stupid like almost 200 cores in 2-3 years' time; maybe 12-channel memory will be needed when that comes.
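The arithmetic posed in the quoted question works out as follows, using the post's own round numbers (these are the thread's assumptions, not measured DDR5 figures):

```python
# +50% channels, +40% per-channel bandwidth, +50% cores,
# per the round numbers in the quote above.
channel_factor = 1.5      # e.g. 8 -> 12 channels
per_channel_factor = 1.4  # assumed DDR4 -> DDR5 per-channel gain
core_factor = 1.5         # e.g. 64 -> 96 cores

total_bw = channel_factor * per_channel_factor  # 2.1x socket bandwidth
per_core_bw = total_bw / core_factor            # 1.4x bandwidth per core
print(f"{total_bw:.2f}x total, {per_core_bw:.2f}x per core")
```

So the socket gains roughly 2.1x total bandwidth, and even after the extra cores eat into it, each core still ends up with about 1.4x more.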
Fox2232:

SP3: 64 cores, base clock ~2.6GHz, 8 channels
64/8 = 8. 96/12 = 8. You want this to be infinitely more complicated than it is, which is fine. Not really sure how this has you excited to the point where you need to use bold and colors, but whatever.
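The invariant being defended here is simply that cores per memory channel stays the same between the 64-core, 8-channel part and the rumored 96-core, 12-channel part:

```python
# Cores per memory channel, current SP3 top part vs. rumored SP5 top part.
sp3_ratio = 64 / 8    # 8 cores per channel
sp5_ratio = 96 / 12   # also 8 cores per channel
print(sp3_ratio, sp5_ratio)
assert sp3_ratio == sp5_ratio == 8.0
```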
TLD LARS:

I think it would be more work for AMD to change the 12-channel memory into 8-channel, and some of the cores could be starved because they would share memory access or the traffic would be somehow redirected. Motherboard manufacturers are free to change the board to 6 channels instead if the DIMMs do not physically fit the board, and if good DDR5 modules are used the bandwidth could be good enough compared to DDR4. I cannot find the DDR5 specs, but I do believe DDR4-4000 modules would be around the same speed as DDR5-5000 modules, so expect at least half a year before DDR5 modules really make a difference, just like with the older generations of DDR memory. Also remember that this socket looks to have physical room for something stupid like almost 200 cores in 2-3 years' time; maybe 12-channel memory will be needed when that comes.
It does not make more engineering work for AMD than any other I/O die design. The I/O die is there for a simple reason: to handle communication with anything that sits on the motherboard and with each separate CCD. The same I/O die serves whether there are 16, 24, 32, ... cores and 8 memory channels. I think that thanks to Infinity Fabric, AMD does not really care how many Xs get connected to how many Ys; at this point the I/O die is nothing more than Lego. But that's beside the point. I did point out that the socket is going to have a lot of extra memory bandwidth, and that people should expect proportionally higher computational power (achieved one way or another). While those arguing against did not get it, you expect the same thing by adding the possibility of an extra core count. #2 "(Conspiracy theory about a drastic increase in Zen 4 IPC or the EPYC all-core boost clock.)"
Just passed by here to say again that Intel is dead! They lost the mainstream and gamer market, and now they will get smashed in the server market... Obsolete, power-hungry, space-heater, expensive CPUs... just for lamers!
Borys:

Just passed by here to say again that Intel is dead! They lost the mainstream and gamer market, and now they will get smashed in the server market... Obsolete, power-hungry, space-heater, expensive CPUs... just for lamers!
No, they are not dead. Their trajectory may lead them there, but they have so much money and so many other businesses that they could sit on their asses and not make a single CPU for 10 years. Sometime in the future, they'll release a chip that truly fixes those tons of security issues and delivers a big IPC improvement. In the meantime, they still have an advantage in high-fps gaming (the 160 fps+ range, so people with 144 Hz screens can easily pick whichever brand they prefer). And that actually served as efficient marketing, which was always absurd, because that advantage required a low resolution (720p) or 1080p with a top-end card. Most of the people who bought Intel for high achievable fps ended up with a cheap graphics card, and they would have been better off with a cheaper CPU and a better GPU.
Borys:

Just passed by here to say again that Intel is dead! They lost the mainstream and gamer market, and now they will get smashed in the server market... Obsolete, power-hungry, space-heater, expensive CPUs... just for lamers!
Intel has the resources to combine a few right decisions and regain market leadership. They are going to have to swallow their pride and copy AMD, but it is what it is: they are going to have to go chiplet. I'm buying AMD for now, though, and for the first time ever I am building an AMD workstation later this year. X79 to X99 to X299, so this is kind of a big deal for me. ARM is also getting super interesting. I am starting to wonder whether AMD and Intel will need to come up with a hybrid design that includes ARM cores to improve their flexibility going forward. x86 has legacy software keeping it locked in place, but if ARM emulating x86 really can scale up to something like a Threadripper, then x86 will fall out of favor.
Fox2232:

You wrote a lot, and you are not wrong, but you ignored the elephant in the room I was pointing at the entire time. Do you have an estimate of the initial change in available bandwidth per core from an SP3-socket CPU to an SP5-socket CPU, given that SP3's top chip has 64 cores and SP5's has 96? No need to write out bandwidths; just a multiplier, percentage, or ratio will be enough. And do you have an estimate of this change in 2-3 years, when DDR5 has matured?
I don't even want to comment on the SP5 socket change, as that will also depend on memory channels. Too much to speculate on until we see at least some samples. It has taken around two years after each DDR generation launch to get products that are better in actual latency (CAS latency / RAM clock speed). I suspect that will mostly hold true.
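For what it's worth, one hedged back-of-the-envelope answer to the quoted question: assuming the current 64-core, 8-channel EPYC runs DDR4-3200 and the 96-core SP5 part launches with 12 channels of DDR5-4800 (both are my assumptions, not confirmed Genoa specs), the per-core peak bandwidth works out like this:

```python
# Hedged per-core bandwidth estimate, 64c/8ch DDR4-3200 vs. a rumored
# 96c/12ch DDR5-4800 part. Per-channel peak GB/s = MT/s * 8 bytes / 1000.
def per_core_gbps(channels, mts, cores):
    return channels * mts * 8 / 1000 / cores

current = per_core_gbps(channels=8,  mts=3200, cores=64)  # ~3.2 GB/s/core
sp5     = per_core_gbps(channels=12, mts=4800, cores=96)  # ~4.8 GB/s/core
print(f"{current:.1f} -> {sp5:.1f} GB/s per core ({sp5 / current:.2f}x)")
```

Under those assumptions, each core gets about 1.5x more peak bandwidth at launch speeds, with further gains as faster DDR5 modules arrive.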
Fox2232:

No, they are not dead. Their trajectory may lead them there, but they have so much money and so many other businesses that they could sit on their asses and not make a single CPU for 10 years. Sometime in the future, they'll release a chip that truly fixes those tons of security issues and delivers a big IPC improvement. In the meantime, they still have an advantage in high-fps gaming (the 160 fps+ range, so people with 144 Hz screens can easily pick whichever brand they prefer). And that actually served as efficient marketing, which was always absurd, because that advantage required a low resolution (720p) or 1080p with a top-end card. Most of the people who bought Intel for high achievable fps ended up with a cheap graphics card, and they would have been better off with a cheaper CPU and a better GPU.
I agree. Anyone who thinks Intel is dead is clueless. Intel has a new CEO, their own Lisa Su, aka Patrick Gelsinger, who really knows his stuff. Once Intel is on their own 7 nm they will be a force. Their chiplet designs and 3D stacking are so much more advanced than anything else in the market today.