Rumor: Next-gen AMD Epyc processors will get 64 CPU cores

Oh boy oh boy oh boy, imagine how many layers of DRM you can run with that!
Any more news on Ryzen 2 on the AM4 platform, or is it too early for speculation?
schmidtbag:

Unless Intel managed to copy AMD with the multi-die configuration (in which case they were a little too quick to laugh at the whole "glued together" thing), I don't see how Intel could ever compete with this within a year of its release. There is no way they could ever make a single-die 64-core CPU without making sacrifices while keeping even a semblance of decent value. Their 18-core consumer parts are already absurdly expensive.
I'm guessing Intel will adopt a similar approach soon. They've used MCM before and they can do it again. They're certainly not going to risk losing the lucrative server market to AMD.
Noisiv:

They are down because server contracts are not coming up as expected, and because they guided a revenue decrease of 15 percent sequentially.
The thing is, that was actually in line with expectations. These days, in order for AMD stock to rise on earnings, they need to blow expectations out of the water. The stock was also hammered afterwards by the deflating cryptocurrency bubble (it seems Wall Street only recently became aware of this).
Silva:

That has been a growing concern of mine for the last 10 years. Both Intel and Nvidia product prices have spiked really hard.
They had to raise prices; you can't just have any old rabble purchasing their fine equipment. o_o
D3M1G0D:

I'm guessing Intel will adopt a similar approach soon. They've used MCM before and they can do it again. They're certainly not going to risk losing the lucrative server market to AMD.
They're going to have to, but like I said, I think they were a little too quick to point and laugh at AMD's approach. They're going to get some backlash for that. I am curious whether they have anything lined up. I don't think designing an MCM effectively is something they can pull off in just a matter of months; this will probably need some major revisions to the architecture. I guess that's good, since we're long overdue for such a thing. But it means Intel's competition to this will arrive pretty late.
Elfa-X:

They had to raise prices; you can't just have any old rabble purchasing their fine equipment. o_o
To be fair, RAM prices are often more than twice what they were in mid-2016, and demand for GPUs has gone up.
schmidtbag:

I know Intel has that new Xeon Phi, but that architecture is a little too limited and niche to be a direct competitor to something like this. Unless Intel managed to copy AMD with the multi-die configuration (in which case they were a little too quick to laugh at the whole "glued together" thing), I don't see how Intel could ever compete with this within a year of its release. There is no way they could ever make a single-die 64-core CPU without making sacrifices while keeping even a semblance of decent value. Their 18-core consumer parts are already absurdly expensive.
Intel is already working on a multi-die config for their CPUs, and so is Nvidia for their GPUs. Everyone has already admitted that making a single monolithic chip with lots of cores has its limits and is financially a nightmare in terms of wasted dies with defective cores. Basically, Intel was caught with their pants down by AMD when it comes to making a CPU from multiple dies. The glue remark was a really stupid "answer/joke" from Intel's marketing team, especially when we consider how the legendary Conroes were made...
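
To put rough numbers on the wasted-die point, here is a minimal sketch in Python using a simple Poisson yield model; the die area and defect density are assumptions for illustration, not real process figures.

import math

# Poisson yield model: the chance that a die has zero fatal defects falls
# off exponentially with die area. All numbers below are assumptions.
DEFECT_DENSITY = 0.002                      # defects per mm^2 (assumed)
CHIPLET_AREA_MM2 = 200.0                    # one small multi-core die (assumed)
MONOLITH_AREA_MM2 = 8 * CHIPLET_AREA_MM2    # hypothetical single-die 64-core part

def poisson_yield(area_mm2, defect_density):
    # Probability that a die of this area comes out defect-free.
    return math.exp(-area_mm2 * defect_density)

print(f"small chiplet yield:    {poisson_yield(CHIPLET_AREA_MM2, DEFECT_DENSITY):.1%}")   # ~67%
print(f"64-core monolith yield: {poisson_yield(MONOLITH_AREA_MM2, DEFECT_DENSITY):.1%}")  # ~4%

Under these assumed numbers, eight small dies each yielding around 67% beat one huge die yielding around 4% by a wide margin, which is exactly the "wasted dies with defective cores" nightmare being described.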
Even servers have their limits in multi-threading. Well, for supercomputers they are perfect! But we have special GPUs for that purpose... umm... still great!!!
Silva:

That has been a growing concern of mine for the last 10 years. Both Intel and Nvidia product prices have spiked really hard.
Proof? Because, I mean... you don't have any. Intel has 100% been limiting their products, but their overall range of processor prices has remained the same, with the exception of a few of the highest-end processors. AMD, if anything, has gotten cheaper, though with Ryzen their overall range of processor prices is now the same as it used to be. AMD/Nvidia graphics cards are in the same overall price range as well, aside from things like the Titan, which didn't exist before and is a new bracket. There were $1000 CPUs "back in the day" that would be $1300+ today with inflation. There were $550-800 GPUs "back in the day" that would be worth $800-1000 today with inflation.
warlord:

I don't like nor care about the core-count race. All these strange things are going to help greedy companies like Ubisoft give us 6-way DRM solutions just for their money. I hate these kinds of trash programmers destroying our systems. They should use their degrees as toilet paper at best.
Lol, so you don't like increases in hardware performance because developers can use said performance increases? You might be on the wrong forum.
Aura89:

Proof? Because, I mean... you don't have any. Intel has 100% been limiting their products, but their overall range of processor prices has remained the same, with the exception of a few of the highest-end processors. AMD, if anything, has gotten cheaper, though with Ryzen their overall range of processor prices is now the same as it used to be. AMD/Nvidia graphics cards are in the same overall price range as well, aside from things like the Titan, which didn't exist before and is a new bracket. There were $1000 CPUs "back in the day" that would be $1300+ today with inflation. There were $550-800 GPUs "back in the day" that would be worth $800-1000 today with inflation.
Paid 188.19€ for my i5 2500k back in August 2011. Now an i5 8600k costs 280€ on average (minimum 277€, max 294€). Paid 330€ for my 9800GTX back in 2008. Now a 1080 Ti costs 832€ on average (minimum 800€, max 880€). So Intel was charging consumers 49% more for the same processor, and if it wasn't for Ryzen, it would still be selling modified Sandy Bridges... Meanwhile people still defend Intel as the good tech company... Back in 2008, for 330€ I could buy the best Nvidia had to offer. Now? Consumers pay 2.5 times more and don't argue about it, even when said company shits in their faces with products wearing the "Ti" branding up front. Now consider the fact that my country's minimum wage, as of January 2017, is 557€ a month. I mean, I've been here long enough to know and have proof...
"this seems and deems plausible" Should just be "this seems plausible". You could replace "deems" with "is deemed", but that adds nothing to "seems" in this case and so would be clunky and over-written. Not a criticism, just a pointer 🙂 I'd like to have any second language one-tenth as good as your English.
Silva:

Paid 188.19€ for my i5 2500k back in August 2011. Now an i5 8600k costs 280€ on average (minimum 277€, max 294€). Paid 330€ for my 9800GTX back in 2008. Now a 1080 Ti costs 832€ on average (minimum 800€, max 880€). So Intel was charging consumers 49% more for the same processor, and if it wasn't for Ryzen, it would still be selling modified Sandy Bridges... Meanwhile people still defend Intel as the good tech company... Back in 2008, for 330€ I could buy the best Nvidia had to offer. Now? Consumers pay 2.5 times more and don't argue about it, even when said company shits in their faces with products wearing the "Ti" branding up front. Now consider the fact that my country's minimum wage, as of January 2017, is 557€ a month. I mean, I've been here long enough to know and have proof...
Literally nothing of what you have said means diddly squat, because you're comparing apples to oranges: a GTX 1080 Ti to a 9800 GTX (there is no Ti variant of the 9800 GTX, so they can't be compared), and the 2500k to an 8600k (which you can't do either, since you'd have to compare it to an 8500k, which doesn't exist). The GTX 1080 Ti would be more similar to an 8800 Ultra ($830+), not a 9800 GTX.

Not only that, but the 9800 GTX was a weak and cheap release, put out just so they could release something. It was a sidestep from the 8000 series. By cheap release, I literally mean a cheap release. The x800 GTX tier generally launches in the $500-600 range:

7800 GTX - $599
8800 GTX - $599
9800 GTX - $349
GTX 280 - $649
GTX 480 - $499
GTX 580 - $499

So don't come here cherry-picking your cards trying to say a company is getting "more expensive", because you cherry-picked. The 9000 series was an efficiency and node-shrink refresh of the 8000 series. It was a sidestep, that is all, and that is why it was cheap. The entire lineup was cheap.

In regards to the i5 2500k, that had a release price of $216. Now you compared it to the 8600k? Why? Where's the logic? You'd compare it to the 8500k, not the 8600k. But there is no 8500k as of yet, so you can't even compare it. That'd be like complaining that a GTX 1080 is more expensive than the GTX 970 was. But there are some you can compare it to. Oh, and before I do: $216 in 2011 would be about $235 now, which isn't too far off from the 8600k's $257 MSRP, and considering cost differences, materials, etc., is well within reason.

2500 - $205
2500k - $216
3550/3570 - $205
4570/4590 - $192
6500 - $192
7500 - $192

Now, you could say you're unhappy they got rid of the "k" model of your x5xx series, but again, that would just mean those prices would be about the same, possibly a little cheaper, and possibly up to $230, given the technical, irrefutable facts.

As to your "I paid" statement, by the way: no one cares. What something's MSRP is, is what matters, not what you were able to find on some sort of deal, or where you are regionally. MSRP is the only thing that matters. MSRP can change between countries in terms of actual worth, but in general (not always) the differences between the MSRPs of products within a country are the same as the differences in other countries. The world doesn't care what your specific region's pricing is. Computer hardware doesn't even get developed or produced there, so your minimum wage has absolutely nothing to do with their costs and zero to do with a company being more or less expensive. Your "examples" are done in such a way that I could do the exact same thing the opposite way and state you're wrong for it.

http://hexus.net/media/uploaded/2017/3/1c9a8251-8039-4dc6-9e84-40f92178c220.png

^ To refute facts is madness. And I'm not defending Intel, by the way. You'd never see me defending Intel, unless what is being said is a direct and complete lie. I hate Intel, with a passion almost. In my opinion, their processors are too expensive, but not because of their naming scheme. Their naming schemes, and their prices, aside from their higher-end processors, are right in line with history. And that's the point: you can't say something is more expensive than it was before when that literally isn't true. You can't compare a CPU or a GPU to something it's not replacing; you can only compare prices with what a product directly replaces, not the one before or after it.

To me, Intel processors are only too expensive because they've been releasing the same stuff for years now, and only this year actually advanced at all. And not nearly enough: in my opinion, they only went halfway to where they should have to be worthwhile.
Sorry, but to me the inflated-prices logic is always pure BS. It took half of my salary back in 2007 to buy an 8800 GTX, and it now took half of my salary to buy a 1080 Ti; however, my salary in 2007 was like triple the minimum wage and now it's 7 times the minimum wage here. Following similar logic, current high-end PCs should cost 4 times more.
xrodney:

Sorry, but to me the inflated-prices logic is always pure BS. It took half of my salary back in 2007 to buy an 8800 GTX, and it now took half of my salary to buy a 1080 Ti; however, my salary in 2007 was like triple the minimum wage and now it's 7 times the minimum wage here. Following similar logic, current high-end PCs should cost 4 times more.
That's just insanity there. By that logic, if we don't compare based on inflation, we should all be pissed that a car doesn't cost $800 like it did in 1903. Inflation is everything. The number is meaningless; the value of that number at the specific time is what is important. And sadly, it has nothing to do with minimum wage. It should; minimum wage should go up the exact same as inflation, but that's not reality, so to base anything on minimum wage instead of what a dollar is worth is, again, insane.

Though again, you compared an 8800 GTX to a 1080 Ti, instead of the correct comparison, an 8800 Ultra to a 1080 Ti. But this "triple" and "7 times" minimum wage stuff is meaningless. Since you did not say what minimum wage is there, as well as what you get paid, it's pretty hard to figure out what you're even saying. But from the sounds of it, you must have been making $1200 a month before (or the equivalent in your currency if you're not in the USA) and now be making $1400, and somehow your minimum wage must have decreased drastically.

8800 GTX = $599
GTX 1080 Ti = $699

These are the MSRPs of the cards. Any other increases due to your country's currency, your shops jacking up their prices, etc. have zero to do with whether Nvidia themselves are "more expensive".
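
For what it's worth, the inflation adjustment being argued over here is just a multiplication; a minimal sketch in Python, where the ~9% cumulative inflation figure is an assumption chosen to reproduce the $216 -> ~$235 numbers quoted above:

def adjust_for_inflation(price_then, cumulative_inflation):
    # Convert a historical price into today's money.
    return price_then * (1.0 + cumulative_inflation)

# The i5 2500k's $216 MSRP in 2011, with an assumed ~9% cumulative
# inflation through 2017, lands near the $235 figure quoted earlier.
print(f"${adjust_for_inflation(216.00, 0.09):.2f}")  # -> $235.44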
Aura89:

Inflation is everything. The number is meaningless; the value of that number at the specific time is what is important.
Well, to be picky, inflation and average wage (purchasing power; not sure if there is a better term in English for that) :P I agree with you, but take into account that we are from a lot of different countries. Where I live, last year's inflation was a bit over 40% (official), but wages only rose between 20 and 30%.
To add to the price logic: I do not see why the price of the top card should stay constant or fixed. There are many cards out there for each segment of the market. It's like complaining that in 200X someone could afford the top-tier card, while now in 201X a person with a similar salary has to settle for the lower-spec model (for example, a GTX 1070). Game prices went up, milk and house prices went up, and salaries did not go up as much as those other things (unless you moved to a better job). We know this.

There is a card for $300 in 2008 and one for $300 now, and the new one outperforms the old one for the same money. Games look better now, and for the same $300, if you could set max quality in 2008, now you can't; you probably set medium quality on the top-tier games. But the gaming experience improved overall. Better experience, better frame rates, better game visuals, higher prices to stay on top. It makes sense to me; that does not mean I'm happy with it.

Also, there is still that elite component that people seek. I earn better now than in 2008, and if in 2008 or thereabouts I sweated to buy two GeForce GTX 260s, now I would not buy them, because I would want something more, because my life status changed, so I would buy two GTX 1080 Tis. Marketing people know that, and they tailor the top product's price for a specific audience and other products' prices for the average/mass market. It's their job. If they want to make the better cards, they need to hire the best engineers, and they need to build technology, not just use it, and they need money for that. Lots of it. And because cards are still made in poor countries. Imagine if they were manufactured in Europe or the USA, where unions defend your salary and rights and everything else; then you would see a real price bump on those cards.
With Crytek sort of flying under the radar lately, they have decided to pass the Y-shaped baton on to AMD. Epyc Ryzen processors are capable of hundreds of gygaflyps of processyng pyrformance.
That will be an amazing CPU!
What about real improvements to the memory controller and HT/DMI... whoops, "Infinity Fabric"? Amdahl's law (which is a true mathematical law, not like that marketing one of Moore's) does not forgive; Vega was just a first tango down... Yes, laws are meant to be bypassed, but only with the right tools.
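
For concreteness, Amdahl's law caps the speedup of N cores at S(N) = 1 / ((1 - p) + p/N), where p is the fraction of the work that can run in parallel. A minimal sketch, with the parallel fractions assumed for illustration:

def amdahl_speedup(parallel_fraction, cores):
    # Amdahl's law: S(N) = 1 / ((1 - p) + p / N)
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Even a 99%-parallel workload gets nowhere near 64x from 64 cores.
for p in (0.90, 0.95, 0.99):  # assumed parallel fractions
    print(f"p = {p:.2f}: 64 cores -> {amdahl_speedup(p, 64):.1f}x")
# p = 0.90: 64 cores -> 8.8x
# p = 0.95: 64 cores -> 15.4x
# p = 0.99: 64 cores -> 39.3x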
Been saying for a while now: the clock wars are now officially over. It will all be about cores from now on. At least till we get new technologies that allow us to break well past the 5 GHz barrier without needing high-end cooling solutions, if that's even possible. Till then, more cores are the best bet for continued performance gains, with programs making use of compute power spread over many cores rather than a few strong ones.
I know this isn't a consumer part, but why do I feel like the core-count race just did in 2 years what 15 years of clock-speed races couldn't? I mean, even with 64 cores... would the next big thing at 128 cores in 2 or 3 years really be beneficial for anything but pure rendering/calculating farms, etc.? Or would the real speed increases come from bigger/faster CPU caches, higher IPC and clock speeds, interface improvements, etc.?

If they made a fast enough 16-core consumer CPU, I am confident you could get like 8 years out of it, provided you bought a chipset that had a brand new PCIe interface like 5.0 so that you wouldn't limit your GPU down the line. Whereas before, I always tell myself that I will be good for ages with a CPU and end up caving. I went from a 9450 to an i7 870 to a 3770k to a 6700k and got a massive performance boost each time, sometimes 50% faster depending on the game. And I mean real-world improvements, like dips down to 40 frames in Far Cry 2 becoming a 60-frame minimum.

Give me 8-16 cores and a chipset loaded with M.2 slots and I'm good. Throw in PCIe 4.0... Hopefully in a year what I want will be out.