AMD Epyc 7763 CPUs break Cinebench world record, crush Intel Xeon in performance

tty8k:

There is one more thing, probably more important than the Cinebench score, tech specs or even price: proven reliability. Nobody will decide to upgrade to a product at this level based on enthusiasm alone. Even if it's great on paper, it will still take some time to prove reliable for business.
What about security? Anything to say about proven security and vulnerability mitigations?
RealNC:

Who cares 😛
The market segment that actually makes the money for these companies does: enterprise customers. The costs in a datacenter are, as has been said, driven by thermals more than anything else - the highest cost associated with running a DC is cooling it, followed by redundancy, power and rent on the space. You buy new servers every 5 years (longer, if you don't care about warranties), but you have to cool and power them 24/7, UPS systems need battery swaps every 2-3 years, and you have to pay people to maintain the facility and the servers in it. As a result, most companies that operate their own servers lease space in hosted datacenters to keep the complexity down.

The average cost of operating a Tier IV datacenter (the "five nines" level, 99.995% uptime) is around $25,000 a month per square foot, or around $750,000 per rack (datacenters count a rack as 30 sq ft, to account for cooling space and the doors opening). And yes, this is *still* considerably cheaper than AWS/Azure/Google/etc. for the same compute power - cloud hosting is super expensive for high data throughput applications. Getting your compute into as small a physical area as possible is therefore key to making money on it - density is king.

To put that in perspective, a fully kitted-out Dell R7525 server with the chips in the article costs around $75,000, and you can typically fit 16-18 of them in a single 42U rack, with typical server lifespans being 5 years. Server costs are pretty small compared to the cost of running them.
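Back-of-the-envelope, using the figures quoted above (all of them are the commenter's estimates, and 17 servers per rack is just the midpoint of the 16-18 range), the hardware really is a rounding error next to the operating cost over a 5-year lifespan - a quick sketch:

[code]
# Rough rack economics using the figures from the comment above.
# All numbers are the commenter's estimates, not measured or vendor-supplied data.

SQFT_PER_RACK = 30            # rack footprint incl. cooling space and door clearance
OPEX_PER_SQFT_MONTH = 25_000  # quoted Tier IV operating cost, $/sq ft/month
SERVER_PRICE = 75_000         # fully kitted-out dual-socket server (e.g. Dell R7525)
SERVERS_PER_RACK = 17         # midpoint of the quoted 16-18 per 42U rack
LIFESPAN_YEARS = 5            # typical server replacement cycle

opex_per_rack = SQFT_PER_RACK * OPEX_PER_SQFT_MONTH * 12 * LIFESPAN_YEARS
capex_per_rack = SERVERS_PER_RACK * SERVER_PRICE

print(f"5-year operating cost per rack: ${opex_per_rack:,}")
print(f"Server hardware per rack:       ${capex_per_rack:,}")
print(f"Hardware share of total spend:  "
      f"{capex_per_rack / (opex_per_rack + capex_per_rack):.1%}")
[/code]

Roughly $1.3M of hardware against $45M of operating cost per rack over five years on those figures, which is why density and performance per watt matter so much more than the sticker price of the servers.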
tty8k:

There is one more thing, probably more important than the Cinebench score, tech specs or even price: proven reliability. Nobody will decide to upgrade to a product at this level based on enthusiasm alone. Even if it's great on paper, it will still take some time to prove reliable for business.
Tell that to some of the largest corporations and organizations in the world that bought them anyway.
@tty8k Reliability, as in the CPU platform? Well, server platform manufacturers have standards for reliability and for the hardware parts they choose. As in reliability proven through the years? OK, sounds logical, but sometimes a new platform should and would be implemented anyway. Nobody can see the future, but everyone can see the past: higher electricity bills, more heat and more vulnerabilities.
King Mustard:

They're cheaper and faster but cost a little more to run, looking at the higher TDP. I imagine that matters to datacenters. Still - overall better from the looks of things.
Um... a 10-watt higher TDP, even if that really does translate to 10 watts more in the real world, in exchange for better performance, means datacenters can get more done for less. Yes, the total "package" costs more to run, but what datacenters care about is how much money they have to spend to get whatever project they have going finished, power use and running costs included. In other words, if something used 50% more power than a competitor but got projects done three times as fast, they'd be spending less money per project even with a higher TDP (quick numbers below). And then there's simply the fact that time is money.
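To put that power-vs-throughput point into numbers - purely illustrative figures, not from the article, and assuming energy per job is simply power draw multiplied by time to finish:

[code]
# Illustrative energy-per-job comparison; the wattages and runtimes are made up
# for the example, not taken from the article or any benchmark.

def energy_per_job(power_watts, hours_per_job):
    """kWh consumed to finish one job, assuming a constant power draw."""
    return power_watts * hours_per_job / 1000

baseline   = energy_per_job(400, 3.0)        # competitor: 400 W, 3 h per job
challenger = energy_per_job(400 * 1.5, 1.0)  # 50% more power, finishes 3x faster

print(f"baseline:   {baseline:.2f} kWh/job")   # 1.20 kWh
print(f"challenger: {challenger:.2f} kWh/job")  # 0.60 kWh -> half the energy per job
[/code]

Half the energy per job, and the machine is free again three times sooner - which is the "time is money" part.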
tty8k:

There is one more thing, probably more important than the Cinebench score, tech specs or even price: proven reliability. Nobody will decide to upgrade to a product at this level based on enthusiasm alone. Even if it's great on paper, it will still take some time to prove reliable for business.
Both Intel and AMD are proven more than enough for the server environment, so businesses will take whatever CPU handles the jobs it gets thrown at it in the most desirable way. Most companies are used to having both CPUs in their server racks anyway, since Intel does one thing better than AMD and vice versa, whatever that is: pure performance, cost effectiveness or being best at something specific. There's not much place for fanboyism in that market.
deusex:

Good for AMD. Intel deserves a kick to the balls for milking customers for years.
You can't really look at it like that; the regular consumer market is not the same as the server market, where AMD has always had a much larger role than what ended up in our PCs.
Mineria:

You can't really look at it like that; the regular consumer market is not the same as the server market, where AMD has always had a much larger role than what ended up in our PCs.
You're saying that AMD has traditionally been stronger in the server market than on the desktop?
https://cdn.mos.cms.futurecdn.net/hKJkC49zDKKVjegHz6WjZc-970-80.png.webp
[spoiler]
https://www.extremetech.com/wp-content/uploads/2020/08/AMD-Opteron.png
https://www.hardwaretimes.com/wp-content/uploads/2020/06/image-105-1024x672.png
[/spoiler]
AMD is back, baby. Well, maybe if they had slightly better GPUs, but for CPUs they are finally ahead on all fronts. Once we get an APU with a GPU as powerful as a 3060 Ti, I bet it'll sell like hotcakes as people on a budget go with APUs instead of forking out for both. I don't know what the score is at the moment between Intel and AMD (desktop CPUs). I mean, Intel used to have a massive advantage over AMD, but I doubt that now. Maybe Steam could give us a better idea of the recent CPU gains and losses.
There was a time when Opterons were highly regarded in the server/data centre space. Now they're back. What I would like to see is more ARM solutions developed and deployed. Old habits die hard.
This whole situation leaves me wondering if there could be a mechanism to prevent companies from going stagnant and tyrannical after securing a market - but if so, wouldn't that make it exponentially harder for a competitor like AMD to ever catch up? There's no easy solution to this. I hope this doesn't mean another 10 years of $499 8-cores from AMD this time around.
tty8k:

Serious businesses don't take whatever does the job; before investing tons of money, there are many deciding factors beyond enthusiasm about performance and specs on paper. Most companies don't have both - most run on Intel by a huge margin, and as far as I know AMD is still an "experiment" in the server business. And to be honest, coming up with a Cinebench benchmark in this CPU class looks a bit childish, well suited to fanboyism and a kids' channel. It's great that they make progress and I do hope they come up with competitive products, but they're still very much in short pants in the big business.
AMD is used a lot in the racks I have seen, mainly in blade units; the bigger storage and DB servers mainly use Intel, so they do mix. I agree regarding the Cinebench benchmark though. 🙂
I very much doubt AMD will carry on pumping out 8 cores for very long. All along the Zen range we have seen core-count increases for the masses. They could make the 5900X, or 12 cores, mainstream if they wanted, but more likely newer gens will move to 2 x 8 cores, so 16 cores / 32 threads will be the norm in 2 or 3 years from now.