AMD Ryzen 7 3800X review

I wonder if any (or many) game studios use Intel's compiler to build? It has been shown to disable optimal code paths at runtime if it detects a non-Intel CPU. It's just strange how Zen 2 often beats Intel considerably in single-threaded and multi-threaded tests, but when it comes to games, Intel still just has a slight edge. An edge that I don't think is worth caring about, but still.
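For the curious, the check in question reportedly keys on the CPUID vendor string rather than on actual feature flags. A minimal sketch of what vendor-gated dispatch looks like (my own illustration with my own function names, not the compiler's actual code):

```cpp
#include <cpuid.h>   // GCC/Clang CPUID intrinsic wrapper (x86 only)
#include <cstdio>
#include <cstring>

// Read the 12-byte vendor string from CPUID leaf 0:
// "GenuineIntel", "AuthenticAMD", etc.
static bool is_genuine_intel() {
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx)) return false;
    char vendor[13];
    std::memcpy(vendor + 0, &ebx, 4);  // vendor string is packed EBX/EDX/ECX
    std::memcpy(vendor + 4, &edx, 4);
    std::memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';
    return std::strcmp(vendor, "GenuineIntel") == 0;
}

int main() {
    // Gating on the vendor string instead of feature bits (AVX2, etc.)
    // is what locks a perfectly capable non-Intel CPU out of the fast path.
    std::puts(is_genuine_intel() ? "fast path: vendor check passed"
                                 : "generic path: non-Intel CPU");
}
```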
Richard Nutman:

I wonder if any (or many) game studios use Intel's compiler to build? It has been shown to disable optimal code paths at runtime if it detects a non-Intel CPU. It's just strange how Zen 2 often beats Intel considerably in single-threaded and multi-threaded tests, but when it comes to games, Intel still just has a slight edge. An edge that I don't think is worth caring about, but still.
The compiler might have something to do with it, but even if you removed all architecture-specific advantages, I'm sure Intel would still get a slight edge, mostly because of lower latency.

For most heavy-compute tasks, latency doesn't matter, because each thread is churning through a lot of data in its own little bubble, not really needing to synchronize downstream all that often. The more a thread can do by itself without synchronizing, the less a delay will have any real impact. This is why Cinebench, for example, works so well on AMD: the only time the cores really need to synchronize is to show their progress. They're not really working together; each one handles its own chunk of data by itself and submits the work to the complete scene when it's done.

For an analogy, take a phone conversation with someone on the other side of the planet: sometimes there can be as much as a one-second delay, but the conversation can still flow as smoothly as if you were face-to-face, as long as each person talks long enough for the other to come up with a response at the appropriate time. But have you ever accidentally talked over someone on the phone, and both people stopped speaking? There's usually a long awkward pause, at which point both people decide to speak up again, only to talk over each other a second time, with yet another long awkward pause. This is because both sides are trying to synchronize, and synchronization is inefficient when there's such a long delay. When people accidentally talk over each other face-to-face, the problem is typically solved in less time than it took to wait for that first awkward pause over the phone, because the delay is eliminated.

Games have code that is heavily dependent upon synchronization, particularly with the GPU. So even though AMD doesn't have that big of a latency deficit, it becomes more noticeable when data is being synchronized millions of times per second. The more synchronization you have to do, the worse it gets, which is why AMD performs worse as frame rate goes up (that, and the GPU is being bottlenecked).
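To put the Cinebench-vs-games contrast in code, here's a toy sketch (my own illustration, not from either workload): one version where each thread churns its own chunk and synchronizes exactly once at the end, and one where every step funnels through a shared counter, paying the inter-core latency millions of times.

```cpp
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

constexpr int  kThreads       = 8;
constexpr long kItersPerThread = 1'000'000;

// Cinebench-style: each thread works in its own bubble and synchronizes
// exactly once at the end, so inter-core latency is paid kThreads times.
long independent_chunks() {
    std::vector<long> partial(kThreads, 0);
    std::vector<std::thread> pool;
    for (int t = 0; t < kThreads; ++t)
        pool.emplace_back([&partial, t] {
            for (long i = 0; i < kItersPerThread; ++i) partial[t] += 1;
        });
    for (auto& th : pool) th.join();
    long total = 0;
    for (long p : partial) total += p;
    return total;
}

// Game-style (deliberately exaggerated): every iteration synchronizes
// through one shared counter, paying inter-core latency millions of times.
long shared_counter() {
    std::atomic<long> total{0};
    std::vector<std::thread> pool;
    for (int t = 0; t < kThreads; ++t)
        pool.emplace_back([&total] {
            for (long i = 0; i < kItersPerThread; ++i)
                total.fetch_add(1, std::memory_order_relaxed);
        });
    for (auto& th : pool) th.join();
    return total.load();
}

int main() {
    std::printf("independent: %ld\n", independent_chunks());
    std::printf("shared:      %ld\n", shared_counter());
}
```

Time the two functions and the second is dramatically slower despite doing the same arithmetic, and the penalty grows with inter-core latency, which is the point being made about games.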
Considering the next-gen consoles coming out in 2020 are rumored to have 8-core/16-thread CPUs, I'm not sure I would recommend anything lower than that if someone intends to keep their CPU for 6-7 years. The 7600K looked good when it was released, but recent revisits show it struggling to maintain an "acceptable" 1% low FPS in some newer titles because it is limited to 4 threads. There are a couple of threads on the BF V forum with 7600K owners complaining about performance. 6/12 and 8/8 will likely be enough moving forward, but devs often lazily optimize for consoles only, and I'm not sure I would be confident with anything less than 8/16, since that is what the next-gen consoles will likely have; considering how weak they are, devs will have to use those threads to push them to their limit toward the end of the generation. Current consoles have an 8/8 CPU, and some newer titles already seem to be optimized for 8 threads.
schmidtbag:

Games have code that is heavily dependent upon synchronization, particularly with the GPU. So even though AMD doesn't have that big of a latency deficit, it becomes more noticeable when data is being synchronized millions of times per second. The more synchronization you have to do, the worse it gets, which is why AMD performs worse as frame rate goes up (that, and the GPU is being bottlenecked).
Yeah, but any difference in FPS over 144 is not worth talking about, IMO. Personally, I think it's dangerous to assume a difference in FPS at high frame rates will translate to the same percentage difference in a GPU-bound scenario at lower FPS.
Richard Nutman:

I wonder if any (or many) game studios use Intel's compiler to build? It has been shown to disable optimal code paths at runtime if it detects a non-Intel CPU. It's just strange how Zen 2 often beats Intel considerably in single-threaded and multi-threaded tests, but when it comes to games, Intel still just has a slight edge. An edge that I don't think is worth caring about, but still.
It's likely down to frequency and the latencies involved in trying to produce 120+ FPS. Also, some games are simply more Intel-optimized, which gives n percent better performance, and that percentage translates to a bigger absolute number at very high frame rates, so it seems like a larger difference than it really is. See this video for a 3700X and 9900K compared at 4 GHz on all cores. AMD wins in CS:GO in that test, and in many of the workstation tests. For the most part, what this means is AMD needs to get its boost frequency up to around 5 GHz to make the very-high-FPS gamers choose them over Intel, which we hopefully will see on the 7nm+ process next year.
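The arithmetic backs that up: a constant percentage gap produces a bigger FPS number at high frame rates, even though the per-frame time it represents shrinks. A quick sketch (assuming a flat 5% deficit, my number, purely for illustration):

```cpp
#include <cstdio>

// A constant 5% CPU deficit: the FPS gap grows with frame rate while the
// actual per-frame time difference shrinks.
int main() {
    const double deficit = 0.05;  // assumed flat 5% gap (illustrative)
    for (double fps : {60.0, 144.0, 300.0}) {
        double slower = fps * (1.0 - deficit);
        double dt_ms  = 1000.0 / slower - 1000.0 / fps;  // frame-time delta
        std::printf("%3.0f vs %5.1f FPS -> gap of %4.1f FPS, but only %.3f ms/frame\n",
                    fps, slower, fps - slower, dt_ms);
    }
}
```

At 60 FPS the 5% gap is 3 FPS and about 0.9 ms per frame; at 300 FPS it's 15 FPS but only about 0.18 ms per frame, which is why the big headline numbers overstate the real difference.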
The 3800X is more interesting than I thought it'd be. It's still pretty much a stopgap between the 3700X and 3900X, but the bump over the 3700X is more than I expected; still maybe not as good bang for your buck as the 3700X, but interesting nonetheless. Also, karma777police never disappoints.
MonstroMart:

Yeah, but any difference in FPS over 144 is not worth talking about, IMO. Personally, I think it's dangerous to assume a difference in FPS at high frame rates will translate to the same percentage difference in a GPU-bound scenario at lower FPS.
I share your opinion, though I know others here would strongly disagree. I personally am satisfied with 60 FPS, which is why I'm still fine with my mediocre 1500X.
schmidtbag:

I share your opinion, though I know others here would strongly disagree. I personally am satisfied with 60 FPS, which is why I'm still fine with my mediocre 1500X.
Personally, I like to have between 80 and 90. I have a 144 Hz screen, and honestly I can tell when my FPS drops into the 60 range. But I'll be honest: I can't tell the difference between 100 and 144 FPS. Maybe some can, but personally I can't.
So, about +10% base clock vs the 3700X, yet most of the benches show only about a 2-3% difference. I know bin milking is a common practice, but I would have expected better results, tbh.
karma777police:

In other words, AMD CPUs still suck for gaming.
They don't suck at all for gaming now. I've just gone from a 5930K to a 3700X and couldn't be happier; they're great for gaming 🙂
alanm:

So, about +10% base clock vs the 3700X, yet most of the benches show only about a 2-3% difference. I know bin milking is a common practice, but I would have expected better results, tbh.
I'm wondering if these results are from an older BIOS/AGESA. For my 3700X, I applied the new chipset drivers, and ASRock released an updated BIOS just a couple of days ago for my board. My Cinebench scores are right there with the 3800X numbers HH shows, and the 3700X scores shown are right about what my chip used to get before I updated. I'm assuming the 3800X would be even a notch better than what's shown here, depending on which version is used for testing.
"Overclocking is the weak spot for Ryzen." Why do you keep saying this? β€’ It begs the question that manual OC is even desirable. β€’ It ignores the fact that Ryzen "overclocks" itself just fine. β€’ It makes a strength appear to be a weakness. People will read this one sentence and conclude that Ryzen is seriously lacking. All the words after that won't matter.
I still see moronic fanboyism running amok with the Intel shills.
Rich_Guy:

They don't suck at all for gaming now. I've just gone from a 5930K to a 3700X and couldn't be happier; they're great for gaming 🙂
I'm going from an i5-3570K to a 3800X. Keeping the Vega 64 EKWB though; seems I've changed almost everything else. Can't wait to actually use it watercooled. Edit: I have a 6670 LP in it atm... best air-cooled card I have left.
karma777police:

Conclusion: the 720p difference between the 9900K and the 3800X is going to creep in at 1440p with a 4000-series Nvidia card; in other words, AMD CPUs still suck for gaming.
Amen, brother! You and I will stick to 720p gaming while the rest of the uninformed purchase Ryzen and game on that newfangled 1440p nonsense! In all seriousness: if you're gaming at 1080p, Intel is a good choice. Anything above that, and either CPU is a good choice. If you're spending the bucks on a 9900K, though, 1080p is probably not the resolution you're going to run at. Unless, of course, you like those fancy 240 Hz monitors; then by all means, get the 9900K. Your creep explanation makes no sense, however, and that will not be forgiven.
Very nice CPU, but I think the 3700X is a better buy than this one.
MonstroMart:

Personally, I like to have between 80 and 90. I have a 144 Hz screen, and honestly I can tell when my FPS drops into the 60 range. But I'll be honest: I can't tell the difference between 100 and 144 FPS. Maybe some can, but personally I can't.
I don't think anyone can spot that kind of difference, but the good thing about having a higher FPS than needed is that it compensates for frame drops much better. For example, if a game is running at 80-90 frames and something more taxing suddenly happens, the frame rate can drop below the 60 mark and you are going to notice it. But if the game is running at more than 120 FPS and the same taxing stuff happens, the frame rate will probably only drop to the 80-90 mark, and we are not going to notice it. So more is better, unless I'm mistaken... Great review as always!
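Edit: the reasoning checks out if you think in frame times rather than FPS, since a taxing scene adds roughly the same extra milliseconds per frame regardless of your baseline. A quick sketch (the 6 ms spike is an assumed number, purely illustrative): the same spike that pushes an ~85 FPS baseline under 60 only pulls a 120 FPS baseline down to around 70, a far less jarring drop.

```cpp
#include <cstdio>

// A taxing scene adds about the same extra work per frame regardless of
// the baseline frame rate, so a higher baseline leaves a higher floor.
int main() {
    const double spike_ms = 6.0;  // assumed extra per-frame cost (illustrative)
    for (double base_fps : {85.0, 120.0}) {
        double frame_ms = 1000.0 / base_fps;            // baseline frame time
        double dropped  = 1000.0 / (frame_ms + spike_ms);
        std::printf("%.0f FPS baseline -> %.1f FPS during the spike\n",
                    base_fps, dropped);
    }
}
```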
Evildead666:

I'm going from an i5-3570K to a 3800X. Keeping the Vega 64 EKWB though; seems I've changed almost everything else. Can't wait to actually use it watercooled. Edit: I have a 6670 LP in it atm... best air-cooled card I have left.
You won't be disappointed. 🙂
Arbie:

"Overclocking is the weak spot for Ryzen." Why do you keep saying this? β€’ It begs the question that manual OC is even desirable. β€’ It ignores the fact that Ryzen "overclocks" itself just fine. β€’ It makes a strength appear to be a weakness. People will read this one sentence and conclude that Ryzen is seriously lacking. All the words after that won't matter.
I don't think people are going to conclude what you think based on that one sentence; people are a little smarter than that. I also think it comes down to him maybe not being a native English speaker. People get bent out of shape all the time by the wording of stuff he says, but he does a pretty good job of explaining what he means if you read the accompanying detail. Technically, he's not wrong either: those who like to tinker and are used to Intel chips might find the lack of OC headroom boring. I get what you're saying, though; "weak" might be the wrong word. You don't really need to fight in defense of AMD anymore either. They've proven themselves over the last couple of years, and those who don't see it aren't likely to magically jump on board because he changes the wording of a sentence. Some people are brand zealots and like to pick apart anything negative said about their brand of preference. Don't do that. Just buy what makes sense for you.