Ryzen 7 1800X Overclocked to 5.8GHz, Breaks Cinebench R15 World Record at 5.36GHz

Regardless of which CPU performs best in games, AMD has created a fantastic processor and can now come close to competing with Intel again. That means there are now two reasonable choices, which is great news for everyone.
You didn't fix it properly. You forgot to add... only if you play games at 640x480. Or was that on purpose?
Ryzen framerates are lower at 1080p too.
Yeah, it makes so much sense to get a GTX 1080 and play at 1080p. I'll just spend $650 to play on my $100 monitor @1080p. Yeah. Logic, bitch. It really doesn't matter what processor you have in that case. I think an i5 from 2012 would be just fine.
Ermm, you forgot about 144Hz 1080p monitors; a GTX 1080 is a good card for those. I have a GTX 1070 for 144Hz 1080p and, to be honest, I have to turn down some settings in Titanfall 2 to achieve a consistent 120-144fps in that game.
Yes, conveniently ignore the existence of 1080p 144Hz monitors and the latest 240Hz monitor while propagating the myth that AMD's Ryzen gaming performance is subpar (relative to what Intel is cranking out) ONLY at 1080p, as if it's the resolution and not the higher framerate accompanying it. Hint: with multi-GPU, faster single GPUs in the future, and games with high framerates such as CS:GO, the difference will be more pronounced (drastic in the case of CS:GO; go and check out the benchmarks). Now, if you come back and tell me "but hey, 300 FPS vs. 450 FPS, both are ridiculously high", then I'm telling you beforehand that you did not get what's going on.
300 FPS vs 450 FPS in CS:GO was the deal breaker for me, and I can only assume that was tested on 64 tick servers. ESEA's 128 tick servers give even lower performance than that.
It is really outrageous the way people judge a new CPU with current games. They often fail to see that, with such CPUs pushed to mainstream users, game developers will be able to design games to work with more cores.
Very true, but let's be honest, that was the same speculation when the FX CPUs came out. Imo, Ryzen would destroy current Intel CPUs in gaming if said games used 8 or more cores, no doubt about that. But that's not the case, and Intel has a slight edge over AMD's best in gaming.
It is really outrageous the way people judge a new CPU with current games. They often fail to see that, with such CPUs pushed to mainstream users, game developers will be able to design games to work with more cores. There is also another thing: up to now, coding for multiple cores has been a nightmare; thus, more optimized game engines are needed. All in all, the new AMD CPUs are simply game changers. Despite the fact that I love Intel products, to me this means CPUs will be a lot cheaper, and 2018 will bring interesting changes.
No it won't. All these chips have done is bring more cores at a cheaper price, but at the end of the day that doesn't change the fact that 4-core/8-thread, 6-core/12-thread and 8-core/16-thread CPUs from Intel have been around for 3, maybe 4 years now, some at a price that isn't much more than the current top-end Ryzen. Has THAT changed games? No, not really; the best you'll find from any game of the last 5 years is 6 threads, though I am led to believe that Doom uses whatever it can get its hands on. So if game developers haven't bothered to change their game development to take advantage of a CPU that has been readily available for the last 5 years, why are they suddenly going to change just because a 'cheaper' CPU with extra cores has appeared on the market? Even worse, you also have to take into account that this new CPU has to establish itself in a market long held by Intel. I suspect that a lot of folk are clutching at straws: everyone wanted this chip to appear on the scene and just blow what Intel has away, and it really hasn't done that. It's good in some areas and worse in others, and the best-case scenario is everyone agreeing that, depending on what you want to do with your PC, you now have a viable alternative to Intel. But some folk just don't want to accept that; they're clutching at developers suddenly making their game engines better, developing for better multi-threading, developing their games so they work better with the new chip. Given that game developers haven't done this so far with an established CPU market, why on earth would they suddenly funnel huge amounts of time and money into doing it for a CPU that has yet to establish its place in the market?
There was no talk of the framerate required for CS:GO to perform smoothly. That sentence revolved around the huge difference when high framerates were involved, indicating a bottleneck of sorts (one that I hope will be sorted out very soon). As for that game engine you speak badly of, CS:GO is one of the games with the lowest input latency at framerates equivalent to other games. 300FPS is more in line with keeping that latency as low as possible, since it can run so easily on any gaming machine. Higher FPS always makes more sense, even if it's much higher than your refresh rate - frametimes go down, variance goes down, input latency goes down, and the game becomes more fluid than simply running at your refresh rate (e.g. 60 / 144 / 165 / 240FPS). High framerates are what Fast Sync leverages: the more frames you're rendering, the less jitter Fast Sync introduces (there are so many frames that the deltas are much smaller). Also, if your minimum framerate is slightly higher than your refresh rate, with VSync off you get jitter (try playing at 61-62FPS on 60Hz, for example) and tearing, and with Fast Sync you get (somewhat worse) jitter and no tearing. If you're letting your framerate go over your refresh rate, it had better be a lot higher (ideally a multiple of your refresh rate).
Higher framerate and lower latency are always good. I don't quite understand why some people are dismissive of either. I figure they don't know or they've forgotten what it was like to play with CRTs at high refresh rate and PS/2 keyboard and mouse 🤓.
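The arithmetic behind those two posts is easy to sketch. A minimal, illustrative Python snippet (all numbers are hypothetical examples, not measurements from any benchmark above):

```python
# Frametime arithmetic behind the "higher FPS is always better" point:
# illustrative numbers only, not measurements.

def frametime_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

for fps in (60, 144, 300, 450):
    print(f"{fps:4d} fps -> {frametime_ms(fps):5.2f} ms per frame")

# With VSync off, a framerate just above the refresh rate drifts against
# it slowly: at 61 fps on a 60 Hz panel the beat period is 1/(61-60) = 1 s,
# which shows up as a slow, visible judder cycle.
beat_period_s = 1 / (61 - 60)
print(f"judder cycle at 61 fps on 60 Hz: {beat_period_s:.1f} s")
```

Going from 60 to 300 fps cuts the frametime from ~16.7 ms to ~3.3 ms, which is where the lower input latency and smaller frametime variance come from.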
Even though Ryzen is not topping the max fps an i7 7700K can hit, it's not far behind, nor is its gaming performance terrible. This is far from the Bulldozer fail. This is a competitive CPU with some of the childhood diseases every new platform has. I believe that Ryzen will mature far better into the future than any 4-core Intel is offering. We must also take into account that Ryzen is way more useful in things besides gaming. And even in gaming it's useful when you multitask alongside the game, like streaming, or running something in the background. Unfortunately, today we test and judge something only by the max fps number, forgetting everything else. I've spent all these days watching and reading reviews from every possible source, be it big and well-known reviewers but also lesser-known ones. Ryzen is a competitive product and offers great value. If AMD did something wrong this time, it's that they released the product without waiting for some basic patching and more stable BIOSes. They should have waited a bit and released later with better BIOSes and the Windows patch. It's bad when you release something and then promise fixes via patching. It doesn't sound good to the average potential buyer, plus the product is not getting the review scores it could get.
LN2 overclocking. Strange worlds.
It's that 0.8 Volt @ 5.8 GHz reading that's interesting.
Yeah,undervolt and then push the limit with LN2.Kind of weird,but useful in OC. or "Overclocking the base clock (BCLK) on AM4 platform is possible, however generally not recommended. This is due to its frequency relations with other interfaces, such as the PCIe. Unlike with Intel's more recent CPUs, there is no asynchronous mode (straps / gears) available, which would allow stepping down the PCIe frequency at certain intervals. The PCIe frequency relation is fixed and therefore it increases at the same rate with the BCLK. Gen. 3 operation can generally be sustained up to ~107MHz frequency and higher speeds will usually require forcing the links to either Gen. 2 or to Gen. 1 modes. Unstable PCIe can cause various issues, such as system crashes, data corruption (M.2 SSDs), graphical artifacts and various kinds of other undefined behavior."
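The fixed relation in that quote can be sketched numerically. A rough Python illustration; only the 1:1 BCLK-to-PCIe coupling and the ~107 MHz Gen 3 ceiling come from the quoted text, and the function names are made up for the sketch:

```python
# Sketch of the fixed BCLK -> PCIe frequency relation on AM4 described in
# the quote above: PCIe runs 1:1 with BCLK (no asynchronous straps/gears),
# so raising BCLK overclocks the PCIe links too. The ~107 MHz Gen 3
# ceiling is taken from the quote; helper names are invented.

GEN3_STABLE_LIMIT_MHZ = 107.0  # rough Gen 3 ceiling from the quote

def pcie_freq_mhz(bclk_mhz: float) -> float:
    # No async mode on AM4: PCIe tracks BCLK 1:1.
    return bclk_mhz

def suggested_pcie_gen(bclk_mhz: float) -> str:
    if pcie_freq_mhz(bclk_mhz) <= GEN3_STABLE_LIMIT_MHZ:
        return "Gen 3 usually sustainable"
    return "force Gen 2 (or Gen 1) for stability"

for bclk in (100.0, 105.0, 107.0, 110.0, 115.0):
    print(f"BCLK {bclk:5.1f} MHz -> PCIe {pcie_freq_mhz(bclk):5.1f} MHz: "
          f"{suggested_pcie_gen(bclk)}")
```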
Weird that G.Skill are selling RAM kits that require raising the BCLK to achieve their advertised speed then (3200MHz and 3466MHz).
At the current stage of beta BIOSes from motherboard vendors, it's hard to raise the BCLK and stay stable. What ratio do they recommend?
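For illustration: if the top memory divider on these early BIOSes corresponds to DDR4-3200 at the stock 100 MHz BCLK (an assumption on my part, not a confirmed spec), the BCLK a 3466 kit would need falls out of simple arithmetic:

```python
# Effective DRAM speed on AM4 is BCLK x memory ratio. Assuming the
# highest exposed divider gives DDR4-3200 at the stock 100 MHz BCLK
# (an assumption for this sketch, not a confirmed spec), a 3466 MT/s
# kit can only reach its rating by raising BCLK.

MAX_RATIO = 3200 / 100.0  # assumed top memory divider: 32x

def dram_speed(bclk_mhz: float, ratio: float) -> float:
    """Effective DDR transfer rate in MT/s."""
    return bclk_mhz * ratio

def bclk_needed(target_mts: float, ratio: float = MAX_RATIO) -> float:
    """BCLK required to reach a target DDR rating with a given ratio."""
    return target_mts / ratio

print(f"DDR4-3466 via the 32x divider needs BCLK ~{bclk_needed(3466):.1f} MHz")
```

Under that assumption, a 3466 kit needs roughly 108 MHz BCLK, which would also put the PCIe clock above the ~107 MHz Gen 3 comfort zone quoted earlier; that fits with the stability complaints.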
I fixed your message so people could better understand what you're trying to say. No need to thank me!
Yes, for the next 3-6 months Intel cpus *may* see better gaming performance, until devs optimize their games for Ryzen--then it's "game over" for Intel iXXX. (Fixed it for you, don't thank me...;)) Seriously, how could anyone not know that? Ryzen's raw performance is better than Intel's right now--known fact. Game devs have optimized for Intel cpus for the past several years--when Ryzen is optimized for, that will be it...;) Same exact thing happened when the Athlon was first introduced--it happened with Intel's brand-new architectures, too. Prepare yourself, Ryzen's true game performance is just ahead...;)
I hope you're right, because this would really kick the CPU market up into another gear, and Intel would have to fight to compete. Thing is, only time will tell if this turns out to be the truth. In truth, I think any gamer with a 60Hz monitor of any resolution can rest easy that a Ryzen CPU is a good buy; it's just the folks who want, say, 90 or 100fps+ on 144Hz monitors who are better off sticking with Intel for now. In fact, if I was building a new rig with a 60 or 75Hz monitor I'd for sure use a Ryzen CPU - cheaper & more future-proof because of those 8 cores, plus it would be awesome for non-gaming stuff too.
Meh, I can live with a few fps less on the AMD platform. As it stands now, Ryzen can pretty much compete in every benchmark unless it's gaming, and even then it's only a few fps. You get a lot more with the Zen CPU! It's going to be the 1700 for me; I can't afford to drop more on a CPU.
The CPU scored 2454 points in the Cinebench R15’s multi-threaded test, breaking the previous world record by 9 points which was achieved with an Intel Core i7 5960X overclocked to 6GHz. This is pretty impressive considering the fact that Ryzen 7 1800X is clocked at over 600MHz less than the i7 5960X... [snip] ...the i7 5960X also has the advantage of featuring quad-channel DDR4 memory compared to Ryzen’s dual channel.
This is quite amazing, let's see how Intel responds.
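The numbers in that quote allow a rough per-clock comparison: 2454 points at 5.36 GHz for the 1800X vs. the previous record of 2454 - 9 = 2445 at 6.0 GHz for the 5960X, both 8-core/16-thread parts. A quick Python sketch, nothing more:

```python
# Per-clock comparison of the two record runs, using only the figures
# quoted above (both chips are 8C/16T, so cores cancel out).

runs = {
    "Ryzen 7 1800X": {"ghz": 5.36, "score": 2454},
    "Core i7-5960X": {"ghz": 6.00, "score": 2454 - 9},  # previous record
}

for name, r in runs.items():
    per_ghz = r["score"] / r["ghz"]
    print(f"{name}: {r['score']} pts at {r['ghz']} GHz "
          f"-> {per_ghz:.1f} pts/GHz")
```

Roughly 458 vs. 408 points per GHz, and that's with the 5960X enjoying the quad- vs. dual-channel memory advantage the quote mentions.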
[youtube]ylvdSnEbL50[/youtube] just going to leave this here.