Intel Core i9-10900K spotted overclocked at 5.4 GHz, including a Cinebench R15 benchmark score

https://forums.guru3d.com/data/avatars/m/234/234283.jpg
Core and northbridge at 5.4 GHz at 1.35 V is impressive, and done on an AIO, not LN2. It will be interesting to see temps with the tweaks done to the die/IHS and STIM.
data/avatar/default/avatar29.webp
The tweak allowed the processor to score ~3000 which is roughly a third faster than a 9900K at 2000 points
That's more like +50% πŸ™‚ (a third faster than 2000 points would only be ~2667).
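Quick check with the two scores as quoted above (a trivial sketch; both numbers come straight from the article):

```python
# Scores as quoted from the article above.
score_10900k, score_9900k = 3000, 2000

speedup = score_10900k / score_9900k - 1
print(f"Speedup over the 9900K: {speedup:.0%}")  # 50%
print(f"'A third faster' would be only ~{round(score_9900k * 4 / 3)} points")  # ~2667
```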
https://forums.guru3d.com/data/avatars/m/196/196426.jpg
Not bad, but CB15 finishes really quickly on all these 8+ core CPUs. Let's see how it handles longer tasks: CB20, or a real-life Blender render. On my R7 3700X it finishes in around 20 seconds with a score of 2150-2200. For a 10900K to get 3000 points, the run must have taken roughly 14 seconds. Not hard to cool that... I doubt it will be able to sustain those 5.4 GHz for minutes without cooking itself alive.
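Back-of-the-envelope, assuming the CB15 score is inversely proportional to render time (it renders the same fixed scene; the midpoint of my 2150-2200 range is used here):

```python
# Estimating the 10900K's CB15 run time from the 3700X numbers above,
# assuming score ~ 1 / render_time for the same fixed scene.
r7_time_s, r7_score = 20.0, 2175   # 3700X: ~20 s run, midpoint of 2150-2200
target_score = 3000                # the leaked 10900K result

est_time_s = r7_time_s * r7_score / target_score
print(f"Estimated 10900K run time: {est_time_s:.1f} s")  # ~14.5 s
```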
data/avatar/default/avatar07.webp
Much more interesting will be the actual power draw required.
data/avatar/default/avatar14.webp
TDP 125 W, ahahahaha πŸ™‚
https://forums.guru3d.com/data/avatars/m/269/269253.jpg
Well, it seems really impressive at first glance. But if you compare the stock 9900K (8 cores) to the stock 10900K (10 cores), the results (2067 vs. 2347) show only a 13.5% improvement, with 25% more cores! (Or did I miss something while looking at that chart?) So I would wait for real-world benchmarks to make a judgement...
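You can spell out the arithmetic with the chart's own numbers (a minimal sketch; both scores are the stock results listed above):

```python
# Checking the stock-vs-stock scaling complaint with the chart's numbers.
score_9900k, score_10900k = 2067, 2347   # stock CB15 scores from the chart
cores_9900k, cores_10900k = 8, 10

observed_gain = score_10900k / score_9900k - 1
ideal_score = score_9900k * cores_10900k / cores_9900k  # perfect core scaling

print(f"Observed gain: {observed_gain:.1%}")                         # ~13.5%
print(f"Perfect 10-core scaling would be ~{ideal_score:.0f} points")  # ~2584
print(f"Per-core throughput vs 9900K: {score_10900k / ideal_score:.0%}")  # ~91%
```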
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
wavetrex:

I doubt it will be able to sustain those 5.4 GHz for minutes without cooking itself alive.
It'll do just fine on the Intel stock cooler. The stock cooler is the large industrial chiller we have seen before.
https://forums.guru3d.com/data/avatars/m/270/270288.jpg
This gen will give AMD a good run for its money. This would be a good option to rival the 3600X, as the i5 variant will, for the first time in history, have 6 cores and 12 threads. At the same time, it might be short-lived if AMD drops their new line. We don't know if AMD's new line will be that much better; I don't think it will, but that's pure speculation.
https://forums.guru3d.com/data/avatars/m/234/234283.jpg
Sixtyfps:

This gen will give AMD a good run for its money. This would be a good option to rival the 3600X, as the i5 variant will, for the first time in history, have 6 cores and 12 threads. At the same time, it might be short-lived if AMD drops their new line. We don't know if AMD's new line will be that much better; I don't think it will, but that's pure speculation.
I don't think AMD is doing their launch till fall, so that's a ways away.
https://forums.guru3d.com/data/avatars/m/270/270008.jpg
I wonder if this was cooled by water, a chiller, or liquid nitrogen? No way this was done on air or an all-in-one. I assume we are talking 300-350 W of actual power draw, maybe a little more.
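For what it's worth, a rough V²·f scaling sketch lands in that range. The stock baseline here is an assumption (Intel's published 250 W PL2 at an assumed ~4.8 GHz all-core and ~1.25 V), not something from the article:

```python
# Rough sanity check on the 300-350 W guess, using the classic
# dynamic-power scaling P ~ V^2 * f. Baseline operating point is assumed.
base_power_w = 250.0                  # Intel's PL2 for the 10900K
base_freq_ghz, base_volt = 4.8, 1.25  # assumed stock all-core turbo point
oc_freq_ghz, oc_volt = 5.4, 1.35      # from the leaked screenshot

oc_power_w = base_power_w * (oc_volt / base_volt) ** 2 * (oc_freq_ghz / base_freq_ghz)
print(f"Estimated OC package power: {oc_power_w:.0f} W")  # ~328 W
```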
data/avatar/default/avatar17.webp
I can already see the headlines in the news: "...And that is how you melt the ice caps completely, ladies and gentlemen." "In this week's guide: How to set up a real-life volcano in your PC case."
data/avatar/default/avatar14.webp
I keep clicking these threads to see the ridiculous AMD vs Intel slap fights. Strangely, everyone talks about TDP like it matters; gamers don't give a sweet crap about TDP beyond any effect it may have on heat generation and overclocking headroom. Why aren't more people talking about PCIe 4.0 vs 3.0 support between the two platforms? I feel like that's an actual real difference. I can't see any reason why it matters NOW, but I would think within a couple of years PCIe 4.0 will come into play, will it not?
https://forums.guru3d.com/data/avatars/m/34/34795.jpg
Guys, remember that this is likely just a golden sample. With the silicon lottery, what you actually end up purchasing might not even be stable at 5.4 GHz, regardless of the voltage you throw at it.
https://forums.guru3d.com/data/avatars/m/196/196426.jpg
Stryfex:

about TDP like it matters
Because it does? Or maybe you are one of those who likes hearing an F-22 fighter jet taking off inside your computer. I'll stick to the whisper-quiet, easily cooled 65 W TDP of my 3700X with a Noctua fan on it, which doesn't even break a sweat, thank you!
https://forums.guru3d.com/data/avatars/m/234/234283.jpg
Seketh:

Guys, remember that this is likely just a golden sample. With the silicon lottery, what you actually end up purchasing might not even be stable at 5.4 GHz, regardless of the voltage you throw at it.
People always think this, but it has been widely debunked as a conspiracy theory. There are many examples of people getting great overclocks on retail chips.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Stryfex:

I keep clicking these threads to see the ridiculous AMD vs Intel slap fights. Strangely, everyone talks about TDP like it matters; gamers don't give a sweet crap about TDP beyond any effect it may have on heat generation and overclocking headroom.
I wouldn't say so. When you look back at the GPU arena, it's clear that Nvidia's architectural reforms to majorly lower power consumption played a big role in their success. Sure, it was just a single piece of the puzzle, as they got many other factors right as well, but myriad people pointed it out time after time as a strength against AMD's GCN tech, which was a guzzler. This is obviously gamers talking, even more so than with CPUs, as basically only gamers need strong GPUs in large numbers in the consumer market. Miners need them as well, but for them the wattage matters even more because it directly affects the profit margin.
Stryfex:

Why aren't more people talking about PCIe 4.0 vs 3.0 support between the two platforms? I feel like that's an actual real difference. I can't see any reason why it matters NOW, but I would think within a couple of years PCIe 4.0 will come into play, will it not?
It hasn't really got anything to do with the GHz. At least I doubt it does, unless the high clocks caused the signal noise that prevented Intel from enabling PCIe 4.0 on the 10th generation. It has been addressed in the relevant threads, though.
https://forums.guru3d.com/data/avatars/m/248/248902.jpg
intelflex
https://forums.guru3d.com/data/avatars/m/270/270008.jpg
@nizzen Your screenshot is showing the cores pulling 106.73 W, which is pretty much dead on TDP. I'm sure it takes more to feed the VRMs, and there are some losses etc., but what is needed to cool the CPU, aka the TDP, is about dead on the 125 W rating.
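A quick sketch of how that screenshot figure sits against the rated TDP (106.73 W from the screenshot above, 125 W from Intel's spec sheet; keep in mind TDP is defined at base clock, and Intel's published PL2 allows bursts up to 250 W):

```python
# Comparing the reported core power against the rated TDP.
core_power_w = 106.73   # core-only draw from the screenshot above
rated_tdp_w = 125.0     # Intel's rated TDP for the i9-10900K

print(f"Headroom under rated TDP: {rated_tdp_w - core_power_w:.2f} W")  # 18.27 W
print(f"Core power as share of TDP: {core_power_w / rated_tdp_w:.0%}")  # ~85%
```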
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Stryfex:

Strangely, everyone talks about TDP like it matters; gamers don't give a sweet crap about TDP beyond any effect it may have on heat generation and overclocking headroom.
The wattage is largely irrelevant, but managing to cool this is a whole other story. 10 cores at 5.4 GHz is definitely going to be difficult to keep cool.
Why aren't more people talking about PCIe 4.0 vs 3.0 support between the two platforms? I feel like that's an actual real difference. I can't see any reason why it matters NOW, but I would think within a couple of years PCIe 4.0 will come into play, will it not?
PCIe 4.0 (or even 5.0) devices will likely be more common in the next few years, but so far, very few things outside of synthetic benchmarks demand more than the bandwidth of 3.0, whether GPUs or SSDs. However, perhaps the RTX 3000 series might be bottlenecked by PCIe 3.0 at x16. PCIe 4.0 is more appealing for servers at the moment.
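For context, the raw per-direction numbers behind the 3.0 vs 4.0 debate (a minimal sketch using the standard 8 GT/s and 16 GT/s per-lane rates and 128b/130b encoding):

```python
# Usable per-direction PCIe bandwidth: rate per lane, corrected for
# 128b/130b line encoding, converted from Gb/s to GB/s, times lane count.
def pcie_bandwidth_gbps(rate_gt_s: float, lanes: int) -> float:
    return rate_gt_s * (128 / 130) / 8 * lanes

for gen, rate in [("3.0", 8.0), ("4.0", 16.0)]:
    print(f"PCIe {gen} x16: {pcie_bandwidth_gbps(rate, 16):.1f} GB/s, "
          f"x4 (NVMe): {pcie_bandwidth_gbps(rate, 4):.2f} GB/s")
# PCIe 3.0 x16: 15.8 GB/s, x4 (NVMe): 3.94 GB/s
# PCIe 4.0 x16: 31.5 GB/s, x4 (NVMe): 7.88 GB/s
```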
jwb1:

People always think this, but it has been widely debunked as a conspiracy theory. There are many examples of people getting great overclocks on retail chips.
Conspiracy or not, the prices still suggest they're better-binned chips.