Intel Core i9-14900KS Already De-lidded, Overclocker Gets Temps Down by 10°C

Going to pass on the 14900KS as it does not fit my use case, PC gaming. I owned some 14900KFs and am now running a 14600K; even at default clocks the 14600K is just as fast as a 14900KF at 6200MHz in 4K gaming, and I know the 14900KS would be the same. 475 watts is the most I have seen on one of my 14900KF CPUs, and I am sure a 14900KS could beat that easily.
Yeah, the "lesser" CPUs are just as good for gaming as the top SKUs, and now that I don't use my PC for anything other than media and gaming, I no longer need these behemoths. Still, this one's going to be quite the performer until the next gen arrives.
Oh, that's fine then. All you have to do is delid your stupidly expensive CPU, with the risk of destroying it, and then it'll sip power at the economical rate of... 366 watts. That's fine!
mackintosh:

Yeah, the "lesser" CPUs are just as good for gaming as the top SKUs, and now that I don't use my PC for anything other than media and gaming, I no longer need these behemoths. Still, this one's going to be quite the performer until the next gen arrives.
Yup, people got brainwashed into buying these expensive CPUs with graphs of Cinebench and stuff like that. In reality there is very little difference in gaming performance between something like 14900K and 14600K. 5% at 1440p or something like that, and that's with a bloody 4090.
Glottiz:

Yup, people got brainwashed into buying these expensive CPUs with graphs of Cinebench and stuff like that. In reality there is very little difference in gaming performance between something like 14900K and 14600K. 5% at 1440p or something like that, and that's with a bloody 4090.
Don't even need a K model for gaming. But also don't get an F... cuz F if your GPU dies :P
Editing video on these processors would be nice, but the power consumption mentioned is not acceptable, and neither are the core temps.
I would rather see some official reviews before passing judgement. I mean, I can sort of guess what the power draw and temps are going to be based on the 14900K, but still.
Feels like the Pentium 4 days. High as hell clock speeds, 400w+ power consumption and DDR5 to beat my 5800x3d that draws 90-100w by what, 15-20 fps?
Agonist:

Feels like the Pentium 4 days. High as hell clock speeds, 400w+ power consumption and DDR5 to beat my 5800x3d that draws 90-100w by what, 15-20 fps?
I don't see mine going over 80W, but maybe that's because I have a -25 voltage offset using PBO2 Tuner, since that's the only way I can get higher clocks out of it. I think people forget what a joke Netburst was: it was utterly crushed by the Athlon 64s in just about everything, and there wasn't even a niche use case you could make a good argument for it in. IIRC they just abandoned that entire arch and used the Pentium M's arch as their new starting point, which was based on the last Pentium 3 arch. The last time I saw something that really reminded me of Netburst was Bulldozer.
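For anyone wondering what a -25 Curve Optimizer offset roughly translates to, here is a back-of-envelope sketch. The ~3-5 mV per CO count and the V²-at-fixed-clock power scaling are common rules of thumb, not figures from this thread:

```python
# Back-of-envelope for a negative Curve Optimizer offset.
# ASSUMPTIONS (rules of thumb, not measured in this thread): each CO
# count is worth roughly 3-5 mV, and dynamic power at a fixed clock
# scales with the square of the core voltage.

def co_offset_mv(counts, mv_per_count=(3, 5)):
    """Estimated voltage reduction range (mV) for a given CO count."""
    return tuple(counts * mv for mv in mv_per_count)

def power_scale(v_old, v_new):
    """Relative dynamic power at the same frequency, P ~ V^2."""
    return (v_new / v_old) ** 2

low_mv, high_mv = co_offset_mv(25)
print(f"-25 CO: roughly -{low_mv} to -{high_mv} mV")     # -75 to -125 mV

# e.g. 1.200 V dropping by ~100 mV at the same clock:
print(f"power scale: {power_scale(1.200, 1.100):.2f}x")  # ~0.84x
```

Less heat at the same clock is exactly why PBO then finds headroom to boost a little higher.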
My current PC has its end date set by Microsoft. Windows 10 support ends in 2025; my PC will have had a run of 10 years by then, with a graphics card upgrade for the last third of its lifespan. The next one will be all 5s: 15900K, RTX 5090, 64GB DDR5, and at least 8 TB of PCIe 5.0 storage.
Neo Cyrus:

I don't see mine going over 80W, but maybe that's because I have a -25 voltage offset using PBO2 Tuner, since that's the only way I can get higher clocks out of it. I think people forget what a joke Netburst was: it was utterly crushed by the Athlon 64s in just about everything, and there wasn't even a niche use case you could make a good argument for it in. IIRC they just abandoned that entire arch and used the Pentium M's arch as their new starting point, which was based on the last Pentium 3 arch. The last time I saw something that really reminded me of Netburst was Bulldozer.
I usually see 75W in CPU-heavy games, and full load is 95-105W. I also never see the 4550 boost, even at stock settings with a custom loop. But my 5800X3D runs very hot without the -30 offset.
(attached screenshot: Screenshot_54.png)
Bulldozer was absolutely Netburst 2.0. Prescott was not much better either. I remember my Athlon XP 2800 destroying the Prescott P4 2.66GHz PC we had. Half-Life 2 played so much better on my Athlon rig with the exact same 9800 Pro 128MB GPU.
Agonist:

I usually see 75W in CPU-heavy games, and full load is 95-105W. I also never see the 4550 boost, even at stock settings with a custom loop. But my 5800X3D runs very hot without the -30 offset. Bulldozer was absolutely Netburst 2.0. Prescott was not much better either. I remember my Athlon XP 2800 destroying the Prescott P4 2.66GHz PC we had. Half-Life 2 played so much better on my Athlon rig with the exact same 9800 Pro 128MB GPU.
That looks OK to me; cooling all the way down to 64°C takes a lot of cooling for a 5800X3D, and it would be very hard to get a 14900K down to those temperatures without a delid. My 7800X3D does 80W, 80°C, 4800MHz all-core at 1V with an air cooler.
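Rough numbers on the "needs a lot of cooling" point, using a crude lumped thermal model (T_die = T_ambient + P × θ). The 25°C ambient and the uniform-heat-spread assumption are mine, and stacked-cache X3D parts run worse than this simple model suggests:

```python
# Crude lumped thermal model: T_die = T_ambient + P * theta, where
# theta (C/W) is the total thermal resistance die -> IHS -> cooler -> air.
# ASSUMPTIONS: 25 C ambient, steady state, heat spread evenly over the
# die -- hotspot density on real chips makes things worse than this.

def required_theta(t_die_c, t_ambient_c, power_w):
    """Thermal resistance needed to hold t_die_c at a given power draw."""
    return (t_die_c - t_ambient_c) / power_w

# 5800X3D from the screenshot above: ~64 C at ~105 W full load
print(f"5800X3D: {required_theta(64, 25, 105):.2f} C/W")  # ~0.37 C/W

# A 14900K pulling the 366 W mentioned earlier, held to 90 C:
print(f"14900K:  {required_theta(90, 25, 366):.2f} C/W")  # ~0.18 C/W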
TLD LARS:

That looks OK to me; cooling all the way down to 64°C takes a lot of cooling for a 5800X3D, and it would be very hard to get a 14900K down to those temperatures without a delid. My 7800X3D does 80W, 80°C, 4800MHz all-core at 1V with an air cooler.
I do not agree, because it is not true, despite people saying the 14900K/KF are the hottest CPUs around. 14900KF, Cyberpunk, 1.34V-1.36V, 6000/4400 all-core, no delid, just an AIO. Temps in this video are approximately 50°C-60°C, from what I can see or remember from recording the video. [youtube=jptVgVqXjx8]
bobnewels:

I do not agree, because it is not true, despite people saying the 14900K/KF are the hottest CPUs around. 14900KF, Cyberpunk, 1.34V-1.36V, 6000/4400 all-core, no delid, just an AIO. Temps in this video are approximately 50°C-60°C, from what I can see or remember from recording the video.
He was talking about all cores at full load, at least that is how I understood it. The setup you are showing is at 20% usage; that setup is going to drop 500MHz on all cores when rendering, compressing/decompressing, or compiling shaders with 100% CPU load, because of thermal throttling.
TLD LARS:

He was talking about all cores at full load, at least that is how I understood it. The setup you are showing is at 20% usage; that setup is going to drop 500MHz on all cores when rendering, compressing/decompressing, or compiling shaders with 100% CPU load, because of thermal throttling.
Yeah, I am talking about all-core load.
Agonist:

I usually see 75W in CPU-heavy games, and full load is 95-105W. I also never see the 4550 boost, even at stock settings with a custom loop. But my 5800X3D runs very hot without the -30 offset.
That sounds right in line with mine. That max of 80W was in games; I'm sure it goes higher in, say, Cinebench. I use -25 instead of -30 because I thought I saw it squeezing out an extra few MHz in CB R23 over -30, this with a Noctua NH-D15 and the IF at 1900 (settings I no longer use, since I changed board/RAM). As for 4550MHz, I often see it, but only for a moment, literally 1 second at a time, on 1 core while the others are at 3 to 4.45. All-core it's usually 4.45GHz in games and light loads. At stock, all-core in CB23 I'd get something like 4.15-4.2GHz, something really low. The -25 offset makes that more like 4.35GHz+. That got me curious, so I did a quick run of R23 and saw all 8 cores bouncing between 4375-4425MHz with -25 in PBO2 and the IF down at 1833. That scored 14.8K, which I think is in the normal range. Anyway, there's not much we can do about its low clocks, and it doesn't matter much unless you're aiming for 240 fps. For 165 fps it can still hang on with the right settings.
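As a sanity check on those clocks and the 14.8K: all-core Cinebench tends to scale close to linearly with all-core clock on the same chip. A rough sketch, where the linear-scaling assumption is mine and the board/RAM/IF change muddies it:

```python
# Rough check: all-core Cinebench score scales ~linearly with
# all-core clock on the same chip and core count.
# ASSUMPTION: linear scaling; memory and fabric changes muddy this.

def scaled_score(score, clock_from_mhz, clock_to_mhz):
    """Estimate a score at a different all-core clock."""
    return score * clock_to_mhz / clock_from_mhz

# From the post: ~14.8K at roughly 4400 MHz all-core with -25 CO.
# Estimated score at the ~4175 MHz stock all-core clock:
print(f"stock estimate: ~{scaled_score(14800, 4400, 4175):,.0f}")  # ~14,000
```

So the -25 offset's ~5% clock bump is worth roughly 5% of score, which lines up with 14.8K sitting at the top of the normal range.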
TLD LARS:

He was talking about all cores at full load, at least that is how I understood it. The setup you are showing is at 20% usage; that setup is going to drop 500MHz on all cores when rendering, compressing/decompressing, or compiling shaders with 100% CPU load, because of thermal throttling.
Well, I guess I could argue all day with you, or say I have an idea of how these things work. Here are the 14 PC rigs for my personal use, e.g.:
- Ryzen 2600X 4400MHz with 1080 Ti [youtube=uF5FMGfNU7A]
- 3600X up to 4525MHz with RTX 2080 [youtube=iLG9EjOxkyA]
- 3600XT 4600MHz with RTX 3080 [youtube=4dVOSw7ei-0]
- 3800X all-core overclock 4625MHz with RTX 2080 [youtube=BCxOkyQDdz4]
- 3800XT up to 4650MHz with RTX 3080 [youtube=9RH7jhfT0Ww]
- 5600X 4860MHz / 5800X up to 5150MHz with RTX 3080 https://www.youtube.com/watch?v=Xc8kq8Wi8iQ&ab_channel=gerardfraser
- 7700X 5700MHz with RTX 3090 https://www.youtube.com/watch?v=u8w0-DgyxDk&ab_channel=gerardfraser
- 10850K with RTX 3080 https://www.youtube.com/watch?v=_Nqihk8fuHQ&ab_channel=gerardfraser
- 12600K with RTX 4090 https://www.youtube.com/watch?v=tkugfvDNzJk&ab_channel=gerardfraser
- 12900K 5300MHz with RTX 3090 https://www.youtube.com/watch?v=8frB7Pw0TdQ&ab_channel=gerardfraser
- 13700KF 6200MHz with RTX 3070 Ti https://www.youtube.com/watch?v=jPVkoRRdtCI&ab_channel=gerardfraser
- 14600K 4K up to 6000MHz, Cyberpunk 2077 https://www.youtube.com/watch?v=cvmx948VWso
- 14900KF BIOS settings with a simple undervolt/overclock up to 6200MHz https://www.youtube.com/watch?v=qS4lRaNqlas
bobnewels:

Well, I guess I could argue all day with you, or say I have an idea of how these things work. Here are the 14 PC rigs for my personal use, e.g. ... 14900KF BIOS settings with a simple undervolt/overclock up to 6200MHz [youtube=qS4lRaNqlas]
Only the last link is relevant when talking about the 14900K/KF/KS, so I am only going to comment on that one. This is not the same setup as the Cyberpunk 2077 gaming setup shown previously, but it is close enough. The CPU goes from 6200 to 5700MHz to prevent overheating and stays at 90-95°C when running Cinebench R23 on all cores. The voltage also falls from 1.4V to 1.2V when going from 6200 to 5700MHz. These results suggest that the 6000-on-all-cores Cyberpunk 2077 setup shown previously would also clock down to 5700 when running Cinebench R23, a 300MHz drop from 6000 to 5700. When not running one-shot Cinebench benchmarks but longer loads, it would be safe to say it would fall another 100MHz once the water heats up, a 400MHz drop in total. I predicted a 500MHz drop on the Cyberpunk 2077 benchmark setup when running Cinebench R23, and I was pretty close.
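For what it is worth, the classic dynamic-power approximation (P ∝ C·V²·f) backs this up: the 6200MHz/1.4V to 5700MHz/1.2V step cuts dynamic power by roughly a third, which is what lets the chip settle at 90-95°C instead of climbing further. A quick sketch, ignoring leakage (which is significant at 1.4V, so the real saving is probably larger):

```python
# Classic dynamic-power approximation: P ~ C * V^2 * f.
# ASSUMPTION: leakage ignored; it is significant at 1.4 V, so the
# real reduction is likely larger than this estimate.

def relative_power(v1, f1_mhz, v2, f2_mhz):
    """Power at (v2, f2) relative to (v1, f1), dynamic component only."""
    return (v2 / v1) ** 2 * (f2_mhz / f1_mhz)

# The throttle step described above: 6200 MHz @ 1.4 V -> 5700 MHz @ 1.2 V
ratio = relative_power(1.4, 6200, 1.2, 5700)
print(f"~{ratio:.2f}x the power, i.e. ~{(1 - ratio) * 100:.0f}% less")  # ~32% less
```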