Intel Core i9-14900KS Performance to 6.4 GHz with MSI Firmware Update

It's fun how Intel needs to go above 6 GHz to still be competitive... their 10nm tech has been haunting them since Ryzen was launched. I know, I know, people will say I'm wrong... it's OK, but it sure is fun to watch the show 😀 We will have an 8 GHz 10nm 17900KS with a power consumption of 1 kW (peak) and everything will be fine 😀 Feels like the P4 days, when they were smoking! (hot)
Zoom, Zoom...Fast!
bernek:

It's fun how Intel needs to go above 6 GHz to still be competitive... their 10nm tech has been haunting them since Ryzen was launched. I know, I know, people will say I'm wrong... it's OK, but it sure is fun to watch the show 😀 We will have an 8 GHz 10nm 17900KS with a power consumption of 1 kW (peak) and everything will be fine 😀 Feels like the P4 days, when they were smoking! (hot)
Yea... The 7800X3D kicked ass and took names! If you're a PC gamer, any X3D from AMD is going to be faster....
bernek:

It's fun how Intel needs to go above 6 GHz to still be competitive... their 10nm tech has been haunting them since Ryzen was launched. I know, I know, people will say I'm wrong... it's OK, but it sure is fun to watch the show 😀
The 14900K smokes the 7800X3D in productivity workloads, and it can easily beat the 7800X3D in gaming with some memory tuning. Gaming is mostly sensitive to memory timings and cache, and because the 7800X3D has a humongous L3 cache, the 14900K needs manually tuned RAM timings where the latency is in the low-to-mid 50s (nanoseconds) to be capable of beating the 7800X3D. It's not hard to achieve; you just need the gear and the know-how. To illustrate what I am saying, here is a stock 14900K with moderately tuned memory edging out a 7800X3D with moderately tuned memory. The 14900K picks up significantly more performance from the memory tuning than the 7800X3D, because the 7800X3D already has that massive L3 cache: CPU-Giganten im RAM-OC-Vergleich: Core i9-14900K gegen Ryzen 7 7800X3D (pcgameshardware.de)
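For context on those latency numbers, the first-word (CAS) portion of memory latency follows directly from the rated speed and CL. A minimal sketch (the kits listed are hypothetical examples; measured end-to-end latency in a tool like AIDA64 also includes controller and fabric overhead, which is how tuned setups land in the 50 ns range rather than near 10 ns):

```python
# Rough first-word (CAS) latency in nanoseconds from DDR speed and CL.
def cas_latency_ns(mt_per_s: float, cl: int) -> float:
    # DDR transfers twice per clock, so the I/O clock in MHz is MT/s / 2.
    clock_mhz = mt_per_s / 2
    return cl / clock_mhz * 1000  # cycles / MHz -> ns

# Hypothetical kits for illustration only:
for label, mts, cl in [("DDR5-6400 CL32", 6400, 32),
                       ("DDR5-8000 CL38", 8000, 38),
                       ("DDR5-7400 CL34", 7400, 34)]:
    print(f"{label}: {cas_latency_ns(mts, cl):.1f} ns")
```

Note how a faster kit with looser CL can still end up with lower absolute CAS latency, which is why timings matter as much as frequency.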
Carfax:

The 14900K smokes the 7800X3D in productivity workloads, and it can easily beat the 7800X3D in gaming with some memory tuning. Gaming is mostly sensitive to memory timings and cache, and because the 7800X3D has a humongous L3 cache, the 14900K needs manually tuned RAM timings where the latency is in the low-to-mid 50s (nanoseconds) to be capable of beating the 7800X3D. It's not hard to achieve; you just need the gear and the know-how. To illustrate what I am saying, here is a stock 14900K with moderately tuned memory edging out a 7800X3D with moderately tuned memory. The 14900K picks up significantly more performance from the memory tuning than the 7800X3D, because the 7800X3D already has that massive L3 cache: CPU-Giganten im RAM-OC-Vergleich: Core i9-14900K gegen Ryzen 7 7800X3D (pcgameshardware.de)
Not that the numbers wouldn't be in Intel's favour, but... it needs more expensive RAM, uses a lot more power (both CPU and RAM), gets hotter, and the test runs 8000 MT/s RAM vs. 6400 MT/s RAM. All for a massive 3 fps on average and a substantial 0.8-point lead (100 vs. 99.2) in their comparison. Just saying that at first look, it looks like a botched comparison. But an interesting read nonetheless. It's a CPU for those who want the top of the top at any price and cost. If you are not exactly looking for that, buying decisions might look different than a 14900xx. Gaming, that is, because for productivity I wouldn't buy a 7800X3D, but that should be clear anyway.
I love all the justifications for how the 14900K can beat the 7800X3D by sucking a lot more power and tweaking the memory :P Good to know.
fantaskarsef:

Not that the numbers wouldn't be in Intel's favour, but... it needs more expensive RAM, uses a lot more power (both CPU and RAM), gets hotter, and the test runs 8000 MT/s RAM vs. 6400 MT/s RAM. All for a massive 3 fps on average and a substantial 0.8-point lead (100 vs. 99.2) in their comparison. Just saying that at first look, it looks like a botched comparison. But an interesting read nonetheless.
From my experience tuning Raptor Lake, the memory frequency isn't nearly as important as the memory timings. I would estimate that the memory latency of the setup used in that PCGamesHardware.de review is probably in the mid-to-upper 50s (nanoseconds) at best, because the timings are not that tight, to be honest. My DDR5-7400 with tighter timings would likely outperform that rig. At any rate, my point was that the 14900K can easily be made to go faster in gaming, to the point where it outperforms the 7800X3D. Does that mean it's a better gaming chip? No, definitely not. The 7800X3D is indisputably the best gaming chip right now, but it's not the fastest. As for the 7800X3D running DDR5-6400, it's because it needs the memory to run at a 1:1 ratio to achieve the highest performance. Running the memory above 6400 MT/s would lower performance on the AMD rig.
Gaming, that is, because for productivity I wouldn't buy a 7800X3D, but that should be clear anyway.
But people continuously make comparisons between the 7800X3D and the 14900K as though they are competitors. The former has 24 cores (32 threads with HT enabled) while the latter has only 8. The actual competitor to the 14900K would be the 7950X3D, which was slower in gaming than the 14900K last time I checked.
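The DDR5-6400 point above can be illustrated with a toy sketch, assuming the commonly described AM5 behaviour where the memory controller holds a 1:1 UCLK:MEMCLK ratio up to roughly 6400 MT/s and falls back to 1:2 above that (the cutoff is an assumption here; actual limits vary per chip):

```python
# Toy model of the assumed AM5 ratio behaviour (illustrative, not a spec):
# MEMCLK is half the transfer rate; above the assumed ~6400 MT/s limit the
# controller drops to a 2:1 divider, which adds latency.
def uclk_mhz(mt_per_s: int, max_1to1: int = 6400):
    memclk = mt_per_s // 2
    if mt_per_s <= max_1to1:
        return memclk, "1:1"
    return memclk // 2, "1:2 (latency penalty)"

for speed in (6000, 6400, 7200):
    clk, ratio = uclk_mhz(speed)
    print(f"DDR5-{speed}: UCLK {clk} MHz at {ratio}")
```

This is why a nominally faster DDR5-7200 kit can perform worse on such a platform: the controller clock halves, which outweighs the bandwidth gain in latency-sensitive games.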
Undying:

I love all the justifications for how the 14900K can beat the 7800X3D by sucking a lot more power and tweaking the memory 😛 Good to know.
Well, you guys are the ones making false comparisons between a gaming CPU and a productivity CPU. Productivity CPUs are naturally going to consume more power than a gaming CPU with far fewer cores.
No experience here in tuning much of anything recently, but I do believe you; it sounds very reasonable. I bought my 7800X3D precisely to get the better overall package for my taste and expected performance. While it's not the fastest even in gaming (see your link), I did not want to invest too much time and effort into overclocking for now. Less power and heat also mattered to me. And less tuning on the RAM, together with my simple want (irrational preference) to go with AMD this time around, well knowing I might lose out on a few % of the top results (well, less than 5, I hoped 😀). And I'm doing literally zero productivity work at home besides office stuff, which isn't significant to any decision, really.
Carfax:

As for the 7800X3D running DDR5-6400, it's because it needs the memory to run at a 1:1 ratio to achieve the highest performance. Running the memory above 6400 MT/s would lower performance on the AMD rig.
Yeah, I know. It made buying cheap RAM quite easy, because why spend extra 😀
But people continuously make comparisons between the 7800X3D and the 14900K as though they are competitors. The former has 24 cores (32 threads with HT enabled) while the latter has only 8. The actual competitor to the 14900K would be the 7950X3D, which was slower in gaming than the 14900K last time I checked.
This is a good point, in my opinion. It comes down to carefully judging the use cases and expectations, and then choosing the best option. For me, at 99% gaming, it's quite easy.
LOL, what a joke, yet there will be people who buy this furnace. It's simply a desperate attempt to stay relevant against AMD.
How do you cool something like this??? 😱 Are the best liquid coolers on the market a match for this thing?
fantaskarsef:

This is a good point, in my opinion. It comes down to carefully judging the use cases and expectations, and then choosing the best option. For me, at 99% gaming, it's quite easy.
You made the right decision. The 7800X3D is the best gaming CPU at the moment, hands down! 😉
Idk how anyone can defend the 14900K. At least for me here in Canada, it's a no-brainer to get the 7800X3D: it's $500 CAD vs. $930 for the 14900K, so even ignoring all performance metrics, the 14900K doesn't make any sense whatsoever if gaming is your priority.
motoman26:

LOL, what a joke, yet there will be people who buy this furnace. It's simply a desperate attempt to stay relevant against AMD.
Stop with this "furnace" thing already.... Sadly, all top CPUs today (AMD and Intel) are "furnaces" :( If you want a top-dog system today, it's like walking in a nightmarish world: flames, hot as hell, green monsters trying to eat your wallet.... I'll stay with my 5950X until I find out that it can't cope with some workloads, or until AMD and Intel come up with a solution for their next top "furnaces" - other than undervolting. https://cdn.thefpsreview.com/wp-content/uploads/2022/11/temperature_7950x.png
Location requirement: around the Arctic Circle. Power requirement: power plant close by. Sarcasm mode: OFF. @barbacot (ignoring for a moment that I really like people who tell others to "stop" something while not being a partner/parent/admin/mod or similar) the facts don't change just because we don't like them. The Intel isn't purely advertised for productivity loads, so any OC is still aimed at gaming comparisons/benches/reviews. And that AMD runs "hot" comes from the fact that it's designed to run at those temps, with the cooling provided being the main "perf limit", not so much power; the previous gens weren't running as hot, with the 5950X running almost 30% cooler, and the 7xxx series being a little more sensitive to which cooler is used.
Carfax:

Well you guys are the ones making false comparisons between a gaming CPU and a productivity CPU.
I wouldn't consider the 14900K a productivity CPU. It's too power hungry - it would thermal throttle a little too quickly - and being a K-series CPU, it's more meant for enthusiasts than workstations. I also don't think the 7800X3D is comparable, due to it being over $100 cheaper. The 7900X is a more apples-to-apples comparison: overall a little slower, but it also uses roughly 80 W less power in real-world tests. IIRC, the 14900K doesn't have AVX-512 either. 80 W makes a significant difference in terms of cooling performance, whether that means maintaining clock speeds or lowering fan noise.
Productivity CPUs are naturally going to consume more power than a gaming CPU with far fewer cores.
That's not really an accurate statement for modern CPUs - it's more a matter of how many threads are in use. Each additional core only contributes a few extra watts of consumption if it's idle. Since the average game tends to use fewer than 12 threads, the extra cores of the 14900K won't really make a big difference to power draw compared to something like a 14600K with similar clock speeds; cache size would likely account for the greatest difference in power consumption there. So, since a 7800X3D and a 14900K would be using the same thread count for the same game, it's not a good sign if the 14900K uses significantly more power despite not handily winning.

For what it's worth: y'know what ARM did to achieve more performance without increasing the power envelope? They added more cores. Power consumption goes up disproportionately with frequency, but because ARM achieves nearly 0 W for idle cores, they could proportionately increase performance by increasing core count. Efficiency is a top priority for ARM, so it makes more sense for them to double the power consumption with double the cores than to double the power consumption with an overclock that falls short of doubling performance. However, ARM didn't really need to double performance, which is why they had their big.LITTLE configuration - the little cores handle background tasks while sipping watts.

EDIT: In other words: ARM (for example) created a quad-core 15 W chip one year and then released an octa-core 15 W chip the next year. With a die shrink, better-optimized instructions, and little cores, they managed to nearly double the performance without increasing the peak wattage. If they hadn't added the little cores but instead pushed clock speeds higher, it could have been a 20 W chip for roughly the same performance.
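The "cores scale power linearly, frequency scales it disproportionately" argument above follows from the dynamic power relation P ≈ C·V²·f. A toy sketch (all numbers invented, assuming voltage must rise roughly linearly with frequency, so power grows roughly cubically with clocks):

```python
# Toy model of dynamic CPU power, P = C * V^2 * f, summed over cores.
# Numbers are illustrative, not measurements.
def power_w(cores: int, freq_ghz: float, c: float = 1.0) -> float:
    v = 0.25 * freq_ghz  # assumed linear V-f curve (illustrative)
    return cores * c * v**2 * freq_ghz

base = power_w(4, 3.0)        # 4 cores at 3 GHz
more_cores = power_w(8, 3.0)  # double the cores: ~2x power for ~2x throughput
more_clock = power_w(4, 6.0)  # double the clock: ~8x power for ~2x throughput
print(base, more_cores, more_clock)
```

Under these assumptions, doubling cores doubles power in proportion to throughput, while doubling frequency costs eight times the power for the same nominal throughput gain - which is exactly why wide-and-slow designs win on efficiency.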
fry178:

Location requirement: around the Arctic Circle. Power requirement: power plant close by. Sarcasm mode: OFF. @barbacot (ignoring for a moment that I really like people who tell others to "stop" something while not being a partner/parent/admin/mod or similar) the facts don't change just because we don't like them. The Intel isn't purely advertised for productivity loads, so any OC is still aimed at gaming comparisons/benches/reviews. And that AMD runs "hot" comes from the fact that it's designed to run at those temps, with the cooling provided being the main "perf limit", not so much power; the previous gens weren't running as hot, with the 5950X running almost 30% cooler, and the 7xxx series being a little more sensitive to which cooler is used.
Sorry if you felt offended/uncomfortable - it wasn't meant seriously, more like an easy "stop". The idea is that ALL of them are "furnaces" now - it doesn't matter whether a chip was designed to be a "furnace" or just became one - so this prejudice that "Intel is a space heater/furnace" while AMD is conveniently forgotten is long overdue for retirement. I think an enthusiast doesn't care about power draw and heat; if it's the fastest, he will buy it. Now things are a little more complicated than years ago, because you can't say "fastest" in absolute terms. As you implied, there is fastest in productivity and fastest in games. I can't wait to see next-gen CPUs - both Intel and AMD have reached 95-100 degrees Celsius with their architectures, so where will they go from here?
barbacot:

Sorry if you felt offended/uncomfortable - it wasn't meant seriously, more like an easy "stop". The idea is that ALL of them are "furnaces" now - it doesn't matter whether a chip was designed to be a "furnace" or just became one - so this prejudice that "Intel is a space heater/furnace" while AMD is conveniently forgotten is long overdue for retirement.
You're not wrong that it isn't hard to buy a furnace of a processor (from just about anyone) now, and it isn't limited to just GPUs. However, it's all relative. Nowadays exceeding 400W for just 24 cores (2/3 of which are "efficiency" cores) is quite bad, when you consider an Epyc 9754 (that's 128 fully-capable cores) uses roughly the same amount of power. Sure, those CPUs boost at nearly half the frequency, but that kinda drives home the point: the 14900KS is extremely hot for something that isn't going to be all that impressive. Even if it's #1 in every test against the similarly-priced 7950X3D, it will probably only be 2% faster overall while using 2X the power draw.
I think an enthusiast doesn't care about power draw and heat; if it's the fastest, he will buy it. Now things are a little more complicated than years ago, because you can't say "fastest" in absolute terms.
Some enthusiasts are like that. Many of us (not even suggesting most; I don't know how many) don't feel like having a worse overall experience with a greater cost. Greater instability, fan noise, and increased room heat [in the summer] in particular. Nowadays with self-overclocking chips and pretty much all motherboards looking the same with the same few chipsets, it's kinda boring if being an enthusiast just means whoever has the most cash to throw at their system.
I can't wait to see next-gen CPUs - both Intel and AMD have reached 95-100 degrees Celsius with their architectures, so where will they go from here?
Well, they've reached that for years now. Node shrinks, new instructions, and more optimized pipelines seem to be the simplest way to squeeze in more performance. Intel seems to be taking more drastic approaches with their E-cores and the upcoming architecture that apparently ditches HT, whereas AMD is starting to lean toward c-cores.
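The Epyc comparison above can be put into back-of-envelope numbers using cores × boost clock as a crude throughput proxy. This ignores IPC, core types, and AVX behaviour entirely; the wattage and clock figures are the rough ones quoted in the thread ("roughly the same amount of power", "nearly half the frequency"):

```python
# Crude perf-per-watt proxy: cores x boost clock / watts.
# Purely illustrative; figures are the rough ones quoted in the post.
def proxy_perf_per_watt(cores: int, ghz: float, watts: float) -> float:
    return cores * ghz / watts

desktop = proxy_perf_per_watt(24, 6.2, 400)   # i9-14900KS at the ~400 W quoted
server = proxy_perf_per_watt(128, 3.1, 400)   # Epyc 9754, roughly the same power
print(f"14900KS: {desktop:.2f}  Epyc 9754: {server:.2f}")
```

Even on this generous proxy, the wide, lower-clocked server part comes out well over twice as efficient, which is the scale argument in a nutshell.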
barbacot:

Stop with this "furnace" thing already.... Sadly, all top CPUs today (AMD and Intel) are "furnaces" :( If you want a top-dog system today, it's like walking in a nightmarish world: flames, hot as hell, green monsters trying to eat your wallet.... I'll stay with my 5950X until I find out that it can't cope with some workloads, or until AMD and Intel come up with a solution for their next top "furnaces" - other than undervolting. https://cdn.thefpsreview.com/wp-content/uploads/2022/11/temperature_7950x.png
TBF, that's how AMD designed this gen of CPUs: it'll try to hit the thermal limit to maximize performance if the load calls for it, which rarely happens unless all you do is run benchmarks and stress tests. My 7700X on cheap air cooling (a DeepCool AK620) hits the advertised 5.5 GHz and hovers in the 70s during gaming in a warm-ish room. Can't say I know how current Intel CPUs behave in the same situation, though.