Minimum framerates of games evaluated on an engineering sample of Intel's Core i9-13900K have improved.

This will be a nice upgrade from my 4770K and 3800X 😀 The jump in min fps is what we want to see in games. Looks like more cache is working for Intel too 🙂
Something looks off here... I looked at it this way: the first graph shows gaming performance relative to the 12900KF, the second table shows power draw. The first thing I see is that the increases in power draw are far higher, percentage-wise, than the increases in fps, which are just above the margin of error more often than not:

- Horizon Zero Dawn: they spend +39/+39/+28% more power to achieve an average of +2.8% more fps (taken from all three resolutions and all three numbers)?
- RDR2: +32/+41/+52% more power draw for an average +21% gain? (When the +77% fps gain was clearly botched to begin with.)
- FF9: +11/+0/+15% with fps gains of ~+4.5%
- Forza: -2/+4.5/+10% for fps gains of +11.6% (well done here)
- Monster Hunter: +29/+21/+32% for +4.75% fps
- PUBG: +6.7/+12.9/+28% power for +18% fps
- 1080p overall: power draw +19% for +12% fps (with a heavy weight on min frames)
- 1440p overall: power +19.7% for fps +11%
- 2160p overall: power +31% for fps +5.9%

I don't know, power draw just looks way too much increased for what they actually gained from it. Might just be me missing something. But like this, it looks like they have quite a way to go...
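One quick way to sanity-check those numbers is to fold each power/fps pair into a single perf-per-watt delta. A minimal sketch, assuming both percentages are measured against the same 12900KF baseline; the inputs are the per-resolution overall figures quoted above:

```python
# Perf-per-watt change when fps and power both move relative to a baseline:
#   efficiency_delta = (1 + fps_gain) / (1 + power_gain) - 1

cases = {
    "1080p": {"fps": 0.12,  "power": 0.19},
    "1440p": {"fps": 0.11,  "power": 0.197},
    "2160p": {"fps": 0.059, "power": 0.31},
}

for res, c in cases.items():
    eff = (1 + c["fps"]) / (1 + c["power"]) - 1
    print(f"{res}: perf/W {eff:+.1%} vs 12900KF")

# 1080p: perf/W -5.9% vs 12900KF
# 1440p: perf/W -7.3% vs 12900KF
# 2160p: perf/W -19.2% vs 12900KF
```

By that measure the engineering sample loses efficiency at every resolution, which is the gist of the complaint above.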
Adding more E-cores while still having 8 P-cores is a bad idea. The only difference between the 12900K and the 13900K will be a higher core clock. That won't increase fps in games as much as we want it to. Wait for Zen 4 with V-Cache, boys.
Exactly. Adding just more E-cores is pointless, unless you are overly concerned with Cinebench scores. Intel should have added more cache or more performance cores instead of those pathetic E-cores.
fantaskarsef:

Something looks off here... the increases in power draw are far higher, percentage-wise, than the increases in fps, which are just above the margin of error more often than not...
It's probably pushing a higher clock speed, and power consumption increases geometrically with clocks. Also, this CPU is made on Intel 7, the same node as the 12900K. Intel 7 is not a fully new node; it's just a tweaked 10nm node.
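A rough back-of-the-envelope shows why chasing clocks is so expensive. This sketch assumes the classic dynamic-power relation P ≈ C·V²·f; the +8% clock and +10% voltage figures are purely hypothetical, not measured 13900K values:

```python
# Dynamic CPU power scales roughly with frequency times voltage squared:
#   P ~ C * V^2 * f
# Higher clocks usually need extra voltage to stay stable, so power
# grows much faster than performance. Hypothetical numbers:

clock_gain = 0.08    # +8% frequency (assumed)
voltage_gain = 0.10  # +10% core voltage to hold that clock (assumed)

power_gain = (1 + clock_gain) * (1 + voltage_gain) ** 2 - 1
print(f"power {power_gain:+.0%} for at best {clock_gain:+.0%} performance")
# power +31% for at best +8% performance
```

Because voltage enters squared, even a modest bump to stabilize higher clocks costs far more power than the extra clocks return in fps.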
Undying:

Adding more E-cores while still having 8 P-cores is a bad idea. The only difference between the 12900K and the 13900K will be a higher core clock. That won't increase fps in games as much as we want it to. Wait for Zen 4 with V-Cache, boys.
Not correct, the 13900K has more cache as well (not to mention a higher possible ring clock).
Undying:

Adding more E-cores while still having 8 P-cores is a bad idea. The only difference between the 12900K and the 13900K will be a higher core clock. That won't increase fps in games as much as we want it to. Wait for Zen 4 with V-Cache, boys.
The min fps looks like a good gain, no? There's more cache on the 13900K vs the 12900K. Though the biggest min fps and avg fps gains come from memory tuning. In my own test, 4800 XMP vs 7000C30 resulted in a 35% gain just from the memory OC 🙂
Astyanax:

Not correct, the 13900K has more cache as well (not to mention a higher possible ring clock).
It's a small bump in cache compared to Alder Lake.
Horus-Anhur:

It's probably pushing a higher clock speed, and power consumption increases geometrically with clocks. Also, this CPU is made on Intel 7, the same node as the 12900K. Intel 7 is not a fully new node; it's just a tweaked 10nm node.
Yeah, it's probably just that, but speaking very subjectively, while planning ahead and laying out my CPU / MB upgrade for later this year, I don't look at such releases with much joy. Also, I wonder why Intel would post such arguably bad results... we know where they're coming from, but why even try to work with that in marketing?
fantaskarsef:

Yeah, it's probably just that, but speaking very subjectively, while planning ahead and laying out my CPU / MB upgrade for later this year, I don't look at such releases with much joy. Also, I wonder why Intel would post such arguably bad results... we know where they're coming from, but why even try to work with that in marketing?
Power consumption on the 12900K was already pretty bad, and the 13900K makes it even worse. It also seems that AMD is going to increase power usage with Zen 4, but they are starting from a much lower level. https://www.guru3d.com/index.php?ct=articles&action=file&id=75723
These new Intel CPUs are really badly designed. Reminds me of the Pentium D: power hungry, hot, and slower for gaming compared to AMD.
Mannerheim:

These new Intel CPUs are really badly designed. Reminds me of the Pentium D: power hungry, hot, and slower for gaming compared to AMD.
Slower in gaming? The 5800X3D is the only AMD CPU beating Alder Lake in gaming, and only against slow XMP settings 🙂 With fast memory, Alder Lake is faster in pretty much everything 🙂 PS: I'm still not a fanboy, I just test this myself.
Undying:

It's a small bump in cache compared to Alder Lake.
24MB of extra cache isn't a small bump. It's a pretty decent increase and will help in cache-bound tests.
Agent-A01:

24MB of extra cache isn't a small bump. It's a pretty decent increase and will help in cache-bound tests.
True. But that number is split: part is L2 for the P-cores, part is L2 for the E-core clusters, and the rest is L3. Meanwhile, Zen 3's 3D V-Cache is one big chunk that can be accessed by all cores.
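For reference, here is a quick tally of where that roughly 24MB lands. The per-core cache sizes below are the commonly reported Alder Lake / Raptor Lake figures, so treat them as assumptions rather than gospel:

```python
# Commonly reported cache layouts (assumed figures, in MB):
#   12900K: 8 P-cores x 1.25 L2, 2 E-core clusters x 2 L2, 30 shared L3
#   13900K: 8 P-cores x 2    L2, 4 E-core clusters x 4 L2, 36 shared L3

l2_12900k = 8 * 1.25 + 2 * 2   # 14 MB
l3_12900k = 30
l2_13900k = 8 * 2 + 4 * 4      # 32 MB
l3_13900k = 36

extra = (l2_13900k + l3_13900k) - (l2_12900k + l3_12900k)
print(f"extra L2+L3: {extra} MB")   # extra L2+L3: 24.0 MB
```

If those figures hold, 18MB of the increase is private or clustered L2 and only 6MB is the shared L3 that every core can see, which is exactly the contrast being drawn with AMD's stacked V-Cache.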
Horus-Anhur:

True. But that number is split: part is L2 for the P-cores, part is L2 for the E-core clusters, and the rest is L3. Meanwhile, Zen 3's 3D V-Cache is one big chunk that can be accessed by all cores.
P-cores have full access to the E-cores' cache. So when the E-cores are dormant, either disabled or with no activity, the P-cores will use those cache blocks.
Pretty please start testing CPU performance in things like Lua, JavaScript, turn-based games (turn times), PDF decoding/viewing/searching, and other applications that do *not* benefit from IPC gains that come mostly from larger CPU caches. Thanks in advance! 😉
Horus-Anhur:

Exactly. Adding just more E-cores is pointless, unless you are overly concerned with Cinebench scores. Intel should have added more cache or more performance cores instead of those pathetic E-cores.
Right now the E-core count seems like nothing more than a marketing gimmick, so Intel can say their products have lots of cores. In my view, Intel should release CPUs with only P-cores for enthusiasts and professionals, and CPUs with only E-cores for those who want something cheaper and weaker. Combining them seems pointless to me.
H83:

Right now the E-core count seems like nothing more than a marketing gimmick, so Intel can say their products have lots of cores. In my view, Intel should release CPUs with only P-cores for enthusiasts and professionals, and CPUs with only E-cores for those who want something cheaper and weaker. Combining them seems pointless to me.
I could see the use of those E-cores in laptop CPUs, as a way to save some power. But aside from that, they are a dumb idea.
H83:

Combining them seems pointless to me.
I re-enabled Windows' fast-boot feature and turn off the PC instead of leaving it running, because idle wattage is too high at over 60 watts (5900X + 2070S). A better-working standby would be welcome, but things like fan speeds and some other stuff still run into problems with that aeons-old feature. The Nvidia GPU alone idles at an unnecessary 20 watts, and I don't understand why desktop GPUs don't switch between a low-power integrated and a high-power dedicated GPU like laptops do.

Using E-cores for idle and low-compute tasks is a welcome feature, especially as my room's temps are regularly higher than the other rooms in the apartment. And software that makes use of all cores still benefits from E-cores, because often, the higher the core count, the less fully all cores are utilized anyway. Here is an example of Gigapixel AI processing an image (4.16% = 1 logical core fully utilized): https://i.imgur.com/X9aT65U.png
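That 4.16% figure checks out: Windows reports total CPU load summed over logical cores, so one fully loaded thread is about 1/24 of the total (a trivial sanity check, assuming the 12-core/24-thread 5900X from the post):

```python
# One fully loaded logical core as a share of total CPU load
# on a 12-core / 24-thread Ryzen 9 5900X (per the post above).
logical_cores = 12 * 2
print(f"{100 / logical_cores:.2f}%")  # 4.17%
```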
Astyanax:

Not correct, the 13900K has more cache as well (not to mention a higher possible ring clock).
The larger cache explains the higher min fps.
tty8k:

With the new generation of video cards aiming at 4K gaming, sure, it makes sense to spend another 1000 for a 3% improvement. AMD's Zen 4 is in exactly the same boat, so don't expect miracles. On second thought, this makes me happy, since it leaves more budget for a better video card, which makes more of a difference.
Not quite. As was mentioned earlier, they are starting from a lower power draw, and their new power draw is still less than Intel's 11th gen. And on the GPU side, rejoice, as (some of) the RDNA3 cards will rock the world under 350 watts.