Minimum framerates of games evaluated on an engineering sample of Intel's Core i9-13900K have improved.
nizzen
This will be a nice upgrade from 4770k and 3800x 😀
The jump in min fps is what we want to see in games. Looks like more cache is working for Intel too 🙂
fantaskarsef
Something looks off here... looked at it this way:
1st graph shows gaming performance relative to the 12900KF, the second table power draw... first thing I see is that the percentage increases in power draw are way higher than the fps gains, which are barely above the margin of error more often than not...
Horizon: Zero Dawn -- they spend +39/+39/+28% more power to achieve an average of +2.8% more fps (averaged across all three resolutions and all three numbers)?
RDR2 -- +32/+41/+52% power draw increase for an average 21% gain? (When the +77% fps figure was clearly botched to begin with)
FF9 -- +11/+0/+15% for fps gains of ~+4.5%
Forza -- -2/+4.5/+10% for fps gains of +11.6% (well done here)
Monster Hunter -- +29/+21/+32% for +4.75% fps
PUBG -- +6.7/+12.9/+28% power for +18% fps
1080p -- power draw +19% for 12% fps gains (with a heavy weight on min frames)
1440p -- power +19.7% for fps +11%
2160p -- power +31% for fps +5.9%
I don't know, the power draw increase just looks way too high for what they actually gained from it. Might just be me missing something. But like this, it looks like they still have quite a way to go...
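The per-game numbers above can be sanity-checked in a few lines. This is just a sketch that re-averages the deltas quoted in the post (not re-measured) and flags whether the fps gain keeps pace with the extra power draw:

```python
# Efficiency sanity check on the quoted 13900K vs 12900KF deltas.
# power: percent change in draw at 1080p/1440p/2160p; fps: quoted average gain.
games = {
    "Horizon: Zero Dawn": ([39, 39, 28], 2.8),
    "RDR2":               ([32, 41, 52], 21.0),
    "FF9":                ([11, 0, 15], 4.5),
    "Forza":              ([-2, 4.5, 10], 11.6),
    "Monster Hunter":     ([29, 21, 32], 4.75),
    "PUBG":               ([6.7, 12.9, 28], 18.0),
}

for game, (power, fps) in games.items():
    avg_power = sum(power) / len(power)
    # fps gained per percentage point of extra power:
    # >1 means the gain outpaces the added draw, <1 means efficiency got worse
    ratio = fps / avg_power
    print(f"{game:20s} power {avg_power:+6.1f}%  fps {fps:+5.1f}%  ratio {ratio:5.2f}")
```

Only Forza comes out above 1 here, which matches the "well done here" remark above; everything else burns more power than it returns in frames.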
Undying
Adding more E-cores while still having 8 P-cores is a bad idea. The only difference between the 12900K and 13900K will be higher core clocks. That won't increase fps in games as much as we want.
Wait for zen4 with v-cache boys.
Horus-Anhur
Exactly. Adding just more E-cores is pointless, unless you are only concerned with Cinebench scores.
Intel should have added more cache or performance cores instead of those pathetic E-cores.
It's probably pushing a higher clock speed. And power consumption increases geometrically with clocks.
Also, this CPU is made on Intel 7, the same node as the 12900K.
Intel 7 is not a fully new node; it's just a tweaked 10 nm process.
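A rough sketch of why that "geometric" scaling happens: dynamic power goes roughly as C·V²·f, and near the top of the frequency curve the supply voltage also has to rise with clock. The linear voltage-vs-frequency slope below is an illustrative assumption, not measured silicon data:

```python
# Back-of-the-envelope dynamic power scaling: P ≈ C * V^2 * f.
# Assumes (illustratively) that voltage must rise in proportion to the
# clock gain near the top of the V/f curve, so power grows ~cubically.
def rel_power(clock_gain, v_per_f=1.0):
    f = 1 + clock_gain             # relative frequency
    v = 1 + v_per_f * clock_gain   # assumed voltage needed for that clock
    return f * v ** 2              # relative dynamic power (C cancels out)

# +10% clock with proportional voltage: ~1.33x power for 1.10x performance
print(f"{rel_power(0.10):.3f}")
```

Under that assumption a 10% clock bump costs roughly 33% more dynamic power, which is in the same ballpark as the power-vs-fps gaps quoted above.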
Astyanax
https://www.guru3d.com/index.php?ct=articles&action=file&id=75723
Power consumption in the 12900K was already pretty bad. But the 13900K is making it even worse.
It also seems that AMD is going to increase power usage in Zen4. But they are starting at a much lower level.
Mannerheim
These new Intel CPUs are really badly designed. Reminds me of the Pentium D: power hungry, hot, and slower for gaming compared to AMD.
Eini
Pretty please start testing CPU performance for things like Lua, JavaScript, turn-based games (turn times), PDF decoding/viewing/searching, and other applications that do *not* benefit from IPC gains that come mostly from larger CPU caches. Thanks in advance! 😉
Eini
https://i.imgur.com/X9aT65U.png
I re-enabled Windows' fast-boot feature and turn the PC off instead of leaving it running, because idle wattage is too high at over 60 watts (5900X + 2700S). A better-working standby would be welcome, but things like fan speeds and some other stuff still run into problems with that aeons-old feature.
The NVidia GPU alone idles at an unnecessary 20 watts, and I don't understand why desktop GPUs don't switch between a low-power integrated GPU and the high-power dedicated one the way laptops do.
Using E-cores for idle and low compute tasks is a welcome feature, especially as my room's temps are regularly higher than the other rooms in the apartment. And software that makes use of all cores still benefits from E-cores, because often the higher the core-count the less all cores are fully utilized anyway.
Here is an example of Gigapixel AI processing an image (4.16% = 1 logical core fully utilized):
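As a side note, the "4.16% = 1 logical core fully utilized" figure follows from total CPU usage being averaged over all logical processors: on a 24-thread CPU (a 5900X, for example) one fully busy thread contributes 100/24 ≈ 4.16–4.17% overall, depending on rounding. A tiny illustration:

```python
# Overall CPU % shown for one fully loaded thread, given the usage
# readout averages over all logical processors.
def one_core_share(logical_cpus):
    return 100.0 / logical_cpus

# 5900X: 12 cores / 24 threads -> one busy thread shows as ~4.17% total
print(f"{one_core_share(24):.2f}%")
```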
Picolete
tunejunky