Intel Core i9-13900K could get extreme performance mode at 350 Watt TDP

Imagine if Intel pulled a BMW and you could only get "extreme performance mode" if you paid a monthly subscription.
KissSh0t:

Imagine if Intel pulled a BMW and you could only get "extreme performance mode" if you paid a monthly subscription.
They already have that with some server hardware functionality.
nizzen:

Question is: who is playing Cinebench on an overclocked 12900K 24/7? 😛 I'm playing games on a 12900K @ 5.4 GHz all-core, and it hits 63°C on the hottest core. Not delidded.
I do admit that Intel's power gating is pretty amazing.
user1:

I do admit that Intel's power gating is pretty amazing.
What do you mean? 🙂
nizzen:

What do you mean? 🙂
Power gating, as in turning off parts of the chip when they are not being used. In this regard, Alder Lake is a LOT better than any previous Intel chip.
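For the curious, on Linux you can actually watch this happen: the cpuidle sysfs interface reports how long each logical CPU has spent in each idle state, and the deep states are where power gating kicks in. A minimal sketch, assuming a standard sysfs layout (the state names, e.g. C1E/C6/C10, vary by CPU):

```python
# Read cumulative idle-state residency for one logical CPU from Linux's
# cpuidle sysfs interface. Deep states (C6 and below) imply power gating.
from pathlib import Path

def idle_residency(cpu: int = 0) -> dict:
    """Return {state_name: cumulative_residency_in_microseconds}."""
    base = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpuidle")
    residency = {}
    for state in sorted(base.glob("state*")):
        name = (state / "name").read_text().strip()
        residency[name] = int((state / "time").read_text())
    return residency

if __name__ == "__main__":
    for name, us in idle_residency(0).items():
        print(f"{name:>8}: {us / 1e6:10.1f} s")
```

On a lightly loaded chip, the bulk of the time should show up in the deepest states rather than in C0.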
TLD LARS:

They already have that with some server hardware functionality.

[screenshot: Opera Snapshot_2022-02-12_132230_wccftech.com.png]
KissSh0t:

Yes, that is what I am referring to.
How old is this? Never seen it before 😛 LOL
Manufacturer: Gateway
Manufacturer model number: SX2841-09E PT.GBG02.001
Operating system: Windows 7 Home Premium 64-Bit
CPU: Intel Pentium Dual Core G6951 2.8 GHz
Memory: 6 GB DDR3
HDD: 1 TB
ODD: DVD-RW
Product type: Recertified
Does the CPU matter in new games? 😛
[image: spiderman cpu test.jpg]
nizzen:

Does the CPU matter in new games? 😛
[image: spiderman cpu test.jpg]
I feel like they could've done a better job; that kind of crappy performance is hard to believe considering the original game ran on an 8-core Jaguar CPU at over 2 GHz. Maybe it's memory-bound?
It's possible it might hit 350 W when doing AVX-512... the 10900K can get to about 336 W, but only in AVX-512 workloads, and we don't know whether the P-cores support AVX-512 or not.
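For anyone who wants to verify wattage numbers like that on Linux, the RAPL powercap counters make it a one-file job. A rough sketch, assuming an Intel CPU that exposes /sys/class/powercap/intel-rapl:0 (reading usually needs root):

```python
# Sample CPU package power via Linux's RAPL powercap interface while an
# AVX-512 (or any other) workload runs in another shell. energy_uj is a
# cumulative counter in microjoules; this sketch ignores the rare wrap.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj() -> int:
    with open(RAPL) as f:
        return int(f.read())

def package_watts(interval: float = 1.0) -> float:
    e0, t0 = read_energy_uj(), time.monotonic()
    time.sleep(interval)
    e1, t1 = read_energy_uj(), time.monotonic()
    return (e1 - e0) / 1e6 / (t1 - t0)  # µJ -> J, divided by seconds

if __name__ == "__main__":
    while True:
        print(f"package: {package_watts():6.1f} W")
```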
nizzen:

Does the CPU matter in new games? 😛
[image: spiderman cpu test.jpg]
Nice to see 12th gen so strong in this game. This explains why the game runs perfectly for me on a 12700K while people on older AMD/Intel CPUs aren't so lucky.
Glottiz:

Nice to see 12th gen so strong in this game. This explains why the game runs perfectly for me on a 12700K while people on older AMD/Intel CPUs aren't so lucky.
"luck" :D
[image: team 5600 @7200c32 timings.jpg]
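Worth noting why a tuned kit like that matters: a rough way to compare memory kits is first-word latency in nanoseconds, which falls out of the data rate and CAS latency. A quick sketch, assuming the screenshot shows a DDR5-7200 CL32 kit:

```python
# First-word CAS latency in nanoseconds: DDR transfers twice per clock,
# so ns = CL / (data_rate / 2) * 1000 = 2000 * CL / data_rate.
def cas_latency_ns(data_rate_mts: int, cl: int) -> float:
    return 2000 * cl / data_rate_mts

print(cas_latency_ns(7200, 32))  # DDR5-7200 CL32 -> ~8.9 ns
print(cas_latency_ns(3200, 14))  # DDR4-3200 CL14 -> ~8.8 ns, for comparison
```

Similar latency, but the DDR5 kit delivers more than double the bandwidth.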
nizzen:

Does the CPU matter in new games? 😛
Some are reporting the game not scaling well past 4 cores. Some are reporting the game runs better with HT/SMT off. Some have memory leaks, needing to restart the game every hour. Looking at the graph you are showing, it looks like the 12600K would reach the same score if it were 300 MHz faster to match the 12900K's stock clocks, indicating that the game does not scale past 6 cores. So let them fix the game before using it as a benchmark.
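On the HT/SMT reports: you can test that per-game without a BIOS toggle by pinning the process to one logical CPU per physical core. A rough Linux sketch using psutil (the PID here is hypothetical; on Windows you'd need to get the sibling map another way):

```python
# Pin a process to one logical CPU per physical core, approximating
# "SMT off" for that process only. Core topology comes from Linux sysfs;
# needs psutil (pip install psutil).
from pathlib import Path
import psutil

def one_thread_per_core() -> list:
    chosen, seen = [], set()
    for cpu in sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*")):
        siblings = (cpu / "topology" / "thread_siblings_list").read_text().strip()
        if siblings not in seen:  # first logical CPU seen for this core
            seen.add(siblings)
            chosen.append(int(cpu.name[3:]))
    return sorted(chosen)

psutil.Process(12345).cpu_affinity(one_thread_per_core())  # hypothetical PID
```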
TLD LARS:

Some are reporting the game not scaling well past 4 cores. Some are reporting the game runs better with HT/SMT off. Some have memory leaks, needing to restart the game every hour. Looking at the graph you are showing, it looks like the 12600K would reach the same score if it were 300 MHz faster to match the 12900K's stock clocks, indicating that the game does not scale past 6 cores. So let them fix the game before using it as a benchmark.
Nice job AMD 😛 Must be an nVidia game 😀
[image: AMD 5600x spiderman.jpg]
nizzen:

Nice job AMD 😛 Must be an nVidia game 😀
As previously mentioned, this looks bugged as hell: a 2080 having the same performance as a 3080 Ti with a 5600X. Look at the Ryzen 3600X = 10900K; the 10900K is much faster than the 3600X in most other games. Also, a Ryzen 3300X sits between the 10600K and 10700K: an AMD 4-core with the same performance as Intel 6- and 8-cores. I hope you are able to see how badly this game is running right now.
TLD LARS:

As previously mentioned, this looks bugged as hell: a 2080 having the same performance as a 3080 Ti with a 5600X. Look at the Ryzen 3600X = 10900K; the 10900K is much faster than the 3600X in most other games. Also, a Ryzen 3300X sits between the 10600K and 10700K: an AMD 4-core with the same performance as Intel 6- and 8-cores. I hope you are able to see how badly this game is running right now.
Game sux, that's for sure. Performance is good on Intel/nVidia. For some reason, it looks like there are more games that perform poorly on AMD GPUs than on nVidia GPUs on average. The key is maybe that AMD is a bit slow with drivers. There are wins and losses on both sides, that's true. AMD needs to step up on the driver side, so maybe they will sell more GPUs 🙂
Can't wait for Hilbert to review the 13400F... if it's better/faster than the 12400 at $179, then that is definitely my next upgrade.
Airbud:

Can't wait for Hilbert to review the 13400F... if it's better/faster than the 12400 at $179, then that is definitely my next upgrade.
I just noticed your video card... damn.