Core i9-13900K Early Review Indicates Significant Improvements in Comparison to the Core i9-12900K

I don't consider a 12% uplift in P-core performance (which is what matters) a "significant improvement". Better to wait for 14th gen, which will be on a new node and have ~25% faster P cores.
Dragam1337:

I don't consider a 12% uplift in P-core performance (which is what matters) a "significant improvement". Better to wait for 14th gen, which will be on a new node and have ~25% faster P cores.
Maybe it's like the 2700X vs the 3800X. Some still upgraded 😛
nizzen:

Maybe it's like the 2700X vs the 3800X. Some still upgraded 😛
It's like the Intel gens of old, where you could easily skip a gen, or four.
Dragam1337:

It's like the Intel gens of old, where you could easily skip a gen, or four.
So is it a good thing or a bad thing? Skipping four gens sounds good to me; you save a ton of money. Or I could put the question another way: if AMD CPUs are so good and long-lasting, why do so many AMD fans seem happy to buy a new CPU every year? Seems incredibly wasteful, monetarily and for the environment.
Glottiz:

So is it a good thing or a bad thing? Skipping four gens sounds good to me; you save a ton of money. Or I could put the question another way: if AMD CPUs are so good and long-lasting, why do so many AMD fans seem happy to buy a new CPU every year? Seems incredibly wasteful, monetarily and for the environment.
It is. A 15% faster PC isn't worth it unless you earn money per second on it, or unless you love it and just have to throw cash at it. A 15% faster CPU will move you a bit forward or backward on the fps charts: well above 60 fps all the time, around the 120 fps bar most of the time. Unless you are one of the 240 Hz or 165 Hz users, I don't see a reason to upgrade so often. If you are on a 12900K, no, you are not supposed to upgrade at all, maybe not even from an 11900K. If you are on a 6600K or 8700K, maybe now is the time? Enough changes?
Definitely a worthwhile upgrade if you're still on Skylake, but this is the end of the line for this socket, so unless you're really CPU-limited, you might as well wait another cycle.
Wouldn't exactly call it Intel of old; previous gens were generally low single-digit IPC uplifts, not to mention that for years it was the same core count too. Going from Alder Lake to Raptor Lake not only gives you an IPC uplift of 12%, about triple that of Intel of old, but you're also getting more cores again. Plus, we all know you generally don't upgrade from one gen to the very next. I'm on Zen 3 and have no plans to go Zen 4 or Raptor Lake for my daily. Won't consider anything until Zen 5 or Meteor Lake at the earliest.
asturur:

It is. A 15% faster PC isn't worth it unless you earn money per second on it, or unless you love it and just have to throw cash at it. A 15% faster CPU will move you a bit forward or backward on the fps charts: well above 60 fps all the time, around the 120 fps bar most of the time. Unless you are one of the 240 Hz or 165 Hz users, I don't see a reason to upgrade so often. If you are on a 12900K, no, you are not supposed to upgrade at all, maybe not even from an 11900K. If you are on a 6600K or 8700K, maybe now is the time? Enough changes?
Well, I upgraded from a 5820K (which lasted me 7 years and still felt plenty snappy, it just needed more oomph in the latest games) to a 12700K last year. Now I don't plan to upgrade for at least 5 years. IMO, people who upgrade their CPU more than once in 4-5 years are flushing money down the toilet. Also, I never run into any socket compatibility issues or have to cry about motherboard support. I just buy a new CPU+mobo+RAM combo when the time comes and get all the latest features, latest USB ports, etc. Let's say one motherboard was compatible with 10 years of CPUs; I would never put the latest CPU into a 10-year-old motherboard lol.
I sure hope they're comparing against 12th gen performance as it stands today. With all the scheduler issues, 12th gen was initially perceived to be a lot slower than it really was.
Dragam1337:

I dont consider 12% uplift in P core performance (which is what matters) "significant improvements". Better wait for 14th gen, which will be on a new node and @ 25% faster P cores.
12% is actually pretty good when you consider Intel has obviously run out of ideas on how to further improve performance without pushing developers to use more instructions or threads. Keep in mind, too, that Intel is burning a lot of time trying to make sure there aren't more security issues. I wouldn't be surprised if the E-cores are Intel's way of slowly phasing out HT, since that appears to be where most of their security issues come from. As for 14th gen, I'm not so certain IPC will be 25% faster, but I'd like to be proven wrong.
schmidtbag:

I sure hope they're comparing against 12th gen performance as it stands today. With all the scheduler issues, 12th gen was initially perceived to be a lot slower than it really was. 12% is actually pretty good when you consider Intel has obviously run out of ideas on how to further improve performance without pushing developers to use more instructions or threads. Keep in mind, too, that Intel is burning a lot of time trying to make sure there aren't more security issues. I wouldn't be surprised if the E-cores are Intel's way of slowly phasing out HT, since that appears to be where most of their security issues come from. As for 14th gen, I'm not so certain IPC will be 25% faster, but I'd like to be proven wrong.
I agree. We won't see large IPC gains until they all get onto GAAFET of some sort, so that means Intel 20A or TSMC 2 nm, and there is a significant architecture rewrite.
+12% performance, but this CPU eats 320-350 W. LOL.
Martin2603PL:

+12% performance, but this CPU eats 320-350 W. LOL.
If you play Cinebench 24/7... In gaming it is very efficient 🙂 60%+ more fps per watt in games than the 5950X. Not too bad 😉
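For reference, "fps per watt" here is just average fps divided by CPU package power during the same game run. A minimal sketch with made-up placeholder numbers (not measurements from this leak) showing how such a ratio would be compared:

```python
# Hypothetical efficiency comparison: fps per watt of CPU package power.
# The figures below are placeholders, not measured results.
samples = {
    "13900K": {"avg_fps": 160.0, "avg_package_w": 110.0},
    "5950X":  {"avg_fps": 140.0, "avg_package_w": 155.0},
}

def fps_per_watt(avg_fps: float, avg_package_w: float) -> float:
    """Efficiency metric: average frames per second per watt drawn by the CPU."""
    return avg_fps / avg_package_w

efficiency = {cpu: fps_per_watt(**vals) for cpu, vals in samples.items()}
gain = (efficiency["13900K"] / efficiency["5950X"] - 1) * 100

for cpu, value in efficiency.items():
    print(f"{cpu}: {value:.2f} fps/W")
print(f"Relative efficiency advantage: {gain:.0f}%")
```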
nizzen:

If you play Cinebench 24/7... In gaming it is very efficient 🙂 60%+ more fps per watt in games than the 5950X. Not too bad 😉
You can keep bringing up CB over and over again but it seems you need to be reminded every time that most people who buy this CPU aren't going to just play games. The wattage is a real problem for anyone who does more than play games on decade-old engines at low detail. For those who only play modern games in 4K, a CPU like this is completely pointless to get.
schmidtbag:

You can keep bringing up CB over and over again but it seems you need to be reminded every time that most people who buy this CPU aren't going to just play games. The wattage is a real problem for anyone who does more than play games on decade-old engines at low detail. For those who only play modern games in 4K, a CPU like this is completely pointless to get.
So what program is a real problem with this CPU? Maybe I'll put it in my test suite for when I'm testing Raptor Lake... I hope my $90 Arctic Freezer 2 doesn't explode 😛
tty8k:

The leaks also show a very small advantage going for DDR5 at this point, especially at 2K/4K gaming.
They "forgot" to use Hynix A-die @ 7800+mhz 😉 It's where the advantage is. Not 4800/5600 xmp....
nizzen:

So what program is a real problem with this CPU? Maybe I'll put it in my test suite for when I'm testing Raptor Lake... I hope my $90 Arctic Freezer 2 doesn't explode 😛
Um... literally anything scalable? For some things, like file compression/decompression, the CPU would typically churn through it fast enough that the power consumption and heat output are negligible. But then there are things that take a lot longer or are done repetitively in a production environment, like media transcoding, software development (especially code compiling, but maybe even the software itself), 3D or simulation rendering, virtual machines, grid computing (like Folding@home), or simply heavy multitasking. I'm sure I'm forgetting a lot, and that's just the kind of stuff people who buy this CPU are likely to do, because there are a lot of heavily multithreaded tasks this CPU is likely to never see (like anything you'd find in a mainframe or supercomputer). It's kind of weird to me that I have to explain any of this, as if gaming and CB are the only things home computers are used for.
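To make the "scalable workload" point concrete, here is a minimal sketch (plain Python, no particular application assumed) of the kind of sustained all-core load, like transcoding or compiling, where package power rather than per-core speed becomes the concern:

```python
import multiprocessing as mp
import time

def busy_work(seconds: float) -> int:
    """Spin on integer math to keep one core fully loaded for a while."""
    end = time.time() + seconds
    n = 0
    while time.time() < end:
        n = (n * 31 + 7) % 1_000_003
    return n

if __name__ == "__main__":
    # One worker per logical CPU: a sustained all-core load, which is the
    # scenario where the leaked 320-350 W package power figure shows up.
    workers = mp.cpu_count()
    with mp.Pool(workers) as pool:
        pool.map(busy_work, [30.0] * workers)
    print(f"Kept {workers} logical CPUs busy for ~30 s")
```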
tty8k:

If you do me a test with that vs, say, DDR4 3600 CL14, which is doable by most decent kits that don't cost a fortune. A 4K test; if I see 10%+ more on average, I'm buying 🙂
Why test in a GPU-bound scenario? Overclock the GPU memory instead for better 4K GPU-bound performance. 10% IQ
Glottiz:

Seems incredibly wasteful, monetarily and for the environment.
Yeah, 'cause once an AM4 owner is done with their CPU, they all go into the ocean, choking millions of turtles.
schmidtbag:

Um... literally anything scalable? For some things, like file compression/decompression, the CPU would typically churn through it fast enough that the power consumption and heat output are negligible. But then there are things that take a lot longer or are done repetitively in a production environment, like media transcoding, software development (especially code compiling, but maybe even the software itself), 3D or simulation rendering, virtual machines, grid computing (like Folding@home), or simply heavy multitasking. I'm sure I'm forgetting a lot, and that's just the kind of stuff people who buy this CPU are likely to do, because there are a lot of heavily multithreaded tasks this CPU is likely to never see (like anything you'd find in a mainframe or supercomputer). It's kind of weird to me that I have to explain any of this, as if gaming and CB are the only things home computers are used for.
In 2022, we are rendering with GPUs 😉
cucaulay malkin:

Yeah, 'cause once an AM4 owner is done with their CPU, they all go into the ocean, choking millions of turtles.
Yep, can't sell them. No one wants a used AMD CPU...