Intel Raptor Lake ES is 9% faster in Single-Core Performance over Alder Lake

cucaulay malkin:

that's just not true
Right, show me which applications the average consumer uses that make use of more than 16 threads.
Dragam1337:

Sure, but they are single-thread limited for the most part - only heavy workstation applications make use of that many cores, which the vast majority of consumers don't use.
Stop with the 'workstation' term, bud. What age are you living in exactly? Are you pretending, or are you just plain blind to what's going on right now? A 3D modeler using his 3970X Threadripper is getting a 10k monthly salary, meanwhile streamers with a 12600K to 12900K pull 50k monthly on average. Why the hell do you think Intel went the big-core/little-core way? Surely it was a last-minute decision with no plans for the future. If you were a content creator you would absolutely love Intel's 12900K with its unusual core design, because then you'd realise a lot of the plugins and effects you use in programs are actually single-threaded, and having a 'workstation' Xeon is nothing more than a bottleneck. You millennials have literally no sense of what's going on in the real world; this tunnel vision is just ridiculous.
IceVip:

Stop with the 'workstation' term, bud. What age are you living in exactly? Are you pretending, or are you just plain blind to what's going on right now? A 3D modeler using his 3970X Threadripper is getting a 10k monthly salary, meanwhile streamers with a 12600K to 12900K pull 50k monthly on average. Why the hell do you think Intel went the big-core/little-core way? Surely it was a last-minute decision with no plans for the future. If you were a content creator you would absolutely love Intel's 12900K with its unusual core design, because then you'd realise a lot of the plugins and effects you use in programs are actually single-threaded, and having a 'workstation' Xeon is nothing more than a bottleneck. You millennials have literally no sense of what's going on in the real world; this tunnel vision is just ridiculous.
Oh yeah, intel surely made it specifically for "content creators" - it had absolutely nothing to do with the fact that they couldn't have matched amd's core count with P cores only, thus were forced to use those E-waste cores... But seriously, lol at your post - your reading skillz are apparently a big fat 0/10...
Dragam1337:

Oh yeah, intel surely made it specifically for "content creators" - it had absolutely nothing to do with the fact that they couldn't have matched amd's core count with P cores only, thus were forced to use those E-waste cores... But seriously, lol at your post - your reading skillz are apparently a big fat 0/10...
You can 'lol' at it all you want, buddy. It won't change your comment about the 5/7/9s being consumer products. It won't make you look outside that tunnel you're looking through. And most importantly, it won't make Intel's little cores disappear. So you keep going at it, whining on forums about how the E-cores are a waste, without second-guessing the whole situation. I'm sure it's healthy for you, obviously. Godspeed, ancient guru.
So if we remove the clock speed gain, that's +3.5% higher IPC... for 140 W... nice
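For what it's worth, the arithmetic behind splitting a total single-core gain into clock and IPC components can be sketched like this (the ~5.3% clock figure is an illustrative assumption used to back out the +3.5% IPC number, not a value from the article):

```python
# Separate a single-core performance gain into clock and IPC components.
# Model: perf_ratio = clock_ratio * ipc_ratio, so ipc_ratio = perf_ratio / clock_ratio.

def ipc_gain(perf_gain: float, clock_gain: float) -> float:
    """Return the IPC gain implied by a total perf gain and a clock gain."""
    return (1 + perf_gain) / (1 + clock_gain) - 1

# The article's +9% single-core gain, assuming ~5.3% of it comes from clocks:
gain = ipc_gain(0.09, 0.053)
print(f"IPC gain: {gain:.1%}")  # prints: IPC gain: 3.5%
```

This is just the standard multiplicative decomposition; if the real clock delta differs, the implied IPC figure shifts accordingly.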
schmidtbag:

Funny thing is I thought the E-cores were going to be slower but less power-hungry.
Dragam1337:

E-waste cores ARE completely useless for gaming, and will more often than not lower gaming performance vs being disabled. Yes, e-waste cores increase workstation application performance, but the i5/i7/i9 are consumer products, on which the e-waste cores are wasted... no e-waste cores and more P-cores, tyvm!
I continue to think that the E-cores are the most interesting thing Intel has released in the last 5 years, and I'm still angry they didn't release some parts with E-cores only, especially for laptops.
Dragam1337:

E-waste cores ARE completely useless for gaming, and will more often than not lower gaming performance vs being disabled.
Except in practice, they don't.
Astyanax:

incomplete BIOS,
I still think it'll be a furnace if these reports are real.
IceVip:

Stop with the 'workstation' term, bud. What age are you living in exactly? Are you pretending, or are you just plain blind to what's going on right now? A 3D modeler using his 3970X Threadripper is getting a 10k monthly salary, meanwhile streamers with a 12600K to 12900K pull 50k monthly on average. Why the hell do you think Intel went the big-core/little-core way? Surely it was a last-minute decision with no plans for the future. If you were a content creator you would absolutely love Intel's 12900K with its unusual core design, because then you'd realise a lot of the plugins and effects you use in programs are actually single-threaded, and having a 'workstation' Xeon is nothing more than a bottleneck. You millennials have literally no sense of what's going on in the real world; this tunnel vision is just ridiculous.
Hey, when were you born, and what do millennials (who are now almost 40) have to do with Intel's core strategy?
Dragam1337:

Oh yeah, intel surely made it specifically for "content creators" - it had absolutely nothing to do with the fact that they couldn't have matched amd's core count with P cores only, thus were forced to use those E-waste cores... But seriously, lol at your post - your reading skillz are apparently a big fat 0/10...
It has absolutely everything to do with the fact that modern computers need both good single-threaded performance and the ability to maintain a lot of little services and I/O in the background, and it's a sensible design that AMD is bound to adopt fairly soon. The E-cores are slower, but comparatively much, much more efficient, and great for when you actually want to use a computer as anything other than just a game console. The Windows scheduler will most likely be the challenge in all this; thanks to Android, Linux has handled heterogeneous designs like this for almost a decade now. Calling them e-waste is annoying and adds nothing to this conversation except the certainty that you have no idea what you're talking about.
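The scheduling point above can be illustrated with the low-level mechanism a scheduler (or a user) relies on: CPU affinity. A minimal, Linux-only sketch in Python - the core-ID layout is an assumption (on a 12900K the P-core logical CPUs typically come first; verify against /proc/cpuinfo on your own machine):

```python
import os

# Assumed logical-CPU layout for an 8P+8E part like the 12900K:
# 0-15 = P-cores with SMT, 16-23 = E-cores. Check /proc/cpuinfo to confirm.
P_CORES = set(range(0, 16))
E_CORES = set(range(16, 24))

def pin_to(cpus):
    """Restrict the current process to the given logical CPUs (Linux only)."""
    os.sched_setaffinity(0, cpus)  # pid 0 = the calling process

# Example: keep a background transcode job off the P-cores so a game
# running elsewhere keeps them to itself:
#   pin_to(E_CORES)
if hasattr(os, "sched_getaffinity"):  # guard: not available on all platforms
    print(sorted(os.sched_getaffinity(0)))  # CPUs we're currently allowed on
```

In practice the OS scheduler (Windows 11's Thread Director integration, Linux's energy-aware scheduling) makes this placement decision automatically; manual pinning like this is what users reach for when the automatic choice is wrong.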
Dragam1337:

Oh yeah, intel surely made it specifically for "content creators" - it had absolutely nothing to do with the fact that they couldn't have matched amd's core count with P cores only, thus were forced to use those E-waste cores... But seriously, lol at your post - your reading skillz are apparently a big fat 0/10...
1. To answer your claim that Intel couldn't match AMD in core count: the 12900K in CPU-Z is only a tad below the 5950X, and from this article we see that the 13900K is MUCH faster than the 5950X with its 16 native cores. So these small cores are far from useless: whoever buys a 5950X can do the same work on a 12900K (and even faster on a 13900K), but better, since you also get superior single-thread performance. That means wider usage from the same product - you can do PS3 emulation and software-render AV1 without compromising anything.
2. We see that with more performance, power usage goes higher, be it Intel, AMD or NVIDIA. So there's no need to use P-cores for the things the E-cores can handle - web browsing, office work and so on - the E-cores can do it at low power.
3. Intel can put in more cores, and the 13900K beats Zen 3/Zen 4 in core count (AMD has said 16 cores is their max so far), while here we have 8+16, 24 cores.
4. E-cores are getting faster too, and they're getting smaller, so there will be more of them and they will perform better. It's not like they're slow now - they have basically the same IPC as a 10900K, just at lower clocks.
5. Here is something EXTREMELY important: the EU has an aggressive power plan for the next 5-10 years. It covers everything - TVs, PCs, air conditioners, all electric appliances. IMO it's unrealistic (e.g. in 5 years an OLED TV can't use more than 30 W, something so low it would make TVs sold in the EU less bright unless the technology changes), but it is what it is, and manufacturers have to abide by it. The plan specifies wattage for sleep, idle and so on - it's very detailed, you can google it. These E-cores are important for that.
I'm sure there are other reasons I'm not thinking of.
BTW, AMD will do the same in the future - I'm sure you've seen it too: a year or two ago a leaked AMD roadmap showed plans for E-cores and P-cores. It's the only way forward, both because of these new power-usage laws and because the P-cores are getting so fast that most software doesn't need that much performance. Honestly, if you run a game that isn't CPU-bottlenecked, you can game on E-cores. I've seen people test them with a 3090 and RTX on - so GPU-bottlenecked - and the E-cores perform just as well as the P-cores. Older games run fine on them too, indie games as well; every AAA game that came out up to 2020 will run fine on E-cores. P.S. Personally I see it as "2 engines" in one, and Windows 11 is getting there: you have the E-cores (first engine) dedicated to the OS, web browsers, Office, email, multimedia, etc., and then the P-cores (second engine) that wake up to run games, 7-Zip, RAR, media compression, etc. - each type of core doing its own work.
Haha xD https://cdn.hinative.com/attached_images/103740/c9b7fbd5f13ed3eb8a4fcd3bf6de40330c662b68/large.gif?1494782306 In all seriousness though: while Intel could not match AMD in core count with full-size cores on LGA1700, the design does function well in the most common business-laptop configuration of 2 P-cores and 6 E-cores - because that is clearly what it is, a laptop chip design. But it does NOT work well for games, which is what I care about. Games either see no benefit from e-waste cores or a decline in performance with them active - and yes, that is the case in real-world performance @Astyanax https://tpucdn.com/review/intel-core-i9-12900k-alder-lake-12th-gen/images/cyberpunk-2077-1280-720.png Fact is that the chip is not optimized for gaming, and that it happens to be fast at gaming is a side effect of it being fast in general. IMO Intel ought to make a chip specifically for gaming, like AMD did with the 5800X3D: no e-waste cores and a whole lot of L3 cache, along with 8 of their highest-binned P-cores.
Dragam1337:

Would you stop referring to Polish sites, which none but you can read... if you want me to look at a specific graph, then link that, rather than the entire page.
Statements like this make me really question your deductive reasoning skills. I don't know of a browser that doesn't have the ability to auto-translate to English. Sounds more like you simply want to have an issue with something irrelevant. That's called deflection. And yes, I realize you'll likely reply with something along the lines of "translations aren't always perfect" or "I'm not going to press a single button to read something because I don't have to / don't want to"... even though Guru3D is effectively a translated review website as it is...
Dragam1337:

Haha xD https://cdn.hinative.com/attached_images/103740/c9b7fbd5f13ed3eb8a4fcd3bf6de40330c662b68/large.gif?1494782306 In all seriousness though: while Intel could not match AMD in core count with full-size cores on LGA1700, the design does function well in the most common business-laptop configuration of 2 P-cores and 6 E-cores - because that is clearly what it is, a laptop chip design. But it does NOT work well for games, which is what I care about. Games either see no benefit from e-waste cores or a decline in performance with them active - and yes, that is the case in real-world performance @Astyanax https://tpucdn.com/review/intel-core-i9-12900k-alder-lake-12th-gen/images/cyberpunk-2077-1280-720.png Fact is that the chip is not optimized for gaming, and that it happens to be fast at gaming is a side effect of it being fast in general. IMO Intel ought to make a chip specifically for gaming, like AMD did with the 5800X3D: no e-waste cores and a whole lot of L3 cache, along with 8 of their highest-binned P-cores.
Define "gaming". The moment the first DirectStorage games that are not for old consoles land, anything below an 8-core WITH DirectStorage support will be a stutterfest, and everyone will complain to developers. The hybrid core design makes sense; it's been used in Android for more than a decade. Actually using a system involves more than running a game at 720p, even while gaming.
Dragam1337:

Would you stop referring to Polish sites, which none but you can read.
Why don't you scram.
PrMinisterGR:

Define "gaming". The moment the first DirectStorage games that are not for old consoles land, anything below an 8-core WITH DirectStorage support will be a stutterfest, and everyone will complain to developers. The hybrid core design makes sense; it's been used in Android for more than a decade. Actually using a system involves more than running a game at 720p, even while gaming.
Time will tell regarding DirectStorage - I think it is blown way up by marketing, like so much else throughout history. I think that as long as you have an SSD with fast access times, it will make very little difference in games. The big/little core design makes sense in devices where (low) power consumption is the most important goal - i.e. mobile devices and laptops. It does not make sense in a gaming desktop PC, where you just want as high performance as possible and power consumption only matters to a very small degree. And while there will be background tasks running even while gaming, having e-waste cores enabled will more often than not lower performance, as also seen in the linked benchmark. Again, I'd like to see Intel make a true gaming CPU, like AMD did with the 5800X3D - no e-waste cores, and a heap of L3 cache.
BLEH!:

Be nice to have some legal mandate on TDP - 150 W max at turbo maybe? Have to be able to be cooled by the stock air cooler...
Of course there are such "mandates" - why do you think they still label processors that can reach 5 GHz as having a 3-3.5 GHz base clock? When was "base clock" ever relevant? Never, except in OEM "mandates".
Dragam1337:

Sure, but they are single-thread limited for the most part - only heavy workstation applications make use of that many cores, which the vast majority of consumers don't use.
I get your point, but you have your gaming glasses on. Have you heard of TikTok or YouTube? More users than all the gamers in the world, and guess what - they use multiple cores to compile, transcode, and edit their videos. And Blender is more than a stress test. And this is from someone who is not a fan of E-cores.
Dragam1337:

I'd like to see Intel make a true gaming CPU, like AMD did with the 5800X3D - no e-waste cores, and a heap of L3 cache.
and a €250 price difference for 64 MB of cache
cucaulay malkin:

and a €250 price difference for 64 MB of cache
It shouldn't be any more expensive than the 12900K with those e-waste cores gone, and it would be a hell of a lot faster for games.