Alleged Intel Alder Lake CPU photo surfaces online

An interesting approach. Hope this plays out well for Intel, maybe even with some Xe cores too.
The OS needs to learn how to choose the right cores.
I seriously doubt that Intel can make PCIe 5 work already. They could not even make PCIe Gen 4 work for the 10th Gen CPUs. The extra pin count is needed for DDR5's high transfer rate; they will probably need at least 350-400 pins per DIMM.
They will need more space for mixing so many things and different-architecture chiplets, including those 8 bigger, hotter cores. There will be a tremendous change in everything, and I'm afraid I don't want to be a beta tester of new DDR5, PCIe 5, big.LITTLE, etc. I'll try to go for the last DDR4 architecture with Zen 3, and then second-generation DDR5 on mobile with a nice drawing screen or whatever, maybe when they have the right prices... 🙁
NightWind:


I fail to understand your point. This is the first time I've heard of this high-efficiency + high-performance symbiosis on PC. We have it on phones and it's great, if it works correctly. As stated before, the OS needs to learn when to use what, and for which applications. The software side is still so far behind the hardware; I'm certain Linux will master this much faster than Windows ever will. And we need software to improve to see any benefits from this. Great to see Intel trying to push new things, the next couple of years will be interesting for sure!
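On the scheduler point: until the OS is fully hybrid-aware, an application can at least pin itself to a chosen set of cores by hand. A minimal Linux-only sketch in Python, assuming (hypothetically) that cores 0-7 are the performance cores; the actual big/little layout varies by CPU and is not detected here:

```python
import os

# Hypothetical layout: cores 0-7 are the "big" performance cores.
# This set is an assumption for illustration, not a detected fact.
PERFORMANCE_CORES = {0, 1, 2, 3, 4, 5, 6, 7}

# Cores this process is currently allowed to run on.
available = os.sched_getaffinity(0)

# Prefer the big cores; fall back to whatever is available if none match.
target = (PERFORMANCE_CORES & available) or available

# Restrict this process (pid 0 = self) to the chosen cores.
os.sched_setaffinity(0, target)

print(sorted(os.sched_getaffinity(0)))
```

A hybrid-aware scheduler would do this automatically per workload; manual pinning like this is only a stopgap for latency-sensitive processes.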
A metaphor for more heat.
I just realized: I know that a large chunk of these pins are for voltage and ground. The thing is, why not just make those a big plate instead of pins? I imagine that would make things simpler, cheaper, and more reliable. It would also be pretty much impossible to mistake the direction the CPU should be installed.
karma777police:

PCIe 4, as I said, turned out to be really f. useless, so useless that the 3080 performs the same under both PCIe 3 and PCIe 4. Intel made a smart decision almost skipping PCIe 4 altogether and going for PCIe 5 and DDR5
The idea that PCIe 4 is only for GPUs is a really dumb take, even for you.
Bo Alenkaer:

I seriously doubt that Intel can make PCIe 5 work already. They could not even make PCIe Gen 4 work for the 10th Gen CPUs. The extra pin count is needed for DDR5's high transfer rate; they will probably need at least 350-400 pins per DIMM.
Most of the extra pins are for additional power/ground capacity. Only a small portion is dedicated to the I/O signal overhead.
I wish Intel would use logical names for their CPUs, like Core 1st gen, Core 2nd gen, Core 3rd gen and so on, instead of Rocket Lake, Magical Lake, Fairy Lake... I don't know wtf lake this is anymore.
So they are finally preparing to move to a smaller fab process, but the CPU is getting bigger. Kind of ironic.
karma777police:

PCIe 4, as I said, turned out to be really f. useless, so useless that the 3080 performs the same under both PCIe 3 and PCIe 4. Intel made a smart decision almost skipping PCIe 4 altogether and going for PCIe 5 and DDR5, and the irony is they will jump to it before AMD does.
The sad thing about this statement is that Intel skipped it, yet their mobos still seem to carry a premium in the majority of cases, whereas with AMD you have the option to take it or leave it. As for this big.LITTLE design, I really want to see how it is going to be leveraged. I can see mobile benefiting from it, but I fail to see a real use for the vast majority of desktops. It just seems like it will fill some kind of niche.
icedman:

The sad thing about this statement is that Intel skipped it, yet their mobos still seem to carry a premium in the majority of cases, whereas with AMD you have the option to take it or leave it. As for this big.LITTLE design, I really want to see how it is going to be leveraged. I can see mobile benefiting from it, but I fail to see a real use for the vast majority of desktops. It just seems like it will fill some kind of niche.
I agree, it's annoying. I've stuck with Intel so far, mostly for gaming performance and familiarity, but it's things like this that keep stacking up against them. When single-threaded performance is worse, multi-threaded performance is worse, and the cost is higher, it becomes really hard to stomach sticking with them.
A different size probably means a new heatsink. So if you are on 10th gen, you will have to change everything just a year later. Or stay on AMD.
karma777police:

PCIe 4, as I said, turned out to be really f. useless, so useless that the 3080 performs the same under both PCIe 3 and PCIe 4. Intel made a smart decision almost skipping PCIe 4 altogether and going for PCIe 5 and DDR5, and the irony is they will jump to it before AMD does.
PCIe 4 is not just for GPUs, but for any component you can connect to PCIe slots, even the new NVMe generations that reach speeds of 5000 MB/s.
Syranetic:

I agree, it's annoying. I've stuck with Intel so far, mostly for gaming performance and familiarity, but it's things like this that keep stacking up against them. When single-threaded performance is worse, multi-threaded performance is worse, and the cost is higher, it becomes really hard to stomach sticking with them.
Why would you stick with them? They aren't your local football team or anything like that. I've got both Intel and AMD systems, depending on the use case or what's lying around. It's a sad fact that my Xeon 1680 v2 is equal to any first-gen Ryzen CPU in single-core performance. It took Ryzen 3000 to catch Intel in single core, and now Ryzen 5000 is going to take the lead. It's just hardware; having a tribal attitude towards it won't benefit you or your use case.
Kaerar:

Why would you stick with them? They aren't your local football team or anything like that. I've got both Intel and AMD systems, depending on the use case or what's lying around. It's a sad fact that my Xeon 1680 v2 is equal to any first-gen Ryzen CPU in single-core performance. It took Ryzen 3000 to catch Intel in single core, and now Ryzen 5000 is going to take the lead. It's just hardware; having a tribal attitude towards it won't benefit you or your use case.
Because I said "when". I will still take my 10700K over any other currently available AMD chip for the primary workload of this system: gaming.
I had to stay with Intel because of the price. In some countries the 3300X is impossible to find or way overpriced; similar for the 3600. You can also buy cheaper, better-featured motherboards (sadly crippled by Intel, but still). Everyone wants AMD now, and the prices reflect that.
Oh yay, just what we need: another socket. Intel really needs to learn a lesson here from AMD.