Intel Shows 28-core processor die-shot

https://forums.guru3d.com/data/avatars/m/237/237771.jpg
You first say that changing sockets is not needed and use AMD as a point of proof, then only one post later admit that not changing the socket for Bulldozer held it back. Then you sit here and say things like "I can just use add-on cards for all those things that new boards have." News flash: an NVMe SSD on an AM3 990FX board through a PCIe adapter will probably be no faster than a SATA3 drive. You posted a wall of text for really no reason. The original point I was making here is that, in the grand scheme of things, socket changes affect no one. BTW, you are basing this on ONE "unnecessary" socket change, 1150-1155.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
You first say that changing sockets is not needed and use AMD as a point of proof, then only one post later admit that not changing the socket for Bulldozer held it back. Then you sit here and say things like "I can just use add-on cards for all those things that new boards have." News flash: an NVMe SSD on an AM3 990FX board through a PCIe adapter will probably be no faster than a SATA3 drive. You posted a wall of text for really no reason.
Here you go again, jumping to conclusions and failing to understand simple concepts. Using the same socket and northbridge on a completely different architecture is bound to hold back performance. It wasn't "needed" for AMD to change sockets, though maybe they should have. This is not by any means the same scenario Intel was in. Ivy Bridge alone exists on FIVE different CPU sockets (4 if you don't want to get nitpicky); 8 if you include BGA. 3 of those sockets were for consumer-level motherboards, and at least 2 of them targeted the same demographic. So seriously, think about it - you are insisting that there must be good reasons to have at least 3 CPU sockets for the same architecture and same type of PC. How do you not find this questionable? Anyway, NVMe SSDs are largely based on PCIe, so yes, it will in fact be faster than SATA3. Let's not pull numbers out of nowhere, ok? At least my rants actually have substance and reason.
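The PCIe-vs-SATA claim above can be sanity-checked with theoretical numbers. A minimal sketch, assuming SATA III's 6.0 Gb/s line rate with 8b/10b encoding, and a PCIe 2.0 x4 link (the likely best case for an adapter card in a 990FX board's secondary slots; the actual slot width isn't specified in the thread):

```python
# Theoretical usable bandwidth in MB/s. Link widths and encoding
# overheads are assumptions based on typical hardware of the era.

def sata3_bandwidth_mb_s():
    line_rate_gbit_s = 6.0   # SATA III line rate
    efficiency = 8 / 10      # 8b/10b encoding overhead
    return line_rate_gbit_s * efficiency * 1000 / 8

def pcie2_bandwidth_mb_s(lanes):
    per_lane_gt_s = 5.0      # PCIe 2.0 transfer rate per lane
    efficiency = 8 / 10      # 8b/10b encoding overhead
    return per_lane_gt_s * efficiency * 1000 / 8 * lanes

print(sata3_bandwidth_mb_s())    # ~600 MB/s, where SATA III tops out
print(pcie2_bandwidth_mb_s(4))   # ~2000 MB/s for an x4 adapter
```

So even an older PCIe 2.0 x4 link has roughly 3x the ceiling of SATA III, which is the basis of the "it will in fact be faster" point; whether a given drive saturates that is another matter.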
The original point I was making here is that, in the grand scheme of things, socket changes affect no one.
That applies strictly to people who have the money to always get the latest and greatest. Even Intel and MB manufacturers have to address the costs of changing sockets, but like I said, that's worth it for them in the end if it ensures more sales. Most people on a budget would rather re-use a perfectly capable motherboard.
BTW, you are basing this on ONE "unnecessary" socket change, 1150-1155.
I explicitly stated 1150, 1155, AND 2011. Please keep up.
https://forums.guru3d.com/data/avatars/m/237/237771.jpg
Here you go again, jumping to conclusions and failing to understand simple concepts. Using the same socket and northbridge on a completely different architecture is bound to hold back performance. It wasn't "needed" for AMD to change sockets, though maybe they should have.
Make up your mind!
This is not by any means the same scenario Intel was in. Ivy Bridge alone exists on FIVE different CPU sockets (4 if you don't want to get nitpicky); 8 if you include BGA. 3 of those sockets were for consumer-level motherboards, and at least 2 of them targeted the same demographic. So seriously, think about it - you are insisting that there must be good reasons to have at least 3 CPU sockets for the same architecture and same type of PC. How do you not find this questionable?
5? Care to elaborate?
Anyway, NVMe SSDs are largely based on PCIe, so yes, it will in fact be faster than SATA3. Let's not pull numbers out of nowhere, ok? At least my rants actually have substance and reason.
Faster, but not near its theoretical performance. BTW, I'm referring to SATA gen 3, not SATA 3Gb/s.
That applies strictly to people who have the money to always get the latest and greatest. Even Intel and MB manufacturers have to address the costs of changing sockets, but like I said, that's worth it for them in the end if it ensures more sales. Most people on a budget would rather re-use a perfectly capable motherboard.
You have spent enough on CPU upgrades for that 890FX board over the past 6 years that you could have bought a Z77 and a 3570K instead of that Bulldozer, and would not NEED an upgrade right now. Hell, you could have bought a P67 and a 2600K instead of your AM3 system and maybe saved money at this point.
I explicitly stated 1150, 1155, AND 2011. Please keep up.
1150 and 1155 are dual-channel DDR3. 2011 is quad-channel DDR3 with no iGPU and supports up to an 8-core CPU. 2011-v3 is quad-channel DDR4. Try and keep up. FTR, 1156 was tri-channel DDR3, so again only ONE "unnecessary" socket change.
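The channel counts being argued over translate directly into peak memory bandwidth. A rough sketch of the arithmetic (the DIMM speeds chosen are typical values for each platform, not figures from the thread; each channel is a 64-bit, i.e. 8-byte, bus):

```python
def mem_bandwidth_gb_s(transfer_mt_s, channels, bus_bytes=8):
    # Peak theoretical bandwidth: transfer rate x bus width x channel count
    return transfer_mt_s * bus_bytes * channels / 1000

print(mem_bandwidth_gb_s(1600, 2))   # 25.6  - dual-channel DDR3-1600 (1150/1155)
print(mem_bandwidth_gb_s(1866, 4))   # 59.712 - quad-channel DDR3-1866 (2011)
print(mem_bandwidth_gb_s(2133, 4))   # 68.256 - quad-channel DDR4-2133 (2011-v3)
```

The roughly 2x-plus gap between the dual-channel and quad-channel platforms is the kind of difference that arguably justifies a separate enthusiast socket, which is the point being made here.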
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Make up your mind!
My phrasing and mindset are consistent. You are the one who chooses to see things in a different way.
5? Care to elaborate?
Ivy Bridge is compatible with the following sockets:
LGA 1155
LGA 2011
LGA 2011-1 (this is the one that I find nitpicky)
EDIT: LGA 2011-3, for some motherboards
LGA 1356
Socket G2
And these are the BGA models, which I have excused for argument's sake:
BGA-1023
BGA-1224
BGA-1284
Faster but not near its theoretical performance
If you're on enough of a budget where you need to stick with a mobo that doesn't have M.2, you're going to care more about actual real-world performance than theoretical. And at that, in most cases, there is hardly a perceivable difference between M.2 and SATAIII: https://techreport.com/review/29221/samsung-950-pro-512gb-ssd-reviewed/4
You have spent enough on CPU upgrades for that 890FX board over the past 6 years that you could have bought a Z77 and a 3570K instead of that Bulldozer, and would not NEED an upgrade right now.
I have spent roughly $220 on CPUs for this motherboard. As of right now, my CPU handles every game I have played at 60FPS @ 1080p, except Starcraft II under intense battles. So, not only were both of my CPUs combined roughly the same cost as a new 3570k, but I'd have also had to buy a new motherboard as well. In other words, I'd be spending more for no perceivable user experience. I couldn't care less about synthetic benchmarks or theoretical performance.
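The cost argument above is easy to put into numbers. A back-of-the-envelope sketch; the $220 figure is stated in the post, but the Intel prices are assumptions based on rough 2012-era street prices, not figures from the thread:

```python
# Platform-cost comparison. Only am3_cpu_spend comes from the post;
# the i5-3570K and Z77 board prices are illustrative assumptions.
am3_cpu_spend = 220   # stated: ~$220 total on CPUs for the existing 890FX board
i5_3570k = 230        # assumed launch-era price of an i5-3570K
z77_board = 130       # assumed price of a mid-range Z77 motherboard

switch_cost = i5_3570k + z77_board
extra = switch_cost - am3_cpu_spend
print(extra)  # additional outlay for switching platforms under these assumptions
```

Under these assumed prices, switching platforms would have cost on the order of $140 more than re-using the board, which is the "spending more for no perceivable user experience" point being made.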
1150 and 1155 are dual-channel DDR3. 2011 is quad-channel DDR3 with no iGPU and supports up to an 8-core CPU. 2011-v3 is quad-channel DDR4. Try and keep up.
I apparently wasn't aware the 115# sockets were so crippled compared to 2011. I thought they could handle more than quad cores, so yes, maybe socket 2011 does have a distinguished reason to exist (my confusion probably came from i7s in both sockets). I actually explicitly didn't mention 2011v3 because I knew there were several changes where they weren't very comparable, but the funny thing is some motherboards are backward and forward compatible. Anyway, you still haven't addressed the necessity of 1150 vs 1155. And more importantly, there is still no good explanation as to why, for example, socket 1151 CPUs (to my knowledge) aren't backward compatible with 1150 mobos. Many (if not all) 1151 CPUs are DDR3 and DDR4 compatible. There may be other combinations but I don't feel like researching all of them. Again - I don't care that much about there being new sockets, but the intentional breakage of backward and/or forward compatibility is my gripe.
https://forums.guru3d.com/data/avatars/m/237/237771.jpg
Ivy Bridge is compatible with the following sockets:
LGA 1155
LGA 2011
LGA 2011-1 (this is the one that I find nitpicky)
EDIT: LGA 2011-3, for some motherboards
LGA 1356
Socket G2
And these are the BGA models, which I have excused for argument's sake:
BGA-1023
BGA-1224
BGA-1284
2011 and 2011-1 are the same socket. AFAIK 2011-v3 does not support IB at all (because DDR4), and LGA 1356 is xenon, are you serious? Seriously, 2 sockets: one for consumer and one for enthusiast.
Anyway, you still haven't addressed the necessity of 1150 vs 1155. And more importantly, there is still no good explanation as to why, for example, socket 1151 CPUs (to my knowledge) aren't backward compatible with 1150 mobos. Many (if not all) 1151 CPUs are DDR3 and DDR4 compatible. There may be other combinations but I don't feel like researching all of them. Again - I don't care that much about there being new sockets, but the intentional breakage of backward and/or forward compatibility is my gripe.
Really?!?! Intel has officially listed support for the DDR3L standard, not DDR3 (this is notebook-only).
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
2011 and 2011-1 are the same socket. AFAIK 2011-v3 does not support IB at all (because DDR4), and LGA 1356 is xenon, are you serious?
2011-1 may be the same socket but apparently (as Intel would like us to believe) they aren't functionally the same, as can be said about 2011 v3. Like I said, that one was nitpicky, so I wasn't counting that. As for whether IB is compatible with 2011 v3, like I said, that depends on the motherboard. According to Intel themselves, it is both backward and forward compatible, but it is up to the motherboard devs to allow for that. It's no different than AM3 boards using AM3+ CPUs - some may be fully equipped to do so, but the mobo manufacturer may deny that. That being said, IB isn't officially supported on 2011 v3. As for seriousness... there is no "Xenon", it's Xeon. And socket 2011 was primarily focused around Xeons with a handful of desktop parts, so what's your point?
Really?!?! Intel has officially listed support for the DDR3L standard, not DDR3 (this is notebook-only).
*sigh* seriously? I gave you the benefit of the doubt over my own nit-pickiness by excluding stuff like 2011-1, and yet you're getting anal over DDR3L? Get a grip.
https://forums.guru3d.com/data/avatars/m/237/237771.jpg
2011-1 may be the same socket but apparently (as Intel would like us to believe) they aren't functionally the same, as can be said about 2011 v3. Like I said, that one was nitpicky, so I wasn't counting that. As for whether IB is compatible with 2011 v3, like I said, that depends on the motherboard. According to Intel themselves, it is both backward and forward compatible, but it is up to the motherboard devs to allow for that. It's no different than AM3 boards using AM3+ CPUs - some may be fully equipped to do so, but the mobo manufacturer may deny that. That being said, IB isn't officially supported on 2011 v3. As for seriousness... there is no "Xenon", it's Xeon. And socket 2011 was primarily focused around Xeons with a handful of desktop parts, so what's your point? *sigh* seriously? I gave you the benefit of the doubt over my own nit-pickiness by excluding stuff like 2011-1, and yet you're getting anal over DDR3L? Get a grip.
THERE IS NO 2011-1, it's only 2011. Short and long: 2011-1 IS 2011. DDR3L is low-voltage SODIMM RAM for laptops and AIO computers. BTW, LGA 1356 is for dual-socket.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
THERE IS NO 2011-1, it's only 2011. Short and long: 2011-1 IS 2011.
Uh oh, somebody missed nap time. From Wikipedia: "LGA 2011-1 (Socket R2), an updated generation of the socket and the successor of LGA 1567, is used for Ivy Bridge-EX (Xeon E7 v2)[6] and Haswell-EX (Xeon E7 v3) CPUs, which were released in February 2014 and May 2015, respectively." So, mind explaining how v3 is worthy of being distinguished but this isn't? I don't see how the differences between v3 and -1 are effectively any different from those between -1 and plain 2011. Both of them are simply upgrades allowing for different CPUs to run, but they're physically the same socket.
DDR3L is low-voltage SODIMM RAM for laptops and AIO computers.
Again, you're being nitpicky and it's not contributing anything to your argument. But anyway, DDR3L does in fact exist on desktops. I have built micro ATX PCs that use it. Example: https://www.newegg.com/Product/Product.aspx?Item=9SIA0FU42J4496
https://forums.guru3d.com/data/avatars/m/237/237771.jpg
Uh oh, somebody missed nap time. From Wikipedia: "LGA 2011-1 (Socket R2), an updated generation of the socket and the successor of LGA 1567, is used for Ivy Bridge-EX (Xeon E7 v2)[6] and Haswell-EX (Xeon E7 v3) CPUs, which were released in February 2014 and May 2015, respectively." So, mind explaining how v3 is worthy of being distinguished but this isn't? I don't see how the differences between v3 and -1 are effectively any different from those between -1 and plain 2011. Both of them are simply upgrades allowing for different CPUs to run, but they're physically the same socket. Again, you're being nitpicky and it's not contributing anything to your argument. But anyway, DDR3L does in fact exist on desktops. I have built micro ATX PCs that use it. Example: https://www.newegg.com/Product/Product.aspx?Item=9SIA0FU42J4496
You claimed 2011-1 is desktop, so you aren't being nit-picky, you are being ignorant. It's again the dual socket. AIO was meant to cover SFF; I was not aware they were making it in full 240-pin, however. This is niche, though, and will be phased out when DDR4 becomes more the standard. But don't think you can use DDR3 in a Skylake/Kaby Lake setup, because you will fry the memory controller. http://www.tomshardware.com/news/skylake-memory-support,30185.html
Allow me to revisit this:
Ivy Bridge is compatible with the following sockets:
LGA 1155 - desktop consumer
LGA 2011 - enthusiast consumer
LGA 2011-1 (this is the one that I find nitpicky) - Xeon. Does not exist according to Intel
EDIT: LGA 2011-3, for some motherboards - nope
LGA 1356 - Xeon, Sandy Bridge-EP
Socket G2 - mobile
And these are the BGA models, which I have excused for argument's sake: BGA-1023, BGA-1224, BGA-1284
One last note on 2011-1 (R2): ark.intel.com is not showing anything on socket R2. I'm finding tons of socket R (Sandy and Ivy Bridge based Xeons) and R3 (Haswell and Broadwell based Xeons). The only info I've seen is that Wikipedia excerpt.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
You claimed 2011-1 is desktop, so you aren't being nit-picky, you are being ignorant. It's again the dual socket.
I said no such thing, and never implied it.
LGA 2011 - enthusiast consumer
2011 also has Xeons. There are also Xeons for sockets 1150, 1155, and other "consumer" sockets. As you should be able to see, this gets confusing real fast. Note I have 2 separate arguments, which you seem to be mixing together:
1. Ivy Bridge exists on several sockets despite there not being backward compatibility among most of them.
2. Intel makes various sockets (such as 1150 and 1155) despite there not being any noticeable good reason to do so.
https://forums.guru3d.com/data/avatars/m/237/237771.jpg
I said no such thing, and never implied it. 2011 also has Xeons. There are also Xeons for sockets 1150, 1155, and other "consumer" sockets. As you should be able to see, this gets confusing real fast.
Yes, Xeons can be used in consumer-level boards, but why? They are locked down a bit, so overclocking is a pain. They are typically more expensive than, or priced the same as, consumer-level CPUs, and the only thing they really have over a consumer CPU is support for error-correcting RAM.
Note I have 2 separate arguments, which you seem to be mixing together:
1. Ivy Bridge exists on several sockets despite there not being backward compatibility among most of them.
2. Intel makes various sockets (such as 1150 and 1155) despite there not being any noticeable good reason to do so.
You are assuming there is no reason to do so, but there very well may be a reason. This is the difference I have found in a quick search.
Socket 1155: http://www.extremetech.com/wp-content/uploads/2012/04/Z77-blockdiagram.jpg
Socket 1150: http://images.anandtech.com/doci/6989/Intel%20Z87%20Block%20Diagram.png
The difference is in the iGPU and offloading the display outputs from the chipset on the newer socket. Also, the newer chipset has audio on it instead of it coming directly from the CPU.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
You are assuming there is no reason to do so, but there very well may be a reason. This is the difference I have found in a quick search.
Can you seriously lay off the drugs and stop claiming things that were never said or implied? I never said anything about whether Intel should or shouldn't use Xeons in sockets 115#. I couldn't care less if someone chooses to do so. Honestly, I'm thankful Intel gives us the option, regardless of whether it's a better choice or not. But this has nothing to do with what we're discussing.
https://forums.guru3d.com/data/avatars/m/237/237771.jpg
You are assuming there is no reason to do so, but there very well may be a reason. This is the difference I have found in a quick search.
Can you seriously lay off the drugs and stop claiming things that were never said or implied? I never said anything about whether Intel should or shouldn't use Xeons in sockets 115#. I couldn't care less if someone chooses to do so. Honestly, I'm thankful Intel gives us the option, regardless of whether it's a better choice or not. But this has nothing to do with what we're discussing.
Wait, what does this have to do with what you quoted? Who is on drugs? Are you purposely ignoring the answer I gave to your second question just so you can continue to believe you have some moral superiority? Your first question is more of a straw man, so I will just leave it at that. I edited my previous post to help you out.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Are you purposely ignoring the answer I gave to your second question just so you can continue to believe you have some moral superiority?
Moral superiority? Not sure how morals have anything to do with this... The IGP and audio are completely unacceptable reasons to prevent backward compatibility. If backward compatibility were allowed, people would understand any lost features are a result of using the "wrong" socket (for example: using an AM3 CPU in an AM2 mobo and not having DDR3 access). To my knowledge, most motherboards (regardless of socket) came with their own audio chipset regardless of what the CPU offered. The CPU's audio is likely for things like DP or HDMI.
Your first question is more of a straw man, so I will just leave it at that.
Actually it directly relates to what I was talking about the entire time, but it's no surprise your series of inaccurate presumptions and claims would lead you to think this way. EDIT: BTW, real suspicious timing of your post edits. I'm sure I'll be seeing more of that. But, that's what I get for over-cropping quotes.
https://forums.guru3d.com/data/avatars/m/99/99142.jpg
schmidtbag, lay off the substance you're taking. It makes you come across a little bit confused.
https://forums.guru3d.com/data/avatars/m/237/237771.jpg
Moral superiority? Not sure how morals have anything to do with this... The IGP and audio are completely unacceptable reasons to prevent backward compatibility. If backward compatibility were allowed, people would understand any lost features are a result of using the "wrong" socket (for example: using an AM3 CPU in an AM2 mobo and not having DDR3 access). To my knowledge, most motherboards (regardless of socket) came with their own audio chipset regardless of what the CPU offered. The CPU's audio is likely for things like DP or HDMI. Actually it directly relates to what I was talking about the entire time, but it's no surprise your series of inaccurate presumptions and claims would lead you to think this way. EDIT: BTW, real suspicious timing of your post edits. I'm sure I'll be seeing more of that.
So let's get this straight: you started off saying Ivy Bridge is supported by 5 sockets; turns out it's only 3, and one is mobile. You asked for a reason for socket changes. Some were given, with proof, and now you start injecting your opinion into things. You keep using AMD as a point of reference, and everyone can see they screwed up with AM3+ (this is my opinion). The iGPU is used by more people than dGPUs, so if Intel found that handling it differently on Haswell and onward made a difference in power consumption, then go for it, I say. Remember there was a marked improvement from Ivy Bridge to Haswell in both iGPU performance (like 50% improvement in some instances) and power consumption. http://www.guru3d.com/index.php?ct=articles&action=file&id=4057&admin=0a8fcaad6b03da6a6895d1ada2e171002a287bc1 Edit to add proof/context to iGPU performance claim.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
So let's get this straight: you started off saying Ivy Bridge is supported by 5 sockets; turns out it's only 3, and one is mobile.
Actually, I specifically clarified that I was only referring to 3 of the 5 sockets. But, you were too busy ranting about too much text. This once again is your issue.
You asked for a reason for socket changes. Some were given, with proof, and now you start injecting your opinion into things.
I also specifically stated (directly quoting from one of my posts) "...I am not aware of any differences between them that warrant an entirely new backward-incompatible socket". A couple of tweaks to a GPU and an audio chipset nobody uses do NOT warrant a backward-incompatible socket. Again, I don't care if it's a new socket - that in and of itself is not the problem. Regardless - from the very beginning of this discussion, I made it very clear that my statements were opinionated. Yet here you are, acting like this is some revelation and weakness in my argument. Even though you're aware of this, you can't seem to just let it go.
You keep using AMD as a point of reference, and everyone can see they screwed up with AM3+ (this is my opinion).
I accept the idea that AM3+ was a screw-up. Convenient for consumers, but a screw-up. But I was focusing more on AM2 vs AM3. The architectures between those sockets didn't vary that much and are more comparable to what Intel could be doing.
The iGPU is used by more people than dGPUs, so if Intel found that handling it differently on Haswell and onward made a difference in power consumption, then go for it, I say. Remember there was a marked improvement from Ivy Bridge to Haswell in both iGPU performance (like 90% improvement in some instances) and power consumption.
Are you looking at desktop PCs or all PCs? Because laptops and servers would really skew those results. Laptops, as you have pointed out, use a different socket too. When just looking at desktops (where sockets 1150 and 1155 are used), the percentage of IGP users would drop considerably. Regardless, I do understand there are still likely tens of thousands of desktop IGP users out there. However, the differences between the chipsets should not prevent people from using the IGP. Maybe a feature or two, but the IGP as a whole ought to still be usable. If Intel made a different socket because the GPU architecture became a complication, I would accept that. But that's not what happened, because Ivy Bridge used the same GPU architecture on both sockets. EDIT: GPU performance doesn't say much about whether a socket should change. The IGPs for FM2 range from uselessly slow to modestly good, but it's all 1 socket.
https://forums.guru3d.com/data/avatars/m/237/237771.jpg
Okay, so I got bored and decided to look into the Xeon processors a little deeper. This is what I have found.
The easily-known sockets:
1155
2011
G2 (mobile)
Odd sockets that are purpose-built:
1356: This one has a grand total of 29 supported CPUs across the Sandy Bridge-EN and Ivy Bridge-EN platforms. It has single- and dual-socket support and tri-channel DDR3 (which I find odd).
2011-1: It is only for Ivy Bridge-EX Xeon E7 28xx, 48xx and 88xx processors; the lowest-priced one is just south of $2000, and this socket is only available in dual-, quad- and octo-processor configurations.
The two odd ones should not be acknowledged in this discussion, as they are not consumer-level products.
I accept the idea that AM3+ was a screw-up. Convenient for consumers, but a screw-up. But I was focusing more on AM2 vs AM3. The architectures between those sockets didn't vary that much and are more comparable to what Intel could be doing. Are you looking at desktop PCs or all PCs? Because laptops and servers would really skew those results. Laptops, as you have pointed out, use a different socket too. When just looking at desktops (where sockets 1150 and 1155 are used), the percentage of IGP users would drop considerably. Regardless, I do understand there are still likely tens of thousands of desktop IGP users out there. However, the differences between the chipsets should not prevent people from using the IGP. Maybe a feature or two, but the IGP as a whole ought to still be usable. If Intel made a different socket because the GPU architecture became a complication, I would accept that. But that's not what happened, because Ivy Bridge used the same GPU architecture on both sockets. EDIT: GPU performance doesn't say much about whether a socket should change. The IGPs for FM2 range from uselessly slow to modestly good, but it's all 1 socket.
Steam lists Intel as the most-used GPU. Most consumer Intel desktops sold in big-box stores are iGPU-only. Ivy Bridge with an iGPU only came on 1155; 2011 had no iGPU, and no Xeons have an iGPU. I believe the iGPU architecture changed on Haswell. Also, the voltage regulator was moved on-die for Haswell; that would be another reason for a socket change and no BC. AMD's APU GPU is based on GCN and is very scalable, plus it makes them very good all-around systems, especially the newer Piledriver-based A10s.
https://forums.guru3d.com/data/avatars/m/238/238382.jpg
Intel sockets are confusing *__*
https://forums.guru3d.com/data/avatars/m/237/237771.jpg
Intel sockets are confusing *__*
LOL. They only really get confusing when you get to Xeon. Normally you only have a consumer and an enthusiast socket, and that's it.