Tweaker mods Z170 motherboard and gets Core i3 8350 (Coffee Lake) working

Intel's practices are pathetic. Getting even an i5 onto Z270 or Z370 would have cost considerably more than my B350/1600 combo did, and on top of that they're dead-end platforms.
This doesn't really surprise me. Of course, Intel could have done a huge revamp of the pin order to make it utterly incompatible, but back when Coffee Lake was brand new there were already reports of someone trying this and making it almost through the boot check, so it couldn't be that different. The reason they didn't change the pin layout much is that it would have cost them money. I wouldn't be surprised if employees at Intel HQ had to bring their own toilet paper to work so the company wouldn't lose profit. Intel is a financial corporation doing some technology on the side.
typhon6657:

Maybe you shouldn't have bought that $1K CPU if you're so hard up for cash. A new motherboard is $150-$250, and you spent $1,000 on just a CPU that's in another class entirely. I get Intel, but people like you I have a problem understanding.
He didn't mention anything about being hard up for cash; he just said that Intel was greedy. Even at the $1K price point, AMD offers so much more.
Ugh... this shit is getting ridiculous. It's already been well documented that while the physical socket is the same, the 300-series platform includes additional power pins that were reserved (not used) on the 200 series. Those pins provide power to the extra two cores in Coffee Lake. We already knew the 4-core Coffee Lake models would likely work, and surprise, surprise, that's the model used here. The problem is that you can't get the 6-core CPUs to work with 200-series boards, because they weren't designed for it.

Intel likely decided to just require the new chipset for the new CPUs and be done with it, rather than require different chipsets for different models within the same series, which would have confused consumers and increased complaints from people unable to power up their new computers on Christmas day because they bought the wrong product. No company wants to deal with that shit. What I think they should have done is call it LGA1151r2, like they did when they revised LGA2011, to further reduce confusion, but even that wouldn't have stopped the "Intel is just doing this for greed!" arguments from the internet hate train that we inevitably hear every year.

Seriously, I've been lurking here for nearly a decade now, and every time there's news concerning Intel or Nvidia, it's either blown massively out of proportion, told in a misleading way, or outright fabricated. Any negative rumor is automatically assumed true without evidence. Motivations are ALWAYS assumed to be greed, even when there's a clear engineering constraint at play. But if AMD does the exact same thing next year, all I see in the comments section are excuses. It's like a feud between rival sports teams, where everyone hates the teams that keep winning. I remember all the crazy rumors about Sandy Bridge that made everyone here swear never to buy it. I bet you've all forgotten about that already. Go to the archives and do some reading. When you see just how ridiculous the posts were back then, knowing what you know now, you should realize that five years from now the posts in this thread are going to look just as ridiculous to future readers.

The problem is that nobody does any research; they immediately start dog-piling onto whatever the most popular rhetoric of the moment is and shouting down anyone who says otherwise. Any one of you could have avoided this discussion entirely and found the real reason behind this decision by spending literally one minute on Google. One minute. But instead it's just "Intel is evil, so it must be greed"; no research or evidence needed. This site needs to do something about this. It used to be just the comments, so I could ignore them, but this last year it seems like the articles themselves are specifically designed to egg the commentators on. /AngryRant
I modded my Gigabyte GA-Z270X-Gaming 5 BIOS with Coffee Lake CPU microcode using MMTool, updated the Intel UEFI GOP driver and VBIOS to 1054, and updated the ME firmware to version 11.8. I think doing that should resolve the iGPU problem, but I'm not sure about the PCI-E x16 slot. Unfortunately, I don't have a CFL CPU to test my mod with. Look at the win-raid forums for the resources for these mods.
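If you're attempting the same mod, it's worth sanity-checking the microcode blob before splicing it in with MMTool. Below is a minimal Python sketch (the filename is hypothetical) that reads the documented Intel microcode update header, whose processor-signature field must match your target CPU's CPUID; Coffee Lake-S parts report signatures in the 0x906Exx range:

```python
import struct
import sys

# Intel microcode update header: the first 36 bytes of the blob are nine
# little-endian 32-bit fields, per Intel's documented update format.
HEADER = struct.Struct("<9I")

def describe_microcode(path):
    with open(path, "rb") as f:
        blob = f.read()
    (hdr_ver, revision, date_bcd, cpuid, checksum,
     loader_rev, platform_ids, data_size, total_size) = HEADER.unpack_from(blob)
    date = f"{date_bcd:08x}"  # date is BCD-encoded as mmddyyyy
    print(f"header version   : {hdr_ver}")
    print(f"update revision  : {revision:#x}")
    print(f"date             : {date[:2]}/{date[2:4]}/{date[4:]}")
    print(f"CPUID signature  : {cpuid:#x}")        # must match the target CPU
    print(f"platform ID flags: {platform_ids:#x}")
    # A zero total_size means the legacy default of 2048 bytes; the 32-bit
    # words of the whole update must sum to zero for the checksum to pass.
    total = total_size or 2048
    words = struct.unpack_from(f"<{total // 4}I", blob)
    print("checksum OK" if sum(words) % 2**32 == 0 else "checksum BAD")

if __name__ == "__main__":
    describe_microcode(sys.argv[1])  # e.g. python ucode_check.py cpu906EB_plat02.bin
```

If the printed signature doesn't match what your CPU reports, MMTool will typically still insert the blob; the board will just ignore it at boot, so checking first saves a flash cycle.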
I haven't upgraded my rig since 2012 but when I do it will most likely be AMD. I can't support a company like Intel these days.
HOW would they do that? There are literally no power pins for the extra two cores. Check the pinouts if you don't believe me; it's physically impossible. Until you can explain that away, any arguments you make are moot, and calling me a fanboy doesn't change that. When they get 6-core Coffee Lake CPUs working on 100/200-series boards, then you can claim I'm wrong. Until then, common sense and publicly available documentation (which, once again, can easily be found through Google) clearly demonstrate the truth of the matter. You have no excuse for believing otherwise, unless you want to and have decided to ignore any conflicting evidence. What ASUS probably meant to say was that the 4-core CPUs will work.

Edit: Since you're clearly unwilling to read the data sheets for yourself, someone was kind enough to compile the pinout list from the data sheets into an image: https://twitter.com/david_schor/status/914874843195219974 You can cross-reference it against the data sheets to confirm that it's accurate.

I'm sorry I lost my temper, but these armchair engineers make me so mad when they make assumptions based on nothing. I have a feeling anyone who works in electronics engineering also feels their blood boil when they read this stuff, but most of them are too busy to comment and have learned by now to stay away from comments sections altogether.
NaturalViolence:

Ugh... this crap is getting ridiculous. It's already been well documented that while the physical socket is the same, the 300-series platform includes additional power pins that were reserved (not used) on the 200 series. [...]
Bottom line: Intel could do it, full stop. This is just a money grab and nothing else. They could have gone the route that would cost them some profit but earn them much-needed positive PR, i.e. they could enable it. Why is more s**t thrown at Intel and Nvidia? Gee, I don't know, maybe because they do it more often than AMD. I don't know if you've checked the recent stats on those three companies, but let me inform you: AMD doesn't exactly have the resources and manpower to artificially segment the market.

Here's an example of when everybody took a s**t on AMD: Vega. What's wrong with Vega, you ask? Well, it runs hot, it's inefficient compared to the competition, and it doesn't bring anything to market that we hadn't already seen from the competition. And why is that, you ask? Simple answer: AMD doesn't have the resources to develop multiple architectures, one for consumers (a graphics card pure and simple: process as many polygons as possible, efficiently) and one for business users (a compute card). So AMD instead made one jack-of-all-trades that isn't very efficient, and everybody took one huge dump on them. Whether that was justified is for each person to decide. And if you want other evidence that people don't spare AMD when they f**k up, head to the thread on this forum about how they allowed OEMs to brand cards with 896 SPs (the RX 460) as the RX 560. And by the way, before you decide to brand me as some AMD fanatic: I too think that what they did with the RX 460 -> RX 560 is utter cr*p.

It's simple: both Intel and Nvidia are bigger companies with much more s**t going on. Hell, there was news, I think this year or last, that Intel decided to lay off 10% of their staff, and that number was bigger than the whole of AMD. Think about that for a moment: if that 10% of former Intel personnel formed a new company, it would be bigger than the entirety of AMD (both the CPU and GPU divisions).
I just want to see all of these sites posting this "news" that there's no hardware change keeping Coffee Lake from running on the 200 series start issuing corrections. We've known for months that that's not the case, yet everyone on the internet "knows" the opposite, because unchecked, unsourced news spreads like wildfire on the modern internet, especially if the headline is about an evil, greedy company taking advantage of consumers. I know it won't happen, since I've seen a clear pattern of this increasing in frequency each year, but I'm hopeful things might change one day. By the time a story begins to get discredited, the internet has already moved on to the next tech scandal to get mad about. I want people to start doing research and showing restraint before posting their opinions. I want people to start considering engineering constraints and waiting for more info to be available before jumping to sweeping allegations of corruption and greed. I really don't think that's too much to ask.

Intel does plan ahead. They always do two CPU architectures per chipset; if Coffee Lake were still supported, that would make three this cycle. As for why Intel doesn't plan further than two years out: quite frankly, you can box yourself into a corner engineering-wise, because then you can't make pinout changes during the development cycle. That bit AMD in the ass big time for a few years, when they couldn't change their VRM design without breaking backwards compatibility. Intel chose to do this while keeping the physical socket because that's more efficient: current heatsinks stay compatible, and it's cheaper. They could have changed the socket slightly, like they did going from LGA1150 to LGA1151, just to make them physically incompatible, but then people would claim (as they did when that happened) that they were changing it for no reason just to force people into buying new chipsets. I remember people saying the same thing about Haswell, despite the fact that it moved the VRM on-die, which is impossible to do without a socket change, for what should be fairly obvious reasons.
NaturalViolence:

I just want to see all of these sites posting this "news" that there's no hardware change keeping Coffee Lake from running on the 200 series start issuing corrections. [...]
The problem is the pattern. They moved the VRM on-die, OK, a questionable move (it increases CPU package temperature), but OK. Then the next generation comes along, and guess what happens: the VRM moves back out of the CPU, and new motherboards are needed. People start getting a little pissed off, but OK, move on. Then 14nm arrives (let's not even discuss the short-lived, nearly impossible-to-buy 5xxx series), and look at the Lake series instead: the 6xxx, 7xxx, and 8xxx series are all the same, same architecture, same everything down to the last detail, so why three chipsets? Granted, the 6xxx and 7xxx parts are interchangeable between their two chipsets, and for the 8xxx they just added two more cores. And then you look at what the competition is doing: small, pale, weak AMD, at maybe a tenth of Intel's size (both financially and in staff, and I think they're even smaller than 10% financially), can make a chipset and socket that will support 14nm, 12nm, and 7nm processors.

The thing is, Intel doesn't exist in a vacuum where there's just Intel and nobody else. They have competition, and when you look at what the competition is giving its customers, a lot of Intel's customers feel ripped off. The other thing is that you, just like me and a lot of the guys and girls on this forum, have tunnel vision. We probably all visit the same sites and spend most of our time on some tech forum or site, which makes us more knowledgeable than average about tech. We're not your average Joe; a lot of us work for tech companies or in the tech sector. For the average Joe there is no "scandal" about Intel ripping off its customers; he just uses his PC or phone until it dies or Windows crashes ;) and then goes to the store and buys a new shiny thingy. We're different: we "know", more or less, what's possible and what isn't, from experience working in the field or just from years of following the news and participating in forums. And the common consensus is that Intel is full of s**t on this one. It could have been done, but they won't jeopardize their profit and revenue even over this little thing. That says a lot about them, at least from my perspective.
What's up with the modder's date and time on his BIOS screen? Did you guys notice it's from 2016? Well, I hope someone comes up with an idea. I mean, if the Z270 can run the 8700K, why not mod it? Not sure what the PCI-E slot issue is all about, but that slot is wired to the CPU for sure.
If the difference is only two power pins, it should be pretty easy to feed power from another source or other pins, if I'm not wrong. But that would require hardware modding, not just a firmware change.
NaturalViolence:

HOW would they do that? There are literally no power pins for the extra two cores. Check the pinouts if you don't believe me; it's physically impossible. [...]
The updated socket contains a few extra pins. It's quite easy to put that new socket on a motherboard with an older chipset and run the extra traces to it. It would also have been quite easy for Intel to place those pins on the original socket. Intel develops processors years ahead of launch, so they knew they would need the extra pins down the road, unless you're ignorant enough to believe that Intel completely re-engineered CFL at the last minute just to counter AMD's Ryzen processors... While it is an anti-consumer practice, businesses exist to generate a profit. I don't agree with how Intel chooses to do so at times, but the enthusiast market is small enough that we really don't matter. The majority of Intel's CPU and chipset earnings come from pre-built systems, where new CPUs are generally sold with new motherboards containing new chipsets. Some enthusiasts just need to learn their true place in the market and accept that Intel is looking at a much larger picture, namely the OEM market. If AMD were in Intel's or Nvidia's position, I'd venture to say that AMD would probably do the same things Intel and Nvidia are doing. (At the time of this post, I own 5 Intel processors and 2 AMD processors, as well as 1 Nvidia-based graphics card and 2 AMD-based.)
ruthan:

If the difference is only two power pins, it should be pretty easy to feed power from another source or other pins, if I'm not wrong. But that would require hardware modding, not just a firmware change.
Each pin within the socket has a specific purpose. You can't just pick two pins and decide they're suddenly going to become power-delivery pins. Once the socket and CPU are finalized, there's no changing the pin designations: power pins are power pins, data pins are data pins. The only exception, as we see with CFL, is the pin blanks. Blank pin locations can be assigned whatever purpose is deemed necessary at a later time, for a new product, simply by releasing a revised socket design, which is what Intel has done for CFL and previously with LGA2011. I haven't compared LGA2011 and LGA2011v2, but if my understanding here is correct, the only difference between the two LGA1151 socket revisions is 2 power pins missing from the original design. Intel could easily have added those 2 pins to the original socket design from the start, providing compatibility with CFL, and avoided all the whining, but chose not to. Intel could also have allowed the microcode for the CFL i3s to be added to boards using the 100- and 200-series chipsets, which would have minimized the whining a bit.
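To make the reserved-pin argument concrete, here's a toy Python sketch. The pin labels and roles are invented purely for illustration (not taken from the real LGA1151 pinout): a socket revision that only assigns roles to previously reserved positions never changes a pin an older CPU actually used, which is why older CPUs still drop into the newer boards.

```python
# Toy model of a socket revision. Hypothetical pin maps, NOT real LGA1151 data.
rev1 = {"A1": "VSS", "A2": "VCC", "A3": "DDR0_DQ0", "A4": "RSVD", "A5": "RSVD"}
rev2 = {"A1": "VSS", "A2": "VCC", "A3": "DDR0_DQ0", "A4": "VCC", "A5": "VSS"}

# Which pins changed roles between the two revisions?
changed = {pin: (rev1[pin], rev2[pin]) for pin in rev1 if rev1[pin] != rev2[pin]}

# Backward-safe if every change merely repurposes a reserved (unwired) pin.
backward_safe = all(old == "RSVD" for old, _ in changed.values())

print(changed)        # {'A4': ('RSVD', 'VCC'), 'A5': ('RSVD', 'VSS')}
print(backward_safe)  # True: an older CPU never drove these positions
```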
If it has a few extra pins, why did they call it socket 1151, the same as the previous two generations?
If it's not fake... As if MS doesn't do that, as if AMD doesn't do that, as if NVIDIA doesn't do that? They're all liars and scammers... and it's you who let this continue...
@ruthan @sykozis 2 pins? Jesus Christ, you guys STILL didn't read the data sheets or look at the pinout diagram? Even after I did the searching for you? I'm done with this conversation. If you're too lazy to take one minute to look at the data before posting, then there's nothing I can do to further this conversation.

@Guru01 Because the physical socket is still the same. Most modern sockets have reserved pins that aren't actually wired to the motherboard; that includes AMD's current socket, AM4, by the way. Some of the pins that weren't wired on the 100/200 series are now wired on the 300 series. They could have wired all of them, but that would add unnecessary cost and size to the board design (a burden on the board manufacturers, not Intel), since you'd be adding traces that the CPUs don't actually use. Intel's engineers must have decided to compromise on this, which is why they have the two-year cycle. It's a balancing act between inefficient design and backwards compatibility, and they chose a particular spot on that scale for their business. People can criticize that choice, I have no problem with that, but they can't assume that only greed played a role in the decision.

@kruno Yes, an on-die VRM raises temperatures a little, but it let them dramatically cut motherboard cost and size and improve power efficiency (this is why we saw an explosion of mini-ITX boards that generation), all of which were clearly design goals for the project. This is another example of what I'm talking about: when that controversy came up, everyone went straight to accusations that they only did it to "force" a socket change so people would buy more chipsets. No one stopped to consider any legitimate reason an engineer might want to make that change. People don't see these companies as being run by real people, or consider that there might be more complex reasons behind decisions made by engineers with 8+ years of college and 20+ years of industry experience, that they might know something about electronics engineering that the commenters don't. It boggles my mind that this thought never seems to cross anyone's mind online. "I don't know" is always a better answer than making assumptions without evidence, yet I only ever see the latter in the comments section here, and that disturbs me.

@NAMEk Yes, because the data sheets provided to board manufacturers are definitely fake. That's why we don't have working 300-series motherboards... oh wait. Seriously? You see something positive about Intel and immediately assume it must be fake, based on nothing?
Well, I could flash my Z87 and use Broadwell on it via an Intel ME firmware flash, so no doubt this is possible too, but at your own risk. Intel released that firmware to manufacturers back then; now it's all gone quiet. There are "custom" firmware tools at station-drivers, but they're more complicated to use; in my case it was a simple flashing tool.
NaturalViolence:

@NAMEk Yes, because the data sheets provided to board manufacturers are definitely fake. That's why we don't have working 300-series motherboards... oh wait. Seriously? You see something positive about Intel and immediately assume it must be fake, based on nothing?
Dude, read my post again; what are you talking about? I didn't say anything like that. I don't know if it's fake or not; I see you sure do. Do I see something positive? I see only negative. You've read my post as the total opposite of what it says... Jeez...
NAMEk:

If it's not fake... As if MS doesn't do that, as if AMD doesn't do that, as if NVIDIA doesn't do that? They're all liars and scammers... and it's you who let this continue...
btw http://www.guru3d.com/news-story/radeon-rx-560-amd-silently-changes-gpu-specification.html