Review: Intel Core i5 8600K CPU

I'm just amazed that my i7 4790K is still doing well in single-threaded performance considering how old this chip is, which means it's still great for emulation.
Ziggymac:

Well, with 6 real cores, the argument for spending extra on the i7 for gaming has been massively reduced with the 8600K. In quite a few games there was a real, tangible difference in framerates between the 7600K & 7700K, but that's no longer the case with the 8600K & 8700K. If all you're after is a CPU for gaming, then the $100 saved and put towards a better GPU is by far the more sensible option this time.
You have a valid point, but if you are doing stuff other than gaming, like video rendering or even live streaming, the extra threads will help. Plus, having 2 extra cores and threads over the previous generation i5 is a huge bonus as well.
airbud7:

It will be a long time before game developers make 6 cores the minimum requirement... They would shoot themselves in the foot if they did that....
Yes, they would not make 6 cores the minimum any time soon, although I wasn't talking about minimum requirements. I think it is safe to say that when you aim for an 8700 or 8600 and are targeting 144 Hz (for 60 Hz the extra cost is pointless if gaming is all you have in mind; otherwise you would buy the CPU with the most raw compute power your wallet can handle), then these CPUs become a big factor, and most likely we are talking about a GTX 1080 or Ti, maybe a 1070. Below that, on a 580/1060, pretty much any Ryzen or Core i CPU is plenty.
D3M1G0D:

I dunno, I think quad cores are still very relevant for gaming (even though I've retired my old 4790K, it could probably still do very well in games for the next couple of years). As I said before, these new hexa-core CPUs mostly add multi-core benefits, not single-core (those looking to buy Coffee Lake should do so for better multi-threading, not better single-threading). Of course games in the future may be better optimized for more cores, but I wouldn't bet the farm on that (considering the fact that most gamers today have dual-core CPUs, it'll likely take years before six-core gaming becomes the norm).
I am still on a 3770K @ 4.3 GHz and play at 1080p 60 Hz just fine, including YouTube on my second monitor quite often! So far I am good, and as long as I don't go for extreme 144 Hz it will serve me further!
MaCk0y:

Agreed. I would suggest benchmarking by lowering the graphics settings instead of lowering the resolution. That's what I do to try to achieve 144FPS.
Even though we both game at 1080p 144Hz by turning down game details rather than lowering resolution, I still think we can be happy that the 720p tests included here will provide us a good way of comparing CPUs for our 144Hz gaming needs. I think the load on the CPU will be pretty much the same, as long as you're outputting a high framerate. Seeing as Hilbert has started testing already at 720p it's simpler to just keep on doing that rather than switching to 1080p at reduced game details - it makes the testing procedure simpler too because you don't have to decide on different detail levels for different games. I think we'll continue to get good info if he keeps on with the 720p.
Robbo9999:

Even though we both game at 1080p 144Hz by turning down game details rather than lowering resolution, I still think we can be happy that the 720p tests included here will provide us a good way of comparing CPUs for our 144Hz gaming needs. I think the load on the CPU will be pretty much the same, as long as you're outputting a high framerate. Seeing as Hilbert has started testing already at 720p it's simpler to just keep on doing that rather than switching to 1080p at reduced game details - it makes the testing procedure simpler too because you don't have to decide on different detail levels for different games. I think we'll continue to get good info if he keeps on with the 720p.
Well, those 720p tests. Yes, they show how high the fps will be once we get superb GPUs in the future. But one of them shows that HT vs. no-HT starts to make a hidden difference today, even on a modern 6-core CPU.
Fox2232:

Well, those 720p tests. Yes, they show how high the fps will be once we get superb GPUs in the future. But one of them shows that HT vs. no-HT starts to make a hidden difference today, even on a modern 6-core CPU.
It also shows us what kind of fps we can expect on current GPUs when we turn down game details on 144Hz monitors, so it's relevant today as well as tomorrow. Yes, about the HT vs no-HT, you're comparing the 8700K vs the 8600K I guess - the 8700K does win, but not by much, and it also has a slightly higher frequency. In one of the games the 8700K leads by quite a large margin over the 8600K; I'm guessing this is the title that benefits from more threads (HT). Apart from that one game (although not many games were tested at 720p), the 8600K was pretty much as good as the 8700K. The future bodes better for the 8700K though. EDIT: yes, I re-read your post, you were referring to that one game too - the one that does better on the 8700K with HT. Yes, I agree - HT can make a positive difference in gaming.
Administrator
Borys:

Very well remembered, my dear HH. Add to this the nice overclock gains that the 1600/X can achieve and BINGO, the best buy for sure! One other thing: the 8600K gets a few more FPS with a 1080 card, but I have a question. How much, in % FPS, does this CPU gain with the intermediate GPU that I think this CPU was designed for?
That totally depends on the card and game. But small, very small, and at the mainstream level of, say, an RX 570 or GTX 1060, close to nil.
ScoobyDooby:

And the dumb as bricks comment award of the day goes to....
To you darling, for being uneducated: Video removed, it's been plastered throughout the forums already. One post is enough. The whole 'I disagree with you so you must be stupid' fallacy also supports that same award. Congratulations.
Prices in my country:
8600K - €276.90
8700K - €405.90
1600 - €209.90
1600X - €239.90
Clearly AMD is the best choice for price/performance, there is no doubt about it. The difference in fps between AMD and Intel is only noticeable if you have a 1080 Ti paired with one of them. If you own an RX 580/GTX 1060/GTX 1070 you will not notice any difference in fps between those CPUs. Memory-wise, the same 3200 CL14 kit is faster on AM4 compared side by side with Intel Coffee Lake, as seen in several reviewers' tests. A B350 board also lets you overclock for a cheaper price. Consumers who bought the 8600K/8700K have shown they cannot achieve the 5.0 GHz OC that the top reviewers got with ease, which tells me that Intel sent only golden CPUs to reviewers to inflate the hype. A friend of mine asked me to build an 8700K last Monday; at 4.7 GHz I was already entering the voltage limit of the CPU and hitting 98°C under stress with an AIO liquid cooler. My own experience with the 8 Ryzen 1600X chips I assembled over the last 4 months is that I could hit 4.1 GHz on air with all of them while staying at or under 1.39 V - some needed 1.37, others 1.38, and 2 of them 1.39. Comparing the 8700K at 4.7 GHz to a 1600X at 4.1 GHz, the difference between them is 7.56%. Is €166 extra for 7.56% more performance worth it, taking into consideration the values I used to compare? No!
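To make the arithmetic in the post above explicit, here is a minimal sketch of the price-versus-performance comparison it describes. The prices and the 7.56% figure come from the post itself; the script and its variable names are just an illustration, not part of the review or the thread.

```python
# Sketch of the price/performance comparison from the post above.
# Prices (EUR) and the claimed 7.56% performance gap come from the post;
# the calculation itself is only illustrative.

price_8700k = 405.90   # 8700K price quoted in the post
price_1600x = 239.90   # 1600X price quoted in the post
perf_gain_pct = 7.56   # claimed gain: 8700K @ 4.7 GHz vs 1600X @ 4.1 GHz

price_delta = price_8700k - price_1600x                 # extra money spent
price_premium_pct = 100.0 * price_delta / price_1600x   # how much more, in %

print(f"Extra cost:        EUR {price_delta:.2f}")       # ~166.00
print(f"Price premium:     {price_premium_pct:.1f}% more money")  # ~69.2%
print(f"Claimed perf gain: {perf_gain_pct:.2f}%")
```

In other words, by the post's own numbers the 8700K asks for roughly 69% more money for a claimed 7.56% more performance, which is the trade-off the poster is rejecting.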
Jorge Nascimento:

Clearly AMD is the best choice for price/performance, there is no doubt about it. The difference in fps between AMD and Intel is only noticeable if you have a 1080 Ti paired with one of them. If you own an RX 580/GTX 1060/GTX 1070 you will not notice any difference in fps between those CPUs.
That's not true; it doesn't really matter what GPU you pair it with, it's what fps you choose to game at that matters. You can turn down taxing GPU details and game at a high fps with a lesser GPU, and in this case you will want a fast gaming CPU like the Intels, for 144Hz for instance. Besides, a GTX 1070 can output well over 100 fps at max details in a lot of games, for which you also need a good gaming CPU. It's a fallacy that you need to pair a certain CPU with a certain GPU - if you turn down game details for higher fps you will always need a good CPU no matter what GPU you're running with (within reason). It's mainly the fps you're targeting that dictates the CPU performance required, not directly what GPU it's paired with.
Very nice review, but I think CPU benchmarks need the minimum fps value and not only the average fps. The thing is, the benchmarks for each game do almost nothing to show the actual CPU performance because we are looking at average fps only. If the benchmark had minimum fps included as well, then we would actually see the difference in performance between processors. Most people will look at the graphs for each game and see that the difference is minimal, but in fact the minimum fps would show a very different picture in my opinion. The "Removing the GPU Bottleneck" page certainly helped to show the difference between AMD and Intel CPUs, where we see a 46 fps difference between the 1600X and 8600K in Tomb Raider for example. But minimum fps is the most important value in my opinion for processor benchmarks, because it's where we actually see the drops while playing (the i7 7700K vs i5 7600K in BF1, for example, show a big difference when looking at min fps, while the avg is not so different).
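As a minimal illustration of the point above - that minimum (or 1% low) fps can reveal drops the average hides - here is a small sketch with invented frame-time numbers; none of it comes from the review's data.

```python
# Illustration only: invented frame times (ms) for a short capture.
# A few slow frames barely move the average fps but dominate the minimum.

frame_times_ms = [8.3, 8.5, 8.4, 8.6, 8.3, 25.0, 8.4, 8.5, 30.0, 8.4]

fps_per_frame = [1000.0 / ft for ft in frame_times_ms]

# Time-weighted average fps: frames rendered divided by total capture time.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = min(fps_per_frame)

# "1% low": average fps over the slowest 1% of frames
# (with only 10 frames here, that is just the single slowest frame).
slowest = sorted(fps_per_frame)[: max(1, len(fps_per_frame) // 100)]
low_1pct_fps = sum(slowest) / len(slowest)

print(f"Average fps: {avg_fps:5.1f}")      # ~81.7 - looks fine on a graph
print(f"Minimum fps: {min_fps:5.1f}")      # ~33.3 - the stutter you feel
print(f"1% low fps:  {low_1pct_fps:5.1f}")
```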
Jorge Nascimento:

Clearly AMD is the best choice for price/performance, there is no doubt about it. The difference in fps between AMD and Intel is only noticeable if you have a 1080 Ti paired with one of them. Is €166 extra for 7.56% more performance worth it, taking into consideration the values I used to compare? No!
Hey man, Portuguese guy here too. Don't take this the wrong way, but I don't agree with you when you say this. Just look at the Tomb Raider/Hitman values here and see the difference between the 1600X and the 8600K at 720p. Like I said above, in my opinion if we look only at the average fps as seen in the per-game benchmarks in this review, we don't see the actual performance difference between CPUs.
psychic717:

But minimum fps is the most important value in my opinion for processor benchmarks, because it's where we actually see the drops while playing (the i7 7700K vs i5 7600K in BF1, for example, show a big difference when looking at min fps, while the avg is not so different).
This is a valid point, and we will probably see CPUs with more cores having higher min FPS. 4 cores already stutter in CPU-heavy games like BF1, especially when many events are happening at the same time.
Jorge Nascimento:

Prices in my country:
8600K - €276.90
8700K - €405.90
1600 - €209.90
1600X - €239.90
Clearly AMD is the best choice for price/performance, there is no doubt about it. The difference in fps between AMD and Intel is only noticeable if you have a 1080 Ti paired with one of them. If you own an RX 580/GTX 1060/GTX 1070 you will not notice any difference in fps between those CPUs. Memory-wise, the same 3200 CL14 kit is faster on AM4 compared side by side with Intel Coffee Lake, as seen in several reviewers' tests. A B350 board also lets you overclock for a cheaper price. Consumers who bought the 8600K/8700K have shown they cannot achieve the 5.0 GHz OC that the top reviewers got with ease, which tells me that Intel sent only golden CPUs to reviewers to inflate the hype. A friend of mine asked me to build an 8700K last Monday; at 4.7 GHz I was already entering the voltage limit of the CPU and hitting 98°C under stress with an AIO liquid cooler. My own experience with the 8 Ryzen 1600X chips I assembled over the last 4 months is that I could hit 4.1 GHz on air with all of them while staying at or under 1.39 V - some needed 1.37, others 1.38, and 2 of them 1.39. Comparing the 8700K at 4.7 GHz to a 1600X at 4.1 GHz, the difference between them is 7.56%. Is €166 extra for 7.56% more performance worth it, taking into consideration the values I used to compare? No!
Can you cite your source saying that consumers couldn't achieve 5 GHz on either the 8600K or the 8700K? I am sure there are some consumers out there who got a golden chip. Also, for your stats where you claim over a 7.5% increase, what did you use to arrive at that number? Gaming, or a synthetic CPU benchmark? I would also say that the Ryzen 1600 is the best bang for the buck on the AMD side of things, and the 8600K for Intel. And as for claiming there is no difference between AMD and Nvidia GPUs paired with these CPUs, wouldn't that be on a game-by-game basis, and depend on how well the game is optimized for both AMD and Nvidia cards?
Robbo9999:

Even though we both game at 1080p 144Hz by turning down game details rather than lowering resolution, I still think we can be happy that the 720p tests included here will provide us a good way of comparing CPUs for our 144Hz gaming needs. I think the load on the CPU will be pretty much the same, as long as you're outputting a high framerate. Seeing as Hilbert has started testing already at 720p it's simpler to just keep on doing that rather than switching to 1080p at reduced game details - it makes the testing procedure simpler too because you don't have to decide on different detail levels for different games. I think we'll continue to get good info if he keeps on with the 720p.
Again agreed. But I suggested it since Hilbert stated: "This page purely exists based upon forum feedback for a small group of people, we do not feel that 720p testing is representable for your gaming experience whatsoever, neither do people game at 720p". He is right. It isn't. I wouldn't lower the resolution to 720p. That's why lowering the settings is a better representation and a realistic scenario. I still appreciate Hilbert for doing this test and listening to the feedback. 🙂
Looks like a great value for those wanting high refresh-rate gaming. Matches the 7700K in most synthetics at a $230 MSRP (that we will likely not see until Q1 2018). @Hilbert Hagedoorn, in the overclock section the article states you achieved 5.2 GHz, but all screenshots show 5.0 GHz.
Administrator
Thanks, correcting.
Venix:

I am still on a 3770K @ 4.3 GHz and play at 1080p 60 Hz just fine, including YouTube on my second monitor quite often! So far I am good, and as long as I don't go for extreme 144 Hz it will serve me further!
I game on a 144 Hz monitor (@ 1440p), but I don't always try to max it out. Usually, I try to keep a nice balance between high FPS (around 90 is fine) and quality, since I don't want to play with graphics that are complete garbage 😛. Smooth and consistent gameplay is what I want most, but graphics quality is also very important to me (I didn't get a high-end GPU for nothing).
TIL 2 mid range GPU = high end.
MaCk0y:

Again agreed. But I suggested it since Hilbert stated: "This page purely exists based upon forum feedback for a small group of people, we do not feel that 720p testing is representable for your gaming experience whatsoever, neither do people game at 720p". He is right. It isn't. I wouldn't lower the resolution to 720p. That's why lowering the settings is a better representation and a realistic scenario. I still appreciate Hilbert for doing this test and listening to the feedback. 🙂
Well if Hilbert wants to include 1080p at reduced details into CPU testing then I have no argument against it, although I don't think it's strictly necessary given his current 720p testing.
sverek:

TIL 2 mid range GPU = high end.
I assume that was directed at me? FYI, the 2 x RX 580 is for my Threadripper system. I use 2 x GTX 1080 in my gaming rig, which was the high-end card before the Ti.
Hmm, I don't get it. This is for people who use the PC purely for gaming and STILL only have Full HD monitors... I mean, come on, if you are a gamer looking for good gaming you do NOT game at Full HD anymore. As soon as we go to 1440p, even this i5 is not great anymore. It only beats even the R5 1600 by what, 1-2 FPS, and it still costs (here in Norway) 20-25% more? If my games already run at 90+ fps, the measly difference of 1-5 fps for CPUs that are 20-40% more expensive does not make sense at all.