Retro review: Intel Sandy Bridge Core i7 2600K - 2018 review


Great review Boss.
Nice review Hilbert. I'm still running the 2500K at 4.7GHz and not having issues at all. Of course some games do stutter sometimes, but nothing big. Good to see this chip is a great oldie that can still keep up in games.
Nice. I'd love to see one of those with the Core i7 3930K or 3960X Sandy Bridge-E (including a 4-4.5GHz overclock) as well. I somehow doubt it's worth an upgrade right now, especially when taking the high prices into account.
Do people really exist who use the 2600K at stock clocks? What's the point of the review? Let's make a straw poll and ask 2600K owners at what clocks they run this CPU. I think 90% of users would run it overclocked. And where are more processor-oriented games like BF1 etc.? Max and low FPS?
Thx man. I'm still fine with my 2500K @ 4.5 and a 1060. 🙂
Running a 3930K @ 4.5. Not even thinking of upgrading; this is a strong beast for gaming and other tasks. And I still can't believe the price in 2011 vs 2018.
One of the simple reasons these chips OC so well is because of the solder, therefore better cooling. Good ol' days; now we have to deal with the pigeon poop turd BA&#$^$*! F&*# NEECH!!!
Great article. I still use a 3770K @ 4.7 and a 1080 Ti. This allows me to game at high res using DSR, so my GPU is usually the bottleneck. The only title where I feel the need for a better CPU is AC Origins. Certain places in that game struggle to hold 60fps, but then again, this is at ultra settings. I think for me, the fact remains that a £1000 upgrade is still just not worth it. Perhaps if I had a weaker GPU and played at 1080p then it might be worth it. The number of times I have nearly bitten the bullet and upgraded is crazy, but at my gaming resolutions the extra 3-8 fps surely isn't worth it, is it, especially when we are talking something like 110 fps when it could be 116 fps with an 8700K?
For 1440p and above it's still more than enough for most games (with the exception of the ones that use more CPU power/cores, like BF1 for example). Legendary CPU really. 😱
Hilbert Hagedoorn:

Gaming, however, was an interesting topic. Here the reality is simple: the 2600K runs out of juice in CPU-bound situations like low resolutions. The fact remains, though, that at 1080P it still has enough oomph to deliver decent enough numbers on anything below a GTX 1080; I mean not hugely great, but certainly decent enough. When we take the GPU out of the equation and look solely at 720P performance, you can see and measure a rather dramatic effect where Sandy Bridge limps behind. But let's always remember, a GPU bottleneck is far more apparent than a CPU bottleneck.
To me, and you're not mentioning this at all it seems, it looks like there's no difference at all in QHD (1440p). So if you're on Sandy Bridge and are split between getting a new GPU + a completely new system, vs getting a new GPU + a 1440p G-Sync monitor, you're probably well advised to get the GPU and 1440p monitor. If, on the other hand, you're all about 1080p or lower + the highest FPS you can get ("300FPS competitive stuff"), you should opt for a new platform instead. (Of course, if you have loads of money to throw at it, you'll upgrade everything... 😛)
Mega Guide: do you have a 1080p screen or 1440p/4K? 1080p - yes, worth an upgrade. 1440p/4K - no, keep using it, as the GPU is the bottleneck.
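Spelled out, that rule of thumb is just a two-branch check. A deliberately simplistic Python sketch, where the resolution buckets and the yes/no verdict are this commenter's rule of thumb rather than a universal recommendation:

```python
# The "mega guide" above as a tiny helper. The buckets below are an
# illustration of the commenter's rule, not a universal recommendation.
def worth_cpu_upgrade(resolution: str) -> bool:
    """True if a CPU/platform upgrade is likely to help at this resolution."""
    cpu_bound = {"720p", "1080p"}           # low resolutions expose the CPU
    gpu_bound = {"1440p", "2160p", "4k"}    # high resolutions shift the load to the GPU
    res = resolution.lower()
    if res in cpu_bound:
        return True
    if res in gpu_bound:
        return False
    raise ValueError(f"unknown resolution: {resolution}")

print(worth_cpu_upgrade("1080p"))   # True  -> upgrade worth it
print(worth_cpu_upgrade("1440p"))   # False -> GPU is the bottleneck
```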
RealNC:

To me, and you're not mentioning this at all it seems, it looks like there's no difference at all in QHD (1440p). So if you're on Sandy Bridge and are split between getting a new GPU + a completely new system, vs getting a new GPU + a 1440p G-Sync monitor, you're probably well advised to get the GPU and 1440p monitor. If, on the other hand, you're all about 1080p or lower + the highest FPS you can get ("300FPS competitive stuff"), you should opt for a new platform instead. (Of course, if you have loads of money to throw at it, you'll upgrade everything... 😛)
In a nutshell this is true, but bear in mind that general computing tasks (and particularly multitasking) will likely also feel a lot better with a modern CPU and platform overall. With the next generation of GPUs, whenever it comes, the 2600K might even start falling behind in WQHD/UHD. Those on a budget can certainly drive their Sandy and Ivy for a little bit longer, but they are on the verge of falling helplessly behind; if not this year, then I suspect 2019 to be the year in which an era comes to an end. They've had a fantastic run, which, last but not least, is also thanks to Intel's overall stagnation until AMD threw Ryzen our way.
Kinda nice review, but I would say that providing only average FPS stats for game benchmarks is so retro these days. 😀 Where are the minimum FPS stats? 1% and 0.1% FPS lows?
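For reference, those 1% and 0.1% lows are just percentile cuts of the frame-time trace. A rough Python sketch of how they are commonly computed; the frame-time numbers here are made up, and exact methodology varies between benchmarking tools:

```python
# Rough sketch: derive average FPS and 1% / 0.1% "low" FPS figures from
# per-frame render times in milliseconds (hypothetical sample data).
import numpy as np

frame_times_ms = np.array([16.7, 16.9, 17.1, 16.8, 33.4, 16.7, 17.0, 50.2, 16.8, 16.9])

def low_fps(times_ms: np.ndarray, pct: float) -> float:
    """FPS at the slowest `pct` percent of frames (e.g. 1.0 or 0.1)."""
    # The worst frames sit at the high end of the frame-time distribution,
    # so the "1% low" corresponds to the 99th-percentile frame time.
    worst_ms = np.percentile(times_ms, 100.0 - pct)
    return 1000.0 / worst_ms

avg_fps = 1000.0 / frame_times_ms.mean()
print(f"avg: {avg_fps:.1f} fps, "
      f"1% low: {low_fps(frame_times_ms, 1.0):.1f} fps, "
      f"0.1% low: {low_fps(frame_times_ms, 0.1):.1f} fps")
```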
A 2600K overclocked to ~4.5GHz with 16GB of 2133MHz RAM turns into a whole other story; running it at stock with 1333MHz RAM is kinda "what is wrong with this picture" stuff, but hey, some grandma might run it at stock as an "interweb thingy" 😛
burebista:

Thx man. I'm still fine with my 2500K @ 4.5 and a 1060. 🙂
When you see that my 1st child still uses an old Core 2 Q9550 OC'd with a 1060, and R6 Siege puts everything at max by default at 1440... then of course you are still fine with your 2500K 🙂
Stock 2600K @ 3.4GHz. Add 30-40% to the benches, since these chips can easily OC to that.
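That 30-40% figure roughly tracks the clock headroom. A quick back-of-the-envelope check in Python, assuming typical 4.5-4.8GHz overclocks and, optimistically, linear scaling in fully CPU-bound scenes:

```python
# Back-of-the-envelope: clock headroom of a typical 2600K overclock over stock.
# Assumes CPU-bound performance scales roughly with core clock, which is an
# optimistic upper bound; memory speed and GPU limits eat into it in practice.
stock_ghz = 3.4  # 2600K base clock (single-core turbo to 3.8GHz is ignored here)
for oc_ghz in (4.5, 4.7, 4.8):
    gain = (oc_ghz / stock_ghz - 1) * 100
    print(f"{oc_ghz} GHz is ~{gain:.0f}% over the {stock_ghz} GHz stock clock")
# -> roughly +32% to +41%, which is where the "add 30-40%" rule of thumb comes from.
```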
Well, you all bought in your overclocking capabilities, which has nothing to do with the cleverness of the old days of overclocking - back then it was all done with a lot of risk, not paid for. What it basically means is you did not OC at all; it was bought-in speed.
Ourasi:

A 2600K overclocked to ~4.5GHz with 16GB of 2133MHz RAM turns into a whole other story; running it at stock with 1333MHz RAM is kinda "what is wrong with this picture" stuff, but hey, some grandma might run it at stock as an "interweb thingy" 😛
Non-OC Intel 2*00 chips are not as weak as you think...
Great review HH!! Thanks.
Great review, very nice processor! But I think it's time to change: I've just swapped my 3770K @ 4.5GHz for an 8700K @ 4.7GHz (on all cores). I had too much stuttering in BF1, Battlefront 2, PUBG and Assassin's Creed Origins. Now there's no more stuttering; games are perfectly smooth (1440p). And I think we'll have more and more new games that demand a very good processor (Far Cry 5 etc...).