Intel announces first Kaby Lake 7th Gen Core processors

I bet they will be a whole 3% faster than Skylake. That's gonna blow your socks off, seriously! Maybe they'll only be 10% more expensive to compensate for the incredible speed increase.
I bet they will be a whole 3% faster than Skylake. That's gonna blow your socks off, seriously! Maybe they'll only be 10% more expensive to compensate for the incredible speed increase.
Funny guy. Actually, they are projected to be 12-16% faster at the same price points, which is more than an architecture update sometimes achieves. Or, for the same performance plus the feature updates, you now pay $112 less. Seriously, what's with people hating/trolling everything on forums these days?
Speed Shift is the only interesting new feature. It should help the CPU reach its boost clocks much faster, which will help in gaming. It'll take a while until it trickles down to enthusiasts.
Skylake already has Speed Shift, and honestly it's not going to do much for gaming. It's mainly for mobile "race to sleep" situations.
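For anyone curious whether their chip already has it: Speed Shift is Intel's hardware-managed P-states (HWP), and on Linux the kernel reports an "hwp" CPU flag when the hardware supports it. A minimal sketch, assuming Linux and the usual /proc/cpuinfo layout:

```python
# Rough check (Linux only) for Speed Shift / hardware-managed P-states.
# Looks for the "hwp" flag the kernel reports in /proc/cpuinfo; the path
# and flag layout are assumptions about a typical x86 Linux setup.
def has_speed_shift(cpuinfo_path="/proc/cpuinfo"):
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    return "hwp" in line.split(":", 1)[1].split()
    except OSError:
        pass
    return False

if __name__ == "__main__":
    print("Speed Shift (HWP) supported:", has_speed_shift())
```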
Speed Shift is the only interesting new feature. It should help the CPU reach its boost clocks much faster, which will help in gaming. It'll take a while until it trickles down to enthusiasts.
I do not think your CPU needs upgrading for another couple of years. It's not like you are going to feel any difference in gaming: a faster CPU only speeds up the moments that are already easy for the graphics card to render, but when the GPU begins to choke there is no magic potion, and +/- 1 frame will look exactly the same.
The best shift Intel could make for its desktop CPUs would be bringing the Broadwell-C design to a bigger audience. That's one exciting processor, with serious gains over previous generations - and even over Skylake, for crying out loud! eDRAM should be the norm for every new Intel chip. Not only would it be possible to use the Iris Pro graphics for those who don't need or don't want a dGPU, it would also provide a serious boost in CPU-bound scenarios even with the iGPU disabled.
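A crude way to see what Broadwell-C's 128 MB eDRAM is supposed to buy you is to stream over working sets of growing size and watch throughput fall once they stop fitting in cache; on an eDRAM part the cliff should show up later. The sketch below is only an illustration - the sizes are arbitrary and Python/NumPy overhead makes it far from a calibrated benchmark:

```python
# Very rough streaming-throughput probe across growing working sets.
# On a chip with a large L4/eDRAM cache the drop toward DRAM speeds should
# appear at larger sizes than on a chip that falls straight from L3 to RAM.
import time
import numpy as np

def gib_per_s(nbytes, repeats=20):
    a = np.ones(nbytes // 8, dtype=np.float64)   # working set of float64s
    t0 = time.perf_counter()
    for _ in range(repeats):
        a.sum()                                  # stream over the whole array
    dt = time.perf_counter() - t0
    return nbytes * repeats / dt / 2**30

for mib in (4, 32, 64, 128, 256, 512):
    print(f"{mib:4d} MiB working set: {gib_per_s(mib * 2**20):6.1f} GiB/s")
```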
I bet they will be a whole 3% faster than Skylake. That's gonna blow your socks off, seriously! Maybe they'll only be 10% more expensive to compensate for the incredible speed increase.
It's about 2% faster IPC-wise. The rest are all frequency gains.
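As a back-of-the-envelope way to split that out, single-thread performance is roughly IPC times clock. Taking the ~2% IPC figure above and the commonly quoted turbo clocks (4.2 GHz for the i7-6700K, 4.5 GHz for the i7-7700K) as rough inputs, not measurements:

```python
# Back-of-the-envelope: performance ~ IPC x clock frequency.
# Clock figures and the ~2% IPC delta are rough inputs, not measurements.
ipc_gain = 1.02           # ~2% better IPC, per the comment above
clock_gain = 4.5 / 4.2    # 7700K turbo vs 6700K turbo
total = ipc_gain * clock_gain
print(f"Estimated single-thread uplift: {(total - 1) * 100:.1f}%")   # ~9%
```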
It's about 2% faster IPC-wise. The rest are all frequency gains.
This is an insane leap forward - I now understand why existing Skylake boards are expendable.
Do you think we will ever get to the point where upgrading my 3570K would actually be worth it? I've been running it at 4.6 GHz. Meanwhile, I just made two GPU jumps within a year: GTX 660 to GTX 970, then to a GTX 1070 (my brother bought the 970). Massive leaps in performance. And yet I've spent 4 years on this Ivy Bridge part.
Do you think we will ever get to the point where upgrading my 3570K would actually be worth it? I've been running it at 4.6 GHz. Meanwhile, I just made two GPU jumps within a year: GTX 660 to GTX 970, then to a GTX 1070 (my brother bought the 970). Massive leaps in performance. And yet I've spent 4 years on this Ivy Bridge part.
Probably not, but then I really don't know what people are expecting in terms of CPU performance. For the most part, the best-looking AAA games are still heavily GPU-bottlenecked, and it's only going to get worse as 4K becomes mainstream. General PC performance has been pretty much "fast enough" for 99.9% of general tasks. So what improvements are people looking for anyway? Even if Intel puts out a CPU 40% faster than Skylake in synthetics, that will only translate into something like 2-3% in modern AAA titles at modern resolutions and probably 0% in desktop applications. If you're looking for cheaper prices, I get it -- but I really don't see where more performance is going to come from.
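The GPU-bottleneck point can be made concrete with a toy frame-time model in which each frame costs roughly max(CPU time, GPU time). The millisecond figures below are invented purely for illustration:

```python
# Toy frame-time model: the slower of the CPU and GPU sets the frame rate,
# so a big CPU uplift barely moves fps once the GPU is the bottleneck.
# Ignores CPU/GPU overlap details; numbers are made up for illustration.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms, gpu_ms = 8.0, 16.0                    # hypothetical GPU-bound 4K case
print(f"baseline:        {fps(cpu_ms, gpu_ms):5.1f} fps")
print(f"CPU 40% faster:  {fps(cpu_ms / 1.4, gpu_ms):5.1f} fps")   # unchanged
print(f"GPU 40% faster:  {fps(cpu_ms, gpu_ms / 1.4):5.1f} fps")   # big win
```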
Probably not, but then I really don't know what people are expecting in terms of CPU performance. For the most part, the best-looking AAA games are still heavily GPU-bottlenecked, and it's only going to get worse as 4K becomes mainstream. General PC performance has been pretty much "fast enough" for 99.9% of general tasks. So what improvements are people looking for anyway? Even if Intel puts out a CPU 40% faster than Skylake in synthetics, that will only translate into something like 2-3% in modern AAA titles at modern resolutions and probably 0% in desktop applications. If you're looking for cheaper prices, I get it -- but I really don't see where more performance is going to come from.
That would make 160 Hz+ gaming an easy task.
At this point, waiting for Zen to show up seems reasonable, even if you are on Sandy Bridge. I don't find Kaby Lake interesting at all; maybe Coffee Lake or Cannon Lake will be. Which one will bring us six-core i5s again?
Probably not, but then I really don't know what people are expecting in terms of CPU performance.
A faster CPU is necessary in older titles, especially where core clock/IPC matters more than core count. Fast cards such as the Titan X (Pascal), or SLI setups, will benefit as well.
At this point, waiting for Zen to show up seems reasonable, even if you are on Sandy Bridge. I don't find Kaby Lake interesting at all; maybe Coffee Lake or Cannon Lake will be. Which one will bring us six-core i5s again?
Coffee Lake, but I don't know, I would rather get an 8-core (16-thread) by then - well, I will, sometime in 2018-2019. Although I saw that in 2020 new CPUs will use DDR5 (Zen+ maybe too?). Might as well wait for that; it's been 3 years now on a 4770K, what's another 3-4? 😀
I have a 3770K stable at 4.3 GHz with a Mugen 3, close to 4 years now, and I do not feel the need to upgrade. Although my GTX 770 has aged, and badly! I should have picked the 290 after all, for 10 extra euros :P
Probably not, but then I really don't know what people are expecting in terms of CPU performance. For the most part, the best-looking AAA games are still heavily GPU-bottlenecked, and it's only going to get worse as 4K becomes mainstream. General PC performance has been pretty much "fast enough" for 99.9% of general tasks. So what improvements are people looking for anyway? Even if Intel puts out a CPU 40% faster than Skylake in synthetics, that will only translate into something like 2-3% in modern AAA titles at modern resolutions and probably 0% in desktop applications. If you're looking for cheaper prices, I get it -- but I really don't see where more performance is going to come from.
This is such a chicken-and-egg question. If Intel had shifted to mainstream 6-core, and perhaps 8-core enthusiast, parts years ago, meaning lots and lots of people would have had that many cores, why wouldn't software take advantage of it where applicable? But since Intel decided nothing needs to happen in a decade, there's no way any software house would specifically count on 6-8 cores, because only 0.5% of customers might have them. Of course there are lots of exceptions, like media encoders, graphical rendering software, and in general anything that requires heavy compute in a simple form, all of which can use more cores than most people would ever have. Which is why Intel does make such CPUs, even if the price is an arm, a leg, and a kidney.

I just miss the times when something actually changed from generation to generation. When I upgraded from Ivy to Skylake, I got maybe 10% more power, and there were two whole generations of Intel CPUs between those two. I don't know whether I should laugh or cry. Would I actually have needed more power? No, not really, as I still stuck with an i5. But it's just sad when the world becomes stagnant and development stops. There was a time when you could actually believe tech companies like Intel had visionaries with dreams and ambitions. Now they just make money and that's it.
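The encoder/renderer exception above is essentially Amdahl's law: the speedup from extra cores is capped by the serial fraction of the work. A small sketch with assumed, purely illustrative parallel fractions:

```python
# Amdahl's law: speedup(N) = 1 / ((1 - p) + p / N) for parallel fraction p.
# The parallel fractions below are illustrative assumptions, not measurements.
def speedup(p, cores):
    return 1.0 / ((1.0 - p) + p / cores)

for label, p in (("video encoder (~95% parallel)", 0.95),
                 ("typical desktop app (~30% parallel)", 0.30)):
    gains = ", ".join(f"{c} cores: {speedup(p, c):.2f}x" for c in (4, 6, 8))
    print(f"{label}: {gains}")
```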
Basically it's just more of the same, only a little faster... I really hope Zen delivers, because CPU releases are becoming boring as hell...
Will be buying an i7 7700K in the new year! Looking forward to overclocking it; with the improvements to the transistors it could be a blast to OC!!
This is such a chicken-and-egg question. If Intel had shifted to mainstream 6-core, and perhaps 8-core enthusiast, parts years ago, meaning lots and lots of people would have had that many cores, why wouldn't software take advantage of it where applicable? But since Intel decided nothing needs to happen in a decade, there's no way any software house would specifically count on 6-8 cores, because only 0.5% of customers might have them. Of course there are lots of exceptions, like media encoders, graphical rendering software, and in general anything that requires heavy compute in a simple form, all of which can use more cores than most people would ever have. Which is why Intel does make such CPUs, even if the price is an arm, a leg, and a kidney.

I just miss the times when something actually changed from generation to generation. When I upgraded from Ivy to Skylake, I got maybe 10% more power, and there were two whole generations of Intel CPUs between those two. I don't know whether I should laugh or cry. Would I actually have needed more power? No, not really, as I still stuck with an i5. But it's just sad when the world becomes stagnant and development stops. There was a time when you could actually believe tech companies like Intel had visionaries with dreams and ambitions. Now they just make money and that's it.
I don't think Intel stopped innovating; they just shifted in a direction that neither you nor most people on Guru3D are interested in. The entire industry forecast mobile as the future. You can agree or disagree, but it's not really relevant, because that's where the industry headed. While Intel's performance hasn't increased much on the desktop general-processing side, mobile performance has increased severalfold, especially with integrated graphics. The competition in that market is extremely fierce, and Intel is doing everything it can to shrink its relatively massive x86 processors into phones/tablets/ultrabooks.
Will be buying an i7 7700K in the new year! Looking forward to overclocking it; with the improvements to the transistors it could be a blast to OC!!
What overclocking? The chip already boosts to 4.5 GHz. How much further do you think it will go?
I don't think Intel stopped innovating; they just shifted in a direction that neither you nor most people on Guru3D are interested in. The entire industry forecast mobile as the future. You can agree or disagree, but it's not really relevant, because that's where the industry headed. While Intel's performance hasn't increased much on the desktop general-processing side, mobile performance has increased severalfold, especially with integrated graphics. The competition in that market is extremely fierce, and Intel is doing everything it can to shrink its relatively massive x86 processors into phones/tablets/ultrabooks.
The Atom x5/x7 is a really nice CPU, actually. A tiny Core CPU at 10 nm, in a Surface Phone that could run Win32 apps, would be a killer.