Intel Announces Core i9 9900K Processor

H83:

500€ for a mainstream CPU is just too much!!!
That's how it was just 10 years ago: the 4-core Yorkfield was $530. Intel lowered mainstream prices to $350 even without competition.
Good thing I'm not in a rush to upgrade my CPU. The 6600K at 4.6GHz is still keeping a solid 60fps at 4K.
Moderator
Alright folks, this is not a thread to troll Intel. Let's move along.
kakiharaFRS:

SLI being pretty much dead, we don't care about CPU lanes as much
SLI isn't the only reason for more CPU lanes, far from it. If you have even one PCIe M.2 drive on a non-HEDT motherboard then you've just bottlenecked your M.2 drive.
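A quick back-of-envelope sketch of why lane counts matter, in Python. The per-lane figure follows from PCIe 3.0's 8 GT/s signaling with 128b/130b encoding; whether an M.2 slot actually takes lanes from the GPU (rather than hanging off the chipset) depends on the specific board, so treat the x16-vs-x8 comparison as illustrative:

```python
# Rough one-direction throughput for PCIe 3.0 after 128b/130b encoding:
# 8 GT/s * (128/130) / 8 bits ~= 0.985 GB/s per lane.
PCIE3_GBPS_PER_LANE = 0.985

def link_bandwidth_gbps(lanes, per_lane=PCIE3_GBPS_PER_LANE):
    """Approximate one-direction bandwidth of a PCIe link in GB/s."""
    return lanes * per_lane

# A typical NVMe M.2 drive uses a x4 link:
m2_x4 = link_bandwidth_gbps(4)       # 3.94 GB/s ceiling for the drive

# If the GPU drops from x16 to x8 because an M.2 slot shares CPU lanes:
gpu_x16 = link_bandwidth_gbps(16)    # 15.76 GB/s
gpu_x8 = link_bandwidth_gbps(8)      # 7.88 GB/s

print(f"M.2 x4 ceiling: {m2_x4:.2f} GB/s")
print(f"GPU x16: {gpu_x16:.2f} GB/s, GPU x8: {gpu_x8:.2f} GB/s")
```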
Is Intel still living in dreamland or something? I like Intel, but it seems like they're still acting as if they make the only CPUs worth a damn and are still trying to peddle those super premium prices. I'm also not sure I buy the 95W TDP; if that's actually true then I could possibly see it being worth the price difference vs AMD, but even then, does the performance back the price up against AMD and its new Ryzens?
[youtube=4PZw75K9ydY] Interesting.
Hi, when can we expect a 9900K review?
SniperX:

Hi, when can we expect a 9900K review?
When it gets released, October 19th I think.
I genuinely think the CPU focus for gaming is deader than dead. Hilbert ran a test on Sandy Bridge a few months ago and it showed that CPU is more than powerful enough for gaming. There is no reason to spend huge $ on CPUs for gaming. There just isn't.
Loobyluggs:

I genuinely think the CPU focus for gaming is deader than dead. Hilbert ran a test on Sandy Bridge a few months ago and it showed that CPU is more than powerful enough for gaming. There is no reason to spend huge $ on CPUs for gaming. There just isn't.
Hilbert's testing was as flawed as all other tests I saw for it. They generally all use resource-light benchmarks, be they standalone or in-game. How does a benchmark requiring 500MB of RAM and 1.5GB of VRAM show that Sandy with regular 1333/1600MHz memory stutters? Or that PCIe 2.0 in those cases performs a bit worse than Ivy with PCIe 3.0? It simply doesn't; there is no sudden caching/swapping of resources. Benchmarks preload all the data they require, because they bench the GPU, not the CPU, memory and disk subsystem.

And those game benchmarks: the Tomb Raider benchmark requires almost no resources and has no sudden movement; the camera moves so slowly you could die of boredom. The BF1 bench is a tank moving forward with no sudden camera turns... Play one actual BF1 match while recording frametimes, then play on another CPU on the same map: that's what will show, perfectly clearly, the wild frametimes as you look around and turn, the framerate dips, the lower minimum fps, the stutter. Yes, you can't compare those data on a timeline, but statistically, it tells the story well.

The same goes for SLI in PCIe 2.0 x4/x8/x16 mode vs. PCIe 3.0 x16 mode. The difference in benchmarks is tiny because they simply don't need to move resources around. Take one of those "bit crazy" games that has a unique texture on every object and you'll see much bigger variance in frametime dips, as so many resources need to be cycled at random. (The game can't know whether you are going to take a sharp turn left or right and which 1GB of textures has to be ready, so it precaches a lot more than is actually needed.)
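The "record frametimes in a real match and compare them statistically" idea can be sketched in a few lines of Python. The data here is synthetic and the 33.3ms spike threshold is my own choice, not anyone's published methodology; the point is just that average fps hides stutter while 1% lows and spike counts expose it:

```python
def frametime_stats(frametimes_ms, spike_ms=33.3):
    """Summarize a frametime recording (ms): avg fps, 1% low fps, spike count."""
    fts = sorted(frametimes_ms)
    avg_fps = 1000.0 / (sum(fts) / len(fts))
    # "1% low": average fps over the worst 1% of frames (at least one frame).
    worst = fts[-max(1, len(fts) // 100):]
    low_1pct_fps = 1000.0 / (sum(worst) / len(worst))
    spikes = sum(1 for ft in fts if ft > spike_ms)
    return avg_fps, low_1pct_fps, spikes

# Two synthetic runs with similar average fps, one with occasional 100ms hitches:
smooth = [16.7] * 1000
stuttery = [15.0] * 990 + [100.0] * 10

for name, run in [("smooth", smooth), ("stuttery", stuttery)]:
    avg, low, spikes = frametime_stats(run)
    print(f"{name}: avg {avg:.0f} fps, 1% low {low:.0f} fps, {spikes} spikes")
```

The stuttery run actually posts the higher average fps, yet its 1% low collapses to ~10fps, which is exactly the kind of gap a timeline-free statistical comparison catches.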
Fox2232:

Hilbert's testing was as flawed as all other tests I saw for it.
So everyone is wrong but you. Got it.
Sandy Bridge does show its age, and it's harder to stably overclock a 7-year-old CPU without messing things up. For GPU-bound games or a 60fps target, Sandy might still cut it, but not for 100+fps gaming.
The 9900K will stomp everything in the mainstream, with many cores/threads and excellent clocks. The problem is that it will reach 600€ or close to that in the EU. It will not be easy to recommend when a 2700X costs basically half; although it's not the best CPU in absolute performance, it offers more than enough for most people. I think the i9 will end up being a niche CPU; the sales champion should be the 9700K, now also with 8 cores (but lacking HT). I'm curious to see how it will behave compared to the 6C/12T of the 8700K.
Aura89:

So everyone is wrong but you. Got it.
Well, you can't show the limitation of a slow memory subsystem without actually putting load on it. Right tool for the right job...
sverek:

Sandy Bridge does show its age, and it's harder to stably overclock a 7-year-old CPU without messing things up. For GPU-bound games or a 60fps target, Sandy might still cut it, but not for 100+fps gaming.
So - which games require 100fps+ and are CPU bound anno 2018? Surely the focus for developers is GPU using VRAM?
kings:

The 9900K will stomp everything in the mainstream, with many cores/threads and excellent clocks. The problem is that it will reach 600€ or close to that in the EU. It will not be easy to recommend when a 2700X costs basically half; although it's not the best CPU in absolute performance, it offers more than enough for most people. I think the i9 will end up being a niche CPU; the sales champion should be the 9700K, now also with 8 cores (but lacking HT). I'm curious to see how it will behave compared to the 6C/12T of the 8700K.
Years ago, when Intel came out with the P4's HT, there were not that many games/programs benefiting from it. But those which did (Unreal Engine) showed a great difference. That specific HT code did not benefit from real cores, so a 1C/2T chip won against 2C/2T. Today, with many cores, it is not that easy to say how much HT helps in each case. But disabling SMT on Zen CPUs has quite a negative impact on well-threaded applications.
Loobyluggs:

So - which games require 100fps+ and are CPU bound anno 2018? Surely the focus for developers is GPU using VRAM?
Are you talking about overkill? Then you can say that anything above 1080p is not needed either, that a 2080 Ti is not needed, and that anything above an 8700K is not needed. But a good experience, that's needed, and stutter means a bad experience.
Loobyluggs:

So - which games require 100fps+ and are CPU bound anno 2018? Surely the focus for developers is GPU using VRAM?
100fps+ for any action game. Even Civ feels smoother when moving the camera. CPU-bound is any game in which you remove the GPU bottleneck: changing settings from ultra to high shifts load off the GPU and stresses the CPU more. In BF4, I am capped by the CPU (2500K) at 90fps while the GPU (GTX 970) sits at 60% load, with the game on medium settings at 1440p. So yeah, it's more common than you think. Not even mentioning the stutters caused by the CPU when an explosion happens and the client has to calculate all the debris, overloading the CPU.
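The rule of thumb in this post (if lowering GPU settings barely raises your fps, the CPU is the cap) can be sketched in a couple of lines. The fps numbers and the 5% tolerance below are purely illustrative assumptions, not measurements:

```python
def likely_cpu_bound(fps_high_settings, fps_low_settings, tolerance=0.05):
    """If dropping graphics settings barely changes fps, the GPU wasn't the limit."""
    gain = (fps_low_settings - fps_high_settings) / fps_high_settings
    return gain < tolerance

# Illustrative numbers in the spirit of the BF4 example above:
print(likely_cpu_bound(88, 90))   # ~90fps on both ultra and medium -> True (CPU cap)
print(likely_cpu_bound(60, 95))   # big gain from lower settings -> False (GPU bound)
```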
kings:

The 9900K will stomp everything in the mainstream, with many cores/threads and excellent clocks. The problem is that it will reach 600€ or close to that in the EU. It will not be easy to recommend when a 2700X costs basically half; although it's not the best CPU in absolute performance, it offers more than enough for most people. I think the i9 will end up being a niche CPU; the sales champion should be the 9700K, now also with 8 cores (but lacking HT). I'm curious to see how it will behave compared to the 6C/12T of the 8700K.
Well, you can't really compare a 2700X or 8700K to a 9900K - the former are mainstream products while the latter is HEDT. No mainstream processor costs 600 euros.
Seems that Intel is up to their old tricks. They commissioned a report by Principled Technologies (ironic name) which showed large differences between the Intel CPUs and the AMD ones. The Intel CPU used a Noctua cooler while AMD used the stock cooler, the memory timings were looser on AMD, and they tested at 1080p @ medium (on a 4K monitor, no less). Steve with the lowdown: https://www.techspot.com/article/1722-misleading-core-i9-9900k-benchmarks/ [youtube=6bD9EgyKYkU]