New Roadmap says Core i9 9900K in September, the rest in 2019

las:

I guarantee you will, if they use solder. I've seen delidded 8700Ks doing 5-5.1 GHz on air coolers. The only reason they get very hot is the bad TIM; solder will fix this, and it's how Intel has been able to push clocks even higher on top of increasing the core/thread count. I would not be surprised to see the 9900K/9700K hit 5.2-5.4 GHz using a 240mm AIO or better. A few people are already running their delidded 8086Ks at 5.5 GHz on OC.net ... 24/7 that is, and rock solid.
Another 0.01% argument? So, because a few people delid and you see 5.5 GHz scores from HWBot suicide runs, you think that's the stock clock a soldered chip will have?
I've seen in the past what was up; this time won't be any different.
Fox2232:

Another 0.01% argument? So, because a few people delid and you see 5.5 GHz scores from HWBot suicide runs, you think that's the stock clock a soldered chip will have?
If what las is saying is correct, then the 9900K will only appeal to a very small minority. Competitive gaming is of course a niche market, as is extreme overclocking. I find it funny that he thinks competitive gaming is high-end gaming, LOL.
Irenicus:

Silly argument. You could keep skipping gens and always be happy. I'm very happy with my 8600K, but if I upgraded in 3 gens I could also say "oh, I'm glad I waited for X". You realise how silly that sounds?
Well, considering he comes from the last good Intel chip (cooling-wise), he's likely pushing 4.3-4.5 GHz on a quad-core with Hyper-Threading. Taking those numbers, looking at the past ton of gens, and considering he'd need a mobo + CPU + probably RAM, it seems like a dumb upgrade to go from a 4.5 GHz quad-core with decent thermals, spend 1000 bucks, and end up with a 5 GHz quad/hexa-core that will try to burn itself to a crisp thanks to TIM, regardless of cooling, unless delidded. If the 9900K is indeed an 8-core capable of 5.0-5.3 GHz and soldered (aka actually responding to cooling), then yeah, it's actually a worthwhile upgrade instead of a "well... I guess it's newer and slightly faster" upgrade. If you had a 2600K and applied this comment to anything before the 8700K, yeah, silly comment, but I feel it's relevant if the rumors are true. It sure as hell is the first Intel chip in years that, if the rumors pan out, I am getting one way or the other (or the rest has to also be soldered). Having a TIM-tastic 4670K, I dare say never again. I even contemplated voiding the warranty on day 1 with the 7700K/8700K, because screw this TIM that makes my liquid cooling on par with air cooling (on the CPU), meanwhile the GPU is chilling at 53 degrees at 100% load >.> Never again. So yeah, soldering instead of toothpaste... big deal.
D3M1G0D:

If what las is saying is correct, then the 9900K will only appeal to a very small minority. Competitive gaming is of course a niche market, as is extreme overclocking. I find it funny that he thinks competitive gaming is high-end gaming, LOL.
Well, even if it were priced at the proclaimed $450 before taxes, it is already in the price range where people say: "U Mad Bro?" The average gamer expects a CPU to cost no more than $250, so something like a Ryzen 2600X, and then a GPU under $450, so something like a GTX 1070. If they can get away with cheaper, they are happy. The average gamer cares little about screenshots showing aliasing, so they are fine with a GPU on which they run without AA. Here we could have another debate about 4K BF1 in SLI and its kind-of-borked AA; we can look at screenshots and see how it behaves, but in game it is just fine due to motion and pixel response time. And the same applies to that average user who wants to play the game instead of looking at screenshots of the game.

From @las's posts, I have the bad impression that he is not playing games, he is watching the corner of the screen showing fps (+ he is likely fps-insensitive outside of his numeric obsession). Otherwise he would not say that a move from Sandy/Ivy to Ryzen is a side grade. The user experience is on another level; fps may look similar, but that's where the similarities end. I bet half the forum remembers me complaining about having a gaming console, as I could not even use a browser while gaming without worsening stutter. All of that I can do while a game is running. It is just pure comfort, and the game's performance sees no degradation.
las:

Nah, it will be on the same 1151 v2 platform; even Z370 boards will be able to run the i9-9900K after a firmware update. Probably every single 300-series chipset board will. The 8700K is not soldered, the 9900K is. The 9900K will have higher clocks, 2C/4T more, and more cache. From leaked benchmarks it's close to 30% faster than the 8700K, stock vs stock. This CPU will easily OC to 5.2+, just wait and see. Delidded 8086Ks already hit 5.3-5.4 for 24/7 usage. 5 GHz all-cores will be very easy using air cooling. Intel's crappy thermal paste has been holding back their chips; they are OC beasts. They simply don't hit an OC wall like Ryzen chips. Tons of people are running Intel chips at 5-5.5 GHz on OC.net, for 24/7 usage and without insane overvolting. And they will be soldered. A lot of leaks suggest this, and it's the only way Intel would be able to achieve such high clocks on an 8C part. No way they are using their crappy thermal paste here; it would overheat and throttle.
It's 30% faster in synthetic benchmarks that can utilize all those cores. It needs to be soldered because its TDP is gonna be HIGH: 16 threads. If you actually look at the leaks, it hits 5 GHz boost, same as the 8700K. As for Z370, we'll see how well boards support it. Sounds like very high power draw with 16 threads and 5 GHz boost.
las:

Nah, it needs to be soldered to keep temps down, because of insanely high clock speeds compared to Ryzen while still maintaining the same core count as Ryzen 7. The 8700K does not boost to 5 GHz; the 8086K does, and it's an LE. A delidded 8086K easily does 5.2-5.5 GHz. Do you even OC?
It is gonna have the same clocks as the 8086K (BTW, 8086K vs 8700K shows almost zero performance difference, so ya, good for you).
las:

And I find it funny that you run SLI in 2018. You like lower minimums than a single card, crappy support and bad frametimes I guess.
LOL, I do not use SLI - I do grid computing on my computers when I'm not gaming and use these GPUs independently. At any rate, competitive games are not particularly demanding (most of them can be played comfortably on budget hardware, like a 2400G). High-end gamers are those who play at 4K, those who play demanding AAA games, those who are eagerly anticipating the release of the 1180.
las:

And I find it funny that you run SLI in 2018. You like lower minimums than a single card, crappy support and bad frametimes I guess.
Keeping to Intel's side is fine and all, but there's no need to offend other users' setups. People here who run SLI know what they're doing. I am glad to hear an opposing opinion, but again, there are no prices or benchmarks yet, only rumors. No need to get too attached to them.
las:

Nah, it needs to be soldered to keep temps down, because of insanely high clock speeds compared to Ryzen while still maintaining the same core count as Ryzen 7. The 8700K does not boost to 5 GHz; the 8086K does, and it's an LE. A delidded 8086K easily does 5.2-5.5 GHz. Do you even OC?
First Law of Thermodynamics. @HardwareCaps is right. And you just said he is not right, but then paraphrased his post, confirming he is right πŸ˜€
Fox2232:

First Law of Thermodynamics. @HardwareCaps is right. And you just said he is not right, but then paraphrased his post, confirming he is right πŸ˜€
Proof:
https://www.anandtech.com/show/12945/the-intel-core-i7-8086k-review/12
https://www.tomshardware.com/reviews/intel-core-i7-8086k-cpu-8086-anniversary,5658-7.html
https://www.tomshardware.com/reviews/intel-core-i7-8086k-cpu-8086-anniversary,5658-6.html
https://www.anandtech.com/show/12945/the-intel-core-i7-8086k-review/11
Both hit 4.9~5 GHz, the 8700K too BTW... which proves my second point: same gaming performance, just more money and heat. Intel has already maxed out their IPC and clocks on the 14nm process node; if you don't need the extra cores, it's pointless.
las:

Nah, it needs to be soldered to keep temps down, because of insanely high clock speeds compared to Ryzen while still maintaining the same core count as Ryzen 7. The 8700K does not boost to 5 GHz; the 8086K does, and it's an LE. A delidded 8086K easily does 5.2-5.3 GHz. Some do 5.4-5.5. We're talking 24/7 here. A soldered 8C will do 5 GHz all-cores with ease. Soon you'll see.
Toms even added a clock comparison for the 8700K vs the 8086K (which costs more, of course): https://www.tomshardware.com/reviews/intel-core-i7-8086k-cpu-8086-anniversary,5658-3.html As you can see, the June 2018 8700Ks are very close in clocks/voltage to the 8086K, and in real-world gaming it performs within the margin of error. Hitting 5 GHz with 8 cores won't help gaming performance; most games can't take advantage of the extra cores. That's the whole point.
las:

You have zero clue. The 8086K is better binned and reaches ~200 MHz more on average. It's also faster stock vs stock (stop looking at GPU-bound gaming like a casual gamer, please; this is the reason people think Ryzen is "good enough" for gaming). Building a rig for 144-240 Hz gaming is a completely different matter, unless you don't want to get the most out of the monitor of course, then feel free to buy Ryzen and see lower fps all over. "Maxed out their IPC" LOL! It scales very well past 5 GHz. This is why high-fps gamers aim for 5.2-5.4 GHz on these chips: Silicon Lottery / Caseking "Golden Samples" or luck of the draw. More than 60% of all 8086Ks hit 5.2+ post-delid; sub-20% of 8700Ks hit 5.2+. There's a night-and-day difference between CFL 6C at 5.2 GHz and Ryzen 7 at 4.2 GHz in CPU-bound gaming, this is fact. We're talking 25-50% higher minimum fps in most games. Do you even have experience with 144-240 Hz gaming monitors? All games become very CPU-bound when aiming for very high fps, and Ryzen falls very short here. I suspect most people in this thread are 60 Hz gamers, or think "30 fps is more than enough". Hahaha. Casuals, begone.
Wow, IPC scales with clocks? Wth, it is PER CLOCK; what you wrote makes no sense. You can go and spend $500 on the 9900K; there's nothing I can do to convince you, but there will be minimal improvements over an OCed 8700K (no one talked about Ryzen...). They tested Rocket League, it was in my first link, with 260 fps there. Also Civ 6, which is CPU-intensive. Still the same performance...
Fox2232:

Jokes aside, I am yet to run into a CPU-limited situation.
My 6700K is definitely bottlenecking my 1080 Ti. The first team to make a CPU that doesn't overheat and scales well with the 1080 Ti gets my money. So far AMD is far from that point.
Fox2232:

Jokes aside, I am yet to run into a CPU-limited situation.
You don't play a lot of games, then. There are way too many games where Ryzen is significantly slower than Intel's architecture. As a competitive gamer used to 165 Hz, frame drops to even 120 are jarring to me; I want my minimum frames to match my refresh rate. Ryzen is usually fine in AAA games like Battlefield, where GPU performance is always the limiting factor, but in games like Fortnite, Ryzen is just not cutting it for me. Games like those, plus older games that are limited to 1 or 2 threads, will always be much slower; this will always be the case unless Ryzen 2 is significantly faster.
Agent-A01:

You don't play a lot of games, then. There are way too many games where Ryzen is significantly slower than Intel's architecture. As a competitive gamer used to 165 Hz, frame drops to even 120 are jarring to me; I want my minimum frames to match my refresh rate. Ryzen is usually fine in AAA games like Battlefield, where GPU performance is always the limiting factor, but in games like Fortnite, Ryzen is just not cutting it for me. Games like those, plus older games that are limited to 1 or 2 threads, will always be much slower; this will always be the case unless Ryzen 2 is significantly faster.
Nope, you just forgot that I have an AMD GPU, which at this moment means it is the limiting factor everywhere. Because if it keeps minimum fps over 120, I just turn up the details and enjoy more visuals. But those who make those ridiculous claims of "no more than 100 fps", "no more than 75 fps", or 'thinks "30 fps is more than enough"'... they are just deranged. Or they attempt to beat people into submission with arguments of the type: "If you disagree, you are a tech noob/loser/pleb..."
Fox2232:

Nope, you just forgot that I have an AMD GPU, which at this moment means it is the limiting factor everywhere. Because if it keeps minimum fps over 120, I just turn up the details and enjoy more visuals. But those who make those ridiculous claims of "no more than 100 fps", "no more than 75 fps", or 'thinks "30 fps is more than enough"'... they are just deranged. Or they attempt to beat people into submission with arguments of the type: "If you disagree, you are a tech noob/loser/pleb..."
Have you considered going Nvidia? I have both Polaris and Pascal cards, but game almost exclusively on Pascal (I tried briefly on a RX 580, and it reminded me of the old GTX 780 days :P).
D3M1G0D:

Have you considered going Nvidia? I have both Polaris and Pascal cards, but game almost exclusively on Pascal (I tried briefly on a RX 580, and it reminded me of the old GTX 780 days πŸ˜›).
I think the performance difference would have to be like 500%+ before Fox would switch to Nvidia and even then he'd have to shower after each gaming session to wipe away all the guilt. πŸ˜›
Denial:

I think the performance difference would have to be like 500%+ before Fox would switch to Nvidia and even then he'd have to shower after each gaming session to wipe away all the guilt. πŸ˜›
LOL. Yeah, I can understand not supporting Nvidia (I'm a customer and an investor, but I don't like their anti-consumer tactics). I bought a G-Sync monitor last year and felt guilty about it, supporting proprietary tech and all. I'd like to get my hands on a Vega GPU, and perhaps a FreeSync monitor to go with it, but prices are still a bit too high.
D3M1G0D:

Have you considered going Nvidia? I have both Polaris and Pascal cards, but game almost exclusively on Pascal (I tried briefly on a RX 580, and it reminded me of the old GTX 780 days πŸ˜›).
Yep, a month ago. Then I said a big no again, as I have for quite a few years. Considering my plans for more FreeSync monitors... and other technologies in the works, it would be quite costly. Plus I really dislike Nvidia, likely 10 times as much as before, since I learned how Huang felt so good about the worst event in the GPU industry ever, which he orchestrated. So their love for proprietary tech and ignorance of open standards is just the cherry on the cake.