Core i9-10900K can boost to 5.3 GHz, more specifications of 10th Gen Core Comet Lake-S leak

Keeping this CPU properly cooled is probably not going to be easy... Wonder if Intel will still be plagued by an onslaught of discovered vulnerabilities once this hits the market.
anticupidon:

And what about people who create content and do everything except gaming; does your opinion still stand?
No; productivity on Ryzen looks the same or sometimes even a little better, except for virtual machines - virtualization on Intel is still way better.
oxidized:

I mean, regarding PCIe bandwidth usage it's not my opinion, it's just facts: even the fastest card on the planet won't saturate 8 lanes of Gen 3 PCIe.
It made a difference for the 5500 XT, given the recent fiasco regarding that.
Wi-Fi 6, exactly what I was waiting for to upgrade my desktop /s. What else? Why would anyone go with this instead of Ryzen? An extra 10 fps in older games?
And the next squeeze out of this chip, with no fat left? Boy oh boy, 5.3 GHz on 2 of them; let's see what the heat will be like, and will the warranty on this chip be 1 year?
@squalles And? That doesn't mean it's the biggest share of the market. Of all gamers, more than 80% are on 1080p/60 (maybe even 720p in certain countries); only a small part goes up to 100/120 Hz, even fewer do 120+, and even fewer 144+ Hz. That's not even talking about the fact that this really only affects shooter games (and maybe some simulators), an even smaller group of people (out of all gamers). So yeah, the 9900 is a great CPU at more than double what I paid for my 3600, which can run Siege at 1440p/75 Hz with maxed settings incl. TAA x4, and with Fast Sync I get a steady 120+.
fry178:

@squalles And? That doesn't mean it's the biggest share of the market. Of all gamers, more than 80% are on 1080p/60 (maybe even 720p in certain countries); only a small part goes up to 100/120 Hz, even fewer do 120+, and even fewer 144+ Hz. That's not even talking about the fact that this really only affects shooter games (and maybe some simulators), an even smaller group of people (out of all gamers). So yeah, the 9900 is a great CPU at more than double what I paid for my 3600, which can run Siege at 1440p/75 Hz with maxed settings incl. TAA x4, and with Fast Sync I get a steady 120+.
Wrong. Whoever buys such an expensive top-end processor obviously wants more framerate. If you're talking about Full HD at 60 Hz, then a Ryzen 3600X or i5 9400 is sufficient; there's no reason to compare octa-core and such expensive processors. Obviously the Ryzen 3700X and i9 9900K have a specific audience.
oxidized:

ROFL, video cards nowadays don't even manage to completely saturate PCIe 3.0 x8 lanes, and "Intel will be completely dead on the desktop" made my day. But then again, you have 16 messages, so you're probably new to this world.
Actually, I'm not sure about Nvidia, but there's already evidence that even the 5500 is in need of more bandwidth on x8 lanes (that's all the GPU provides). It's not a particularly impressive GPU by any standard, and there is a measurable performance difference when you compare it on PCIe 3.0 vs 4.0 (a 4.0 x8 slot has roughly the same bandwidth as a 3.0 x16 slot, and the 5500 is a PCIe 4.0 GPU). Seeing as Nvidia is making substantially more powerful GPUs, I think it's safe to assume that yes, they will in fact be using PCIe 4.0. This is especially true for multi-GPU setups, which they are still working with on servers and workstations.
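For anyone who wants to sanity-check that "4.0 x8 ≈ 3.0 x16" point, here is a minimal sketch using the nominal per-lane PCIe rates (the GB/s-per-lane figures below are the usual public spec numbers, not something measured in this thread):

```python
# Approximate one-directional PCIe bandwidth per lane, after 128b/130b encoding.
# These per-lane figures are nominal spec values, used here only for a rough comparison.
GBPS_PER_LANE = {"3.0": 0.985, "4.0": 1.969}  # GB/s per lane

def link_bandwidth(gen: str, lanes: int) -> float:
    """Rough link bandwidth in GB/s for a given PCIe generation and lane count."""
    return GBPS_PER_LANE[gen] * lanes

print(f"PCIe 3.0 x16: {link_bandwidth('3.0', 16):5.1f} GB/s")  # ~15.8 GB/s
print(f"PCIe 4.0 x8 : {link_bandwidth('4.0', 8):5.1f} GB/s")   # ~15.8 GB/s, about the same
print(f"PCIe 3.0 x8 : {link_bandwidth('3.0', 8):5.1f} GB/s")   # ~7.9 GB/s, what an x8 card gets on a 3.0 board
```

The takeaway is simply that an x8 card on a PCIe 3.0 board runs at half the link bandwidth it would get on 4.0, which is consistent with the 5500 results mentioned above.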
squalles:

Amazing, I will wait to buy one. I've seen with my own eyes an i9 9900K vs a Ryzen 3700X (my friends had both): the Ryzen 3700X is really weak and the i9 is so much stronger, but it heats like hell, so I chose to wait for the next generation.
So I'll state my case here. I do a lot of game-content production; that's my primary purpose, it seems. As of this summer I needed to replace my dated, tired old 4790K with something with more cores. The i9 9900K was $500, decent motherboards were generally to the tune of $30~50 more than equal-featured X470/X570 AMD boards, plus you NEED a water cooler or it's going to be throttling constantly, so up goes the price! The R7 3700X by comparison was $330, plus a $50 discount on the motherboard, in addition to boards being generally cheaper. I can use the stock cooler, which is relatively quiet and also doesn't block my RAM slots, won't LEAK and destroy the system, and doesn't sound like an aquarium is in here (I'm used to fan noise). Considering I need reliability, that precludes me from water cooling and therefore from getting the most out of a 9700K/9900K processor. I saved somewhere to the tune of $200~250 while having a quieter system.

I don't care about the last 1~8% of FPS I could get by spending another $250 on the processor/cooling combo; I care that my AMD system was a great deal and very powerful at the same time, with all the features I need. I can even upgrade the processor down the road - something I could NOT do on the last Intel system I had. I also have PCI-E 4.0 for next-gen devices (should I buy one). I didn't want a platform that was a year or two old at that point - I wanted something that had just come out with the newest this and that, to keep my machine relevant the longest.

I don't do competitive online multiplayer high-FPS shooter games; I take my real gun to a real range and practice with it. These types of games lose their appeal when you have satellite internet (out in the boonies) and you start to get close to or hit 40 years old. Never really was crazy about the modern G.I. Joe games* these days, though, for the record (sorry, I didn't deliberately attempt to offend, but I could see it happening with someone). Everyone's use case differs, but I'd certainly hope the Intel beats the AMD chips (listed above) in *something*, because it cost somewhere around $200 more.

I will say the difference between this R7 3700X (3000 MHz bargain-bin Micron CL15 RAM) and my old 4790K (with 2400 MHz CL11 RAM) in content production and general desktop usage is night and day. Switching tasks is effortless, zip/unzip operations finish many times faster, and working with models in Maya or Blender and with graphics/rendering in Substance Designer is better by a long shot and well worth the price of entry. I absolutely love the system: there are zero issues with it, no disco-tech (a bit off-topic), and not much noise either (and no fish-tank noises). My delidded 4790K could barely keep its turbo speeds with liquid metal above and below the IHS and a $100 air cooler on it. Overclocking MY HIDE! Sheesh! The AMD will overclock itself when and where needed and back down if it ever hits 80C (rarely). I can't be bothered to test and re-test the stability of an overclock when I could spend the time earning money doing content-production-related stuff.

Now if Intel could sell me a system that let me upgrade as easily, on a platform as new as X570 was this past summer (and still is), without all the heat issues, and still undercut the competition, I'd be willing to consider it just as anyone else would.

*Note, for the record, I still enjoy some old MS-DOS DOOM / Duke 3D / ROTT and even Quake II RTX these days, so shooter games aren't totally out. The old ones were fun though, back when I was a teenager in the 90s. I DO game on this; it is always up to the task and runs anything I throw at it with my RTX 2070 Super and bargain-bin 32 GB of RAM, without hesitation.

Wait a minute, you need a $100+ water cooler setup for the 8086K / 8700K / 9700K / 9900K to get the most out of it? Why didn't you just buy a better CPU with more cores / a better platform, or move to X299 at that rate, or spend less on the CPU and more on the GPU? That all being said, to avoid sounding snide: for those who bought the 9900K or 9700K or similar, DO enjoy your systems; you'll get years out of them yet regardless. But always remember someone has to MAKE the games you play, and those folks (if they are on a budget, especially indie studios/developers) might just select a Ryzen instead - just as the servers serving this message, or someone's online multiplayer game server, might soon often be running an EPYC processor.
Intel needs to stop wasting people's time with 14 nm tech. It's no wonder AMD is obliterating them right now.
bobblunderton:

So I'll state my case here. I do a lot of game-content production; that's my primary purpose, it seems. As of this summer I needed to replace my dated, tired old 4790K with something with more cores. [...]
It's a bit relative; my friend with a Ryzen 3700X and an RTX 2080 gets the same framerate as another with an i9 9900KF and an RTX 2070 (non Super), so 200 dollars of investment thrown away. But yes, for your purpose it looks like a good investment.
squalles:

Amazing, I will wait to buy one. I've seen with my own eyes an i9 9900K vs a Ryzen 3700X (my friends had both): the Ryzen 3700X is really weak and the i9 is so much stronger, but it heats like hell, so I chose to wait for the next generation.
At least compare two CPUs that are priced the same: the i9-9900K at $490 vs the AMD 3900X at $499. Then see with your own eyes who's stronger. https://media2.giphy.com/media/MeJCv5Ub2nu00/giphy.gif
Corbus:

Damn, so Intel turned out like AMD after all when it comes to overclocking. Manual OC is dying fast.
Yep, there will probably be a tiny bit of room left that will likely not be worth the effort and the power/heat that comes with it. It started with GPUs, and now it's made its way into CPUs as well. On one hand it's good for the super noobs, as they will get what they pay for out of the box, but for "mid-range" enthusiasts who won't go full H2O/LN2, it's a definite loss. Gone are the days of an i7 920 OC'd to 4.2 GHz and the like. 😳
Denial:

It made a difference for the 5500 XT, given the recent fiasco regarding that.
schmidtbag:

Actually, I'm not sure about Nvidia, but there's already evidence that even the 5500 is in need of more bandwidth on x8 lanes (that's all the GPU provides). It's not a particularly impressive GPU by any standard, and there is a measurable performance difference when you compare it on PCIe 3.0 vs 4.0 (a 4.0 x8 slot has roughly the same bandwidth as a 3.0 x16 slot, and the 5500 is a PCIe 4.0 GPU). Seeing as Nvidia is making substantially more powerful GPUs, I think it's safe to assume that yes, they will in fact be using PCIe 4.0. This is especially true for multi-GPU setups, which they are still working with on servers and workstations.
If the 5500 XT uses more bandwidth than a much more powerful card, take it up with the card itself, not with the bus. https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-pci-express-scaling/ Within margin of error most of the time with the most powerful card on the market.
omagic:

My yawnometer just hit the red area.
I've no shits to give.
Angantyr:

Wonder if Intel will still be plagued by an onslaught of discovered vulnerabilities once this hits the market.
Put money on it.
oxidized:

If the 5500 XT uses more bandwidth than a much more powerful card, take it up with the card itself, not with the bus. https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-pci-express-scaling/ Within margin of error most of the time with the most powerful card on the market.
The 5500 XT shows more of a difference because the 4 GB edition is VRAM-starved. It's a niche scenario, and arguably that situation should be avoided, but it occurs, and the bus can help alleviate it if it's available; there is effectively no downside to having it available. Even in the example you posted, though, games like Hellblade show a 14% increase in performance, Wolfenstein 2 shows a 12% increase, and Wildlands shows a 6% increase. The rest are within margin of error, but those examples aren't. So now we have numerous examples of specific titles saturating that bus. Theoretically, a next-generation card with more performance would increase that difference - so if someone were buying a new CPU/motherboard now with the intention of keeping it through multiple generations of GPUs (and going by the Sandy Bridge people on Guru3D, a number of people do this), I'd definitely recommend PCIe 4.0 for that person.
I can only imagine how many watts these will suck. An average 8700K would run 200 W in Blender at 5 GHz. Some 9900K users can't even stress test their OCs properly because of the CPU or VRMs throttling, even with Intel's optimizations and STIM. Now add 4 more threads and try that 5.3 GHz on all 10 cores. I feel they should just start making a special interface for CPUs as well, just like for GPUs, so we can have direct-to-die factory cooling capable of handling 300-400 W just fine, and for the VRMs too.
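A very rough back-of-the-envelope of where that could land, assuming dynamic power scales roughly with core count × frequency × voltage² (the 200 W / 5.0 GHz baseline is the 8700K figure from the post above; the voltage bump is a pure guess for illustration):

```python
# Back-of-the-envelope dynamic power scaling: P ~ cores * f * V^2.
# Baseline: ~200 W for a 6-core 8700K in Blender at 5.0 GHz (figure from the post above).
# The voltage ratio is an assumed ~5% bump for 5.3 GHz, purely illustrative.
p_base, cores_base, f_base = 200.0, 6, 5.0    # watts, cores, GHz

cores, freq = 10, 5.3                         # hypothetical 10900K all-core scenario
v_ratio = 1.05                                # assumed relative vcore increase

p_est = p_base * (cores / cores_base) * (freq / f_base) * v_ratio ** 2
print(f"Estimated package power: {p_est:.0f} W")  # ~390 W, in the 300-400 W ballpark above
```

Crude as that scaling is, it lands right in the 300-400 W range, which is why the direct-to-die cooling idea doesn't sound crazy.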
Denial:

The 5500 XT shows more of a difference because the 4 GB edition is VRAM-starved. It's a niche scenario, and arguably that situation should be avoided, but it occurs, and the bus can help alleviate it if it's available; there is effectively no downside to having it available. Even in the example you posted, though, games like Hellblade show a 14% increase in performance, Wolfenstein 2 shows a 12% increase, and Wildlands shows a 6% increase. The rest are within margin of error, but those examples aren't. So now we have numerous examples of specific titles saturating that bus. Theoretically, a next-generation card with more performance would increase that difference - so if someone were buying a new CPU/motherboard now with the intention of keeping it through multiple generations of GPUs (and going by the Sandy Bridge people on Guru3D, a number of people do this), I'd definitely recommend PCIe 4.0 for that person.
As long as it's only a minority showing any difference, there's honestly no argument to be made for this, and you can't know how next-gen cards will react to that; you can hypothesize, but nothing's certain. Besides, next gen will only see one card faster than the 2080 Ti, and that's probably the 3080 Ti (or whatever it'll be called); the rest will be the same. Anyway, I'm not saying we won't need PCIe 4.0 or 5.0 eventually, it's just not now, and it won't be in 2020 either, because cards won't magically be double the performance we have now, or even 1.5x.
Intel is a dead horse.