Core i9-10900K CPU Score Spotted in 3DMark - let's chart that up

NewTRUMP Order:

I find this entertaining: every time AMD puts out a new CPU, Intel's answer is to push more power, a whopping 125 watts, through their old processor and call it "new".
Don't forget that the TDP of 125W is without turbo, so this baby will heat your house during winter and give you a heat stroke during summer 😉.
Umm, either that's not a good score, or the 10980XE is really a pile of trash. Looks like the Time Spy CPU score is not much use in determining anything, which has been known by anyone doing any overclocking for years. If anything, it shows it can hold a nice high all-core boost for the minute or so the test runs. Now let's test with the Blender Gooseberry render test...
I do find it humorous when large core count CPUs are run in single-threaded tests and lots of conclusions are drawn from that result. I'm sure this will be similar to the 9900K in that it will win out in single-threaded tests and lose to AMD in multithreaded tests. I think that's very safe to assume since Intel made no architecture-level changes. This CPU likely won't be that special for multithreading, as I see no way you can run those 10 cores at higher frequencies without an extremely exotic cooler.
I suppose the more pertinent question is "how much power does it draw"...?
Intel need to pull their finger out. I love Intel, but this CPU to me isn't worth it; I look at it as an overclocked older CPU (nothing worth talking about). Ever since I had the AMD FX-8350 it was dead for years with no worthy upgrades, then I moved to the Intel 4790K and now the 8700K, but if Intel keep up this track record of high TDP and 14nm then I'll end up back where I started with AMD. At the moment I'm happy enough with the 8700K for at least one more year, but I could be happier sooner rather than later. I hope that when Intel release a proper 10nm CPU for gaming it will have 8+ cores and much lower temps and TDP under load.
Glottiz:

This is a mighty impressive showing from Intel. It beats an 800 EUR 16c/32t CPU and almost beats a 1500 EUR 24c/48t CPU. Now if the pricing of the 10900K is in the 550 EUR range, it will be amazing bang for buck. EDIT: While I expect these high-thread-count AMD CPUs might still come out on top in pure rendering workloads, the 10900K looks to be a killer CPU for gaming and general enthusiast home use.
You must be a huge troll, really.
Intel, the game is over with your 300W hot chebureks..
BReal85:

You must be a huge troll, really.
Why is he a troll? Intel IS faster in CPU-bound games, and Zen 2 is faster per core in Cinebench. Different CPUs for different uses 🙂 I have a 3900X and a 9900K, and love both 😉
nizzen:

Why is he a troll? Intel IS faster in CPU-bound games, and Zen 2 is faster per core in Cinebench. Different CPUs for different uses 🙂 I have a 3900X and a 9900K, and love both 😉
Yeah, because like gamers... that bunch of dumb shits that play dopamine-hit games like Apex Legends for the loot boxes, hahaha. nizzen, nobody gives a shit about whether the game is CPU-bound or whatever, OK? They just buy the damn game and play the fucker. You know, two-faced prick, when Intel was top dog it was fine for them to brag about Cinebench scores, but now that AMD is in the lead, just by a little, you all disregard the Cinebench results and say crap like the above. It's called double standards, mate. My advice to you is to keep lobbing that asshole up, because if Intel does beat Zen 3 they are going to continue to fuck people with prices.
The only thing I find particularly impressive about this Intel CPU is how Intel keeps pushing their 14nm++++++ process. It actually shows that their architecture was, or is, really solid. It also shows that they were just playing around with customers by releasing old, crappy tech at really high prices. But aside from that, Intel's 14nm process is honestly amazing; even 5.3 GHz on a single core is already impressive, and having 10 cores running at very nice high clocks on a fairly old process is also amazing. Then again, Ryzen CPUs are superior not just on price/performance: they draw less energy, they support new tech like PCIe 4.0, and they perform better clock for clock as a CPU package. There is only a single difference, which is gaming, and even there AMD is really, really close, if not equal, when playing anything above 1440p, and anyone pairing a 2080 Ti plus a 3900X or 9900K with a 1080p monitor is out of touch with reality.
neikosr0x:

The only thing I find particularly impressive about this Intel CPU is how Intel keeps pushing their 14nm++++++ process. It actually shows that their architecture was, or is, really solid. It also shows that they were just playing around with customers by releasing old, crappy tech at really high prices. But aside from that, Intel's 14nm process is honestly amazing; even 5.3 GHz on a single core is already impressive, and having 10 cores running at very nice high clocks on a fairly old process is also amazing. Then again, Ryzen CPUs are superior not just on price/performance: they draw less energy, they support new tech like PCIe 4.0, and they perform better clock for clock as a CPU package. There is only a single difference, which is gaming, and even there AMD is really, really close, if not equal, when playing anything above 1440p, and anyone pairing a 2080 Ti plus a 3900X or 9900K with a 1080p monitor is out of touch with reality.
Another reason to stay away from the blue side for a while.....
Yikes. Toxicity on hardware websites is at an all-time high.
Glottiz:

Yikes. Toxicity on hardware websites is at an all-time high.
Toxic3d.com/ noobs3d.com
Get a life, people. Find some meaning. Regardless of how much you cry, you will never make as much money as the CEOs of Intel, Nvidia and AMD, nor even their group directors or anyone in a position of critical responsibility.
1) It's about time, Intel; this is the CPU they should've released last year. 2) At the expected price tag, the only "gamers" who acquire this CPU are in esports (where the sponsor buys the product) or are fanboys, even if they won't say so themselves. The fractional difference between this CPU and a $300 Ryzen can and will be made up for by getting a better GPU... like $200, or whatever the price differential actually is. 3) Intel is going to lose as much ground in mobile as they have in desktop, as these power pigs cannot compare with the new-gen Ryzen mobile chips.
Angushades:

Yeah, because like gamers... that bunch of dumb shits that play dopamine-hit games like Apex Legends for the loot boxes, hahaha. nizzen, nobody gives a crap about whether the game is CPU-bound or whatever, OK? They just buy the damn game and play the fucker. You know, two-faced prick, when Intel was top dog it was fine for them to brag about Cinebench scores, but now that AMD is in the lead, just by a little, you all disregard the Cinebench results and say crap like the above. It's called double standards, mate. My advice to you is to keep lobbing that asshole up, because if Intel does beat Zen 3 they are going to continue to frack people with prices.
Did you burn yourself in Australia? Is that why you are so toxic? πŸ˜› Love from Norway πŸ™‚ Looks like you need it πŸ˜‰
@nizzen Supporting NvIntel causes RagnarΓΆk.
kohashi:

Seriously?! Can this even be considered a serious comparison when AMD is paired with DDR4-2400? Especially when you know how much better AMD performs with higher memory frequencies. AMD: 4 x 8 GiB DDR4-2400 vs Intel: 4 x 16 GiB DDR4-2666
^^^^^^^ This, a million times this! We all know how Ryzen scales with memory. So why was 2400 MHz used? From what I understand, the default spec on the box is 3200, so they didn't even use the recommended memory? God knows what they did to the memory timings, which aren't even published.
Am I missing something? It does not even beat the 3900X in the Extreme test, probably uses 50-100 W more, costs more I guess, and needs a new socket. Sticking to my plan of waiting for the 4000-series release to buy a hopefully discounted 3900X.
TLD LARS:

Am I missing something? It does not even beat the 3900X in the Extreme test, probably uses 50-100 W more, costs more I guess, and needs a new socket. Sticking to my plan of waiting for the 4000-series release to buy a hopefully discounted 3900X.
It doesn't cost more.