New CacheOut Speculative Execution Vulnerability Hits Intel Processors

Evildead666:

Hey, at least the older CPUs like my i5-3570K aren't involved for once 🙂
2500K user checks in. Old Intel CPU gang rejoice!
karma777police:

Honestly do not care about any of these security issues.
You should. How many rogue VMs do you think are trying to exploit these vulnerabilities right now, simply to gain access to data from other VMs on the same system?
ngoni615:

But the older CPUs are also affected, lol, including the 3570K. It is a "Core" series CPU, so you are not spared. lol One of the many reasons why I decided to go AMD.
Only Intruder:

Yup, and this includes the Core2 line as well, so ye olde Q6600 as an example... Heck, when the Spectre and Meltdown software mitigations were released, they tanked performance on the Q6600 tremendously; so much so that a 3.5GHz overclock was necessary to maintain 2.4GHz stock performance. It really did cut performance by about 40%.
If you actually read the article, this CacheOut problem only applies to Skylake CPUs and newer: Core 6xxx series and up, not down. Check this list: https://software.intel.com/security-software-guidance/insights/processors-affected-l1d-eviction-sampling Ivy Bridge is NOT affected.
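For anyone who wants to sanity-check their own chip against that idea in code, here is a minimal sketch. It maps an Intel family/model pair against a small, assumed subset of Skylake-and-newer model IDs; the helper name and the exact model list are illustrative only, and Intel's linked list remains the authoritative source:

```python
# Illustrative sketch: decide whether an Intel family/model pair falls in
# an (assumed) Skylake-or-newer set. Partial, hypothetical subset only --
# check Intel's own affected-processors list for the real answer.
SKYLAKE_AND_NEWER_MODELS = {
    0x4E: "Skylake (mobile)",
    0x5E: "Skylake (desktop)",
    0x8E: "Kaby/Whiskey Lake (mobile)",
    0x9E: "Kaby/Coffee Lake (desktop)",
}

def possibly_affected(family: int, model: int) -> bool:
    """Return True if the CPU is in our assumed affected set.

    Only Intel family 6 parts from Skylake onward are listed, so e.g.
    Ivy Bridge (family 6, model 0x3A, which covers the i5-3570K)
    returns False.
    """
    return family == 6 and model in SKYLAKE_AND_NEWER_MODELS
```

On Linux, the family and model numbers to feed in can be read straight from /proc/cpuinfo.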
squalles:

It's true; maybe after 3 or 4 more security fixes and the lost performance, AMD can finally beat the i7 8700K.
AMD was beating the 8700K even with the 2700X. Zen2 is eating the 8700K in almost every way.
AMD must love this free advertising :P
Oh hell... here we go again! WTF Intel!?! All of the performance gains over the years are being negated by these hardware-level issues. Sad as hell. Just crazy that AMD is untouched by any and all of these issues... Conspiracy!?! I hope not....
It's fortunate for us that AMD is making really good CPUs these days.
Not one attack or malware in the wild has exploited any of these vulnerabilities since Spectre/Meltdown. The latest CA microcodes for my 8700K were kind of disastrous: they reduced the Uncore multiplier to 43x at defaults and dropped about 4-5% performance in a few benchmarks, both CB versions included. It's the first microcode to have such a dramatic effect. I'm planning to just return to the previous BIOS and stop patching these things. It's getting ridiculous, especially if you don't really have super-secret info you could lose.
Gomez Addams:

There is NO case where that could happen. In addition, I have not read of any exploit where the data acquired was from the same process as the exploit; it is always left over from the context switches of other processes. I have not read that disabling HT can mitigate the attack either; it just might make it occur less frequently, since fewer threads run simultaneously when HT is disabled. Personally, I view this as a chicken-and-egg type of problem: there will be no exploits of this nature whatsoever if no malicious code ever runs on your machine. That is the place to take preventative measures.
Maybe you should have read that paper ...
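For anyone unclear on how "leftover" cache state can leak anything at all, here is a toy, purely software simulation of a flush+reload style channel. Nothing here touches real hardware timing: a set stands in for the cache, and a hit/miss boolean stands in for the fast/slow access a real attacker would measure with rdtsc. All names are made up for illustration:

```python
# Toy simulation of a flush+reload style cache covert channel.
# A set models which "lines" are cached; on real hardware the attacker
# distinguishes hit vs miss by timing the access, not by a boolean.
class ToyCache:
    def __init__(self):
        self.lines = set()

    def flush(self, line):
        self.lines.discard(line)       # evict the line (clflush analogue)

    def access(self, line):
        hit = line in self.lines       # a hit would be fast on real hardware
        self.lines.add(line)           # any access caches the line
        return hit

def transmit(cache, secret_bits, probe_lines):
    """Victim side: touching line i leaks bit i as a future cache hit."""
    for bit, line in zip(secret_bits, probe_lines):
        if bit:
            cache.access(line)

def receive(cache, probe_lines):
    """Attacker side: re-probe each line and read hits back as bits."""
    return [1 if cache.access(line) else 0 for line in probe_lines]

cache = ToyCache()
lines = ["line0", "line1", "line2", "line3"]
for l in lines:
    cache.flush(l)                     # FLUSH phase
transmit(cache, [1, 0, 1, 1], lines)   # victim code runs in between
leaked = receive(cache, lines)         # RELOAD phase recovers [1, 0, 1, 1]
```

The point of the simulation is that the attacker never reads the victim's memory directly; the secret crosses over entirely through which lines are left cached, which is why the data can come from another process or VM.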
squalles:

No, the 2700X is completely destroyed, and even the 3900X can hardly be seen beating the 8700K: https://www.techpowerup.com/review/amd-ryzen-9-3900x/15.html
Well, that bold part actually translates to a 10% difference. If that means "completely destroyed", what should someone with a 10% higher IQ than you say about you? You are linking the gaming category and ignoring all other uses. The same applies to the 3900X, except that the 3900X is not even a gaming CPU; it is a productivity CPU, and there it comes in cheaper than the i9-9900K. Tell me which is better: the i9-9900K, because it delivers 5% higher fps on average than a 3900X @3.8GHz? Or the 3900X, which finishes all your productivity work through the week quite a bit faster and lets you play 6 more hours? And mind that today a 3900X boosts all of its 12C/24T to 4.175~4.2GHz under full load, and that you can actually game while running quite a bit of stuff in the background, so your actual gaming time is not just a few hours more than with the i9-9900K. Also, the TPU test for Zen2 used 3200MHz memory => 1600MHz IF, instead of today's quite easily achievable 1800MHz+ (I got there with 4 memory sticks).
toyo:

Not one attack or malware in the wild has exploited any of these vulnerabilities since Spectre/Meltdown. The latest CA microcodes for my 8700K were kind of disastrous: they reduced the Uncore multiplier to 43x at defaults and dropped about 4-5% performance in a few benchmarks, both CB versions included. It's the first microcode to have such a dramatic effect. I'm planning to just return to the previous BIOS and stop patching these things. It's getting ridiculous, especially if you don't really have super-secret info you could lose.
Ignorance is bliss? I remember when NCIX went out of business and read an article several years later about how customer data was being sold off. Then there was the Newegg hack which stole payment information from customers as they were checking out. My credit card was implicated in both and I immediately cancelled it - nobody had tried to use it yet, but I felt it was better to get rid of it anyways since I knew it had been compromised. Those who leave a known exploit unpatched are basically deciding not to cancel a compromised credit card. It doesn't matter if someone's tried to use it or not, if you know it's been compromised then you should do something about it (and if the cost is a bit of performance loss then so be it). The argument of "I don't have anything important on my computer" is also moot - if you do online banking or have website login information stored on your PC (as anyone who uses their PC frequently undoubtedly will) then you are putting that information at risk.
D3M1G0D:

Ignorance is bliss? I remember when NCIX went out of business and read an article several years later about how customer data was being sold off. Then there was the Newegg hack which stole payment information from customers as they were checking out. My credit card was implicated in both and I immediately cancelled it - nobody had tried to use it yet, but I felt it was better to get rid of it anyways since I knew it had been compromised. Those who leave a known exploit unpatched are basically deciding not to cancel a compromised credit card. It doesn't matter if someone's tried to use it or not, if you know it's been compromised then you should do something about it (and if the cost is a bit of performance loss then so be it). The argument of "I don't have anything important on my computer" is also moot - if you do online banking or have website login information stored on your PC (as anyone who uses their PC frequently undoubtedly will) then you are putting that information at risk.
Many of the vulnerabilities require admin access to the PC. I am the only one accessing it. My info has not been compromised in any way in 15+ years, because I understand how the web works. My plan is to keep the Meltdown/Spectre/L1TF/MDS patches in place, because they have basically zero impact on performance on my PC. I'm done with patching whatever comes next, though. I'll try the new microcode for CacheOut when it's released, and if the performance hit is still there, I'm reverting back to the B4 microcode. It will do fine for the 1-2 years until I replace the 8700K, and in the end, it's a very, very small risk I am willing to assume.
toyo:

Many of the vulnerabilities require admin access to the PC. I am the only one accessing it. My info has not been compromised in any way in 15+ years, because I understand how the web works. My plan is to keep the Meltdown/Spectre/L1TF/MDS patches in place, because they have basically zero impact on performance on my PC. I'm done with patching whatever comes next, though. I'll try the new microcode for CacheOut when it's released, and if the performance hit is still there, I'm reverting back to the B4 microcode. It will do fine for the 1-2 years until I replace the 8700K, and in the end, it's a very, very small risk I am willing to assume.
Many of them can be executed by a script in a web page you load. In the worst-case scenario, one of those advertisement banners you see on this very site will carry a payload that grabs your banking certificate details, or credit card info from another tab, or anything else they may target in your system. To put your situation into perspective: you may be the only one who sits in front of your computer, but you have no control over, or knowledge of, what is executed on your CPU. Don't get defensive just yet; none of us have that control. Not even on Linux. That's the nature of using the web.
toyo:

Many of the vulnerabilities require admin access to the PC. I am the only one accessing it. My info has not been compromised in any way in 15+ years, because I understand how the web works. My plan is to keep the Meltdown/Spectre/L1TF/MDS patches in place, because they have basically zero impact on performance on my PC. I'm done with patching whatever comes next, though. I'll try the new microcode for CacheOut when it's released, and if the performance hit is still there, I'm reverting back to the B4 microcode. It will do fine for the 1-2 years until I replace the 8700K, and in the end, it's a very, very small risk I am willing to assume.
What you are doing is basing your decision on the performance impact, and then rationalizing it afterwards - which is completely backwards. You should think of security regardless of performance. My guess is that if there was a critical exploit that allowed anyone to easily gain access to your PC, but the fix resulted in a 10% performance hit to games, then you'd reject it.
Not really that surprised anymore at how people "weigh" things. Working at a parts shop, I had people buying stuff like $15 brake pads, below OEM/stock quality, because of cost, when proper ones (around $30) would not only brake better and last longer but probably save your life by stopping the car before you hit something. And that was even after I explained they would easily spend more money over the same time frame just by buying gas that's $0.30/gal more expensive.
Fox2232:

Maybe you should have read that paper ... Well, that bold part actually translates to a 10% difference. If that means "completely destroyed", what should someone with a 10% higher IQ than you say about you? You are linking the gaming category and ignoring all other uses. The same applies to the 3900X, except that the 3900X is not even a gaming CPU; it is a productivity CPU, and there it comes in cheaper than the i9-9900K. Tell me which is better: the i9-9900K, because it delivers 5% higher fps on average than a 3900X @3.8GHz? Or the 3900X, which finishes all your productivity work through the week quite a bit faster and lets you play 6 more hours? And mind that today a 3900X boosts all of its 12C/24T to 4.175~4.2GHz under full load, and that you can actually game while running quite a bit of stuff in the background, so your actual gaming time is not just a few hours more than with the i9-9900K. Also, the TPU test for Zen2 used 3200MHz memory => 1600MHz IF, instead of today's quite easily achievable 1800MHz+ (I got there with 4 memory sticks).
Sorry, but I don't use productivity tools; I don't play Cinebench or WinRAR benchmarks. I use my PC to play games at Full HD 144Hz, and the 3900X is slower than the old i7 8700K, i7 9700K, and i9 9900K.
Running 1080p, yes. Once you take that away and play at 1440p/144Hz, they aren't. So basically anyone who looks at everything else will be better off, which is a much bigger market. I haven't seen anyone I know (personally) spending money on an i7 or higher and a decent GPU and then still running 1080p (no matter the fps, no matter if a larger screen and/or resolution), as you're wasting a lot of horsepower.
squalles:

Sorry, but I don't use productivity tools; I don't play Cinebench or WinRAR benchmarks. I use my PC to play games at Full HD 144Hz, and the 3900X is slower than the old i7 8700K, i7 9700K, and i9 9900K.
Then explain why you are bringing a productivity CPU into the discussion instead of something like the 3700X. Maybe because it's quite a bit cheaper and still just some 5% slower than the i9-9900K? 5%, contemplate that. Then again, you brought the 2700X into the discussion; even a new one costs less than half of what the 9700K costs. You are justifying more than double the price with 10% higher achievable performance in games when fps is uncapped. And guess what? In 7 out of the 10 games used there, all of the CPUs get you quite a bit more than 144fps, so any bonus you may get is capped by your screen. And again, if any of those Zen2 chips were configured properly and with newer BIOSes, they would win some of those games.
Fox2232:

Then explain why you are bringing a productivity CPU into the discussion instead of something like the 3700X. Maybe because it's quite a bit cheaper and still just some 5% slower than the i9-9900K? 5%, contemplate that. Then again, you brought the 2700X into the discussion; even a new one costs less than half of what the 9700K costs. You are justifying more than double the price with 10% higher achievable performance in games when fps is uncapped. And guess what? In 7 out of the 10 games used there, all of the CPUs get you quite a bit more than 144fps, so any bonus you may get is capped by your screen. And again, if any of those Zen2 chips were configured properly and with newer BIOSes, they would win some of those games.
Maybe because the productivity CPUs are Threadripper, not Ryzen? 5%? Not in Far Cry 5, GTA, or Assassin's Creed, the only games that don't hit 144fps, and with almost 20 fps of difference.
fry178:

Running 1080p, yes. Once you take that away and play at 1440p/144Hz, they aren't. So basically anyone who looks at everything else will be better off, which is a much bigger market. I haven't seen anyone I know (personally) spending money on an i7 or higher and a decent GPU and then still running 1080p (no matter the fps, no matter if a larger screen and/or resolution), as you're wasting a lot of horsepower.
Wrong; not even a 2080 is capable of running at 144fps at 2K resolution in most games. You're just trying to justify the defeat of Ryzen.