Intel Loses 5X More Average Performance Than AMD From Mitigations: Report
Intel has published its own set of benchmark results for the mitigations to the latest round of vulnerabilities, but Phoronix, a publication that focuses on Linux-related news and reviews, has conducted its own testing and found a significant impact. Phoronix's recent testing of all mitigations in Linux found the fixes reduce Intel's performance by 16% (on average) with Hyper-Threading enabled, while AMD only suffers a 3% average loss. Phoronix derived these percentages from the geometric mean of test results from its entire test suite.
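Phoronix's averaging method, the geometric mean, can be sketched in a few lines of Python. The per-test relative scores below are invented purely for illustration (they are not Phoronix's actual results); only the method of averaging ratios with a geometric mean comes from the article.

```python
import math

def geometric_mean(values):
    # nth root of the product of n values; the standard way to average
    # performance ratios, since it is not skewed by one outlier test
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical mitigated/unmitigated score ratios, for illustration only
intel_ratios = [0.95, 0.70, 0.88, 0.85]
amd_ratios = [0.99, 0.96, 0.97, 0.96]

intel_loss = 1 - geometric_mean(intel_ratios)
amd_loss = 1 - geometric_mean(amd_ratios)
print(f"Intel average loss: {intel_loss:.1%}")  # roughly 16% for these made-up ratios
print(f"AMD average loss: {amd_loss:.1%}")      # roughly 3% for these made-up ratios
```

A geometric mean is preferred over an arithmetic mean here because each benchmark contributes a ratio, and multiplying ratios (rather than adding them) keeps one extreme result from dominating the average.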
From a performance perspective, the overhead of the mitigations narrows the gap between Intel's and AMD's processors. Intel's chips can suffer even more with Hyper-Threading (HT) disabled, a measure that some companies (such as Apple and Google) say is the only way to make Intel processors completely safe from the latest vulnerabilities. In some of Phoronix's testing, disabling HT reduced performance by almost 50%. The penalty was smaller in many cases, but disabling HT widened the gap by at least a few points in almost every test.
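On Linux, the SMT (Hyper-Threading) state the article discusses can be inspected through the kernel's sysfs interface, available on kernels from roughly 4.19 onward. A minimal sketch, assuming that interface is present:

```python
from pathlib import Path

# Kernel SMT control file; absent on older kernels and non-Linux systems
SMT_CONTROL = Path("/sys/devices/system/cpu/smt/control")

def smt_state():
    """Return the kernel-reported SMT state ("on", "off", "forceoff",
    "notsupported", ...), or None if the interface is unavailable."""
    try:
        return SMT_CONTROL.read_text().strip()
    except OSError:
        return None

print(smt_state())
# Disabling SMT at runtime requires root, e.g.:
#   SMT_CONTROL.write_text("off")
```

Writing "off" to that file takes sibling threads offline without a reboot; booting with `nosmt` on the kernel command line achieves the same thing persistently.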
To be clear, this is not just testing with mitigations for MDS (also known as Fallout, ZombieLoad, and RIDL), but also patches for previous exploits like Spectre and Meltdown. Because of this, AMD has also lost some performance with mitigations enabled (AMD is vulnerable to some Spectre variants), but only about 3%.
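Which of these mitigations (MDS, Spectre, Meltdown, and so on) are active on a given Linux machine can be read from the kernel's vulnerabilities directory in sysfs, present since kernel 4.15. A small sketch, assuming that directory exists:

```python
from pathlib import Path

# One file per known CPU vulnerability; each holds a one-line status
# such as "Mitigation: PTI", "Vulnerable", or "Not affected"
VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

def mitigation_status():
    """Return {vulnerability_name: kernel_status}; empty if unavailable."""
    if not VULN_DIR.is_dir():
        return {}
    return {f.name: f.read_text().strip() for f in sorted(VULN_DIR.iterdir())}

for name, status in mitigation_status().items():
    print(f"{name}: {status}")
```

On a patched Intel system this typically lists entries like `mds`, `meltdown`, `spectre_v1`, and `spectre_v2`; on AMD, several report "Not affected", which is consistent with the smaller performance loss reported above.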
Have you disabled hyperthreading?
(Score: 3, Insightful) by shortscreen on Monday May 20 2019, @01:23PM (1 child)
Old home computers and DOS-based systems didn't have the concept of multiple users. And I liked it that way. They were called "personal computers" and they usually only had one user so it made sense.
But then Win NT copied the idea of multiple users from VMS or *nix or wherever and it was foisted on everyone. Although I never accepted it. I still run as admin on FAT32 so that I NEVER have to enter a password to access my own disk. But "they" said we shouldn't do that because we needed to run with reduced permissions for security. So basically, it is assumed that the user will inevitably run malicious code and limiting the user's actions is somehow the solution to this, as if the user's files being trashed or leaked is less bad than the same thing happening to standard OS or application files which can simply be reinstalled from whatever medium. And in the case of Windows, that's before the OS itself started shipping with adware, spyware, and DRM built in, raising the question of what is being "secured" from whom.
The multi-user model is now breaking down due to bad assumptions about the hardware, but the security strategy for a single-user system is not affected. That strategy being: don't run malicious code.
(Score: 4, Insightful) by EEMac on Monday May 20 2019, @01:56PM