Google security researchers have concluded that speculative execution attacks are here to stay absent drastic changes to modern CPU architectures, such as removing speculative execution entirely.
Spectre is here to stay: An analysis of side-channels and speculative execution
(Score: 2, Interesting) by Anonymous Coward on Saturday February 16 2019, @04:05PM (2 children)
Maybe we can make software fast enough?
Seriously, a lot of power is wasted because some programmer had a deadline too close and used another library on a framework on a library on a non-standard extension to the framework.
From my experience in IT studies, second-year students get a small assembler course. Most of them have no idea how a low-level program operates or how to program standard devices. Maybe we should go back to teaching programmers, not users of libraries?
I know the temptation is big. Rich bosses buy better and better hardware for developers, for open source too, but it has a price, and speculative execution errors are just the tip of the iceberg.
(Score: 0) by Anonymous Coward on Sunday February 17 2019, @01:49PM (1 child)
Not sure where to drop this so I'm putting it here just because.
Big family, lots of computing. Mac, Windows, Linux. Yes.
Worked at IBM back in the mid-late '80s on big mainframes, RS6K, AS400...
We discovered that clients could save millions by putting an AS/400 emulator on an RS/6000 and also benefit from monstrous performance improvements. Management killed that project and slapped a gag order on us real quick. (Transaction Processing Performance Council, TPC-A, TPC-C -- I was on the committee that created those tests and the reports.)
I'm the type who generally laughs at conspiracy theorists, but I know enough about the nuts and bolts of software, OSes, and upper management types (corporate and government) to recognize some peculiar patterns. When both my Windows and Mac OSes, at different locations and on different networks, start glitching in the same way at the same times, something nefarious is definitely going on. (I don't use the Linux box enough to see the patterns there, so I can't say about that one, but some of what systemd does seems awfully suspect to me.)
I solved the glitching problem by getting a 2007 Mac Pro and a 2008 MacBook Pro and running OS X 10.6 on both of them. It was like a breath of fresh air. These are the fastest computers in my house (and I have up-to-date Windows machines, a modern MacBook Pro, and a 2013 Mac Pro as well). BUT, the older machines are only faster if they are NOT connected to the Internet. The second I plug them into the net and launch a web browser -- even to just the Google home page -- the machine gets noticeably slower for all the software on it.
So instead, the computers I use the most are at least ten years old and connected by wire to an internal network that is NOT connected to the internet. They run great. Added benefit -- I don't have to constantly re-learn how to use my software after every other update. When I need data from online, I get it with the sacrificial laptop and transfer it via SD card. It's a little inconvenient, but now there's no more glitching. I can work in peace on a good, snappy system.
If you use ANY modern computing system, you are being eavesdropped on, monitored, manipulated, who knows what. The processor is only the tip of the iceberg. We live in dangerous times.
(You remember that scene in the Snowden movie where they put their phones in the microwave? Amateurs! The phone can tell when it's in a Faraday cage and can still hear sound waves, record them, and store them until it's out of the cage, then transmit them. I honestly don't care if they want to monitor me (they may even have a good reason for doing it), but I draw the line when they start impacting my ability to do good work by glitching my system up. That's when I cut them off, or at least raise the bar so they have to work a little harder.)
(Score: 2) by takyon on Sunday February 17 2019, @09:37PM
Just wrap the phone in foil and put it in the fridge or something. Then move into another room. The signal will be dead, and it's unlikely to pick up your conversation unless it has year-2050-grade microphone arrays.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]