Seventy years into the computer age, Moshe Y. Vardi at ACM wants to know why we still do not seem to know how to build secure information systems:
Cyber insecurity seems to be the normal state of affairs these days. In June 2015, the U.S. Office of Personnel Management announced it had been the target of a data breach targeting the records of as many as 18 million people. In late 2016, we learned about two data breaches at Yahoo! Inc., which compromised over one billion accounts. Lastly, during 2016, close to 20,000 email messages from the U.S. Democratic National Committee were leaked via WikiLeaks. U.S. intelligence agencies argued that the Russian government directed the breaches in an attempt to interfere with the U.S. election process. Furthermore, cyber insecurity goes way beyond data breaches. In October 2016, for example, emergency centers in at least 12 U.S. states had been hit by a deluge of fake emergency calls. What cyber disaster is going to happen next?
[...] The basic problem, I believe, is that security never gets a high-enough priority. We build a computing system for certain functionality, and functionality sells. Then we discover security vulnerabilities and fix them, and security of the system does improve. Microsoft Windows 10 is much, much better security-wise than Windows XP. The question is whether we are eliminating old vulnerabilities faster than we are creating new ones. Judging by the number of publicized security breaches and attacks, the answer to that question seems to be negative.
This raises some very fundamental questions about our field. Are we investing enough in cybersecurity research? Has the research yielded solid scientific foundations as well as useful solutions? Has industry failed to adopt these solutions due to cost/benefit? More fundamentally, how do we change the trajectory in a fundamental way, so the cybersecurity derivative goes from being negative to being positive?
Previously:
It's 2015. Why do we Still Write Insecure Software?
Report Details Cyber Insecurity Incidents at Nuclear Facilities
(Score: 2) by Thexalon on Tuesday April 18 2017, @03:26AM (10 children)
We absolutely know how to build secure computer systems. We do for some very critical stuff, like the space shuttle. The thing is, we usually don't, because it's much cheaper, faster, and easier to build an insecure system and swear up and down that it's secure than it is to build a secure system. And by the time the problem shows itself, the people responsible for the decision are long gone.
The only thing that stops a bad guy with a compiler is a good guy with a compiler.
(Score: 0) by Anonymous Coward on Tuesday April 18 2017, @03:30AM (1 child)
A secure system can only be secure from the ground up, and only as far as its users respect that security. The space shuttle had the benefit of being secured by the entire US military-industrial complex, and the astronauts were selected to be the very best of the very best of the very best :P
There's still no such thing as "absolutely secure".
(Score: 2) by takyon on Tuesday April 18 2017, @03:34AM
Aliens Gone Wild on Steroids: SLS Hacked Near Jovian Moon Europa
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by frojack on Tuesday April 18 2017, @04:43AM (2 children)
Well, as TFS says "The basic problem is that security never gets a high-enough priority."
I also think software in general suffers from the German Disease. If one layer of abstraction is good, three, four, or five layers must be better. Pointless complexity expands the number of attack surfaces.
The unix-like systems had a great deal more security built in, but even these suffer from weaknesses built into every single system component, partly because the language they gravitated toward for coding didn't protect against buffer overruns and data type misuse.
No, you are mistaken. I've always had this sig.
(Score: 0) by Anonymous Coward on Tuesday April 18 2017, @07:22AM
Pfft if I wanted protection from buffer overrun and data type misuse, I'd use managed VM languages like Java or C#.
(Score: 0) by Anonymous Coward on Tuesday April 18 2017, @09:25PM
Yeah, I'm seeing more and more projects go the route of massive dependency injections. Updating becomes a hellscape of version control, and packages that worked before no longer work well with some other required package, etc. Short-term productivity gains are preferred over long-term stability and management. Node projects horrify me: even a simple theming framework can require hundreds of MB of libraries, with package issues getting worse over time. OSX horrifies me: some people have computers a few years old and have to buy a new license just to install the latest web browsers. What kind of madness is that???
I think the larger issue is that the average software needs of most users have become a "solved problem", so now the software companies are coming up with products with short shelf lives. To run the latest widget you now need an expensive update, but if you ask the actual software engineers you find out that it would be trivially easy and inexpensive for the company to patch older systems.
Software should be designed for long term stability and security.
(Score: 2) by AnonTechie on Tuesday April 18 2017, @07:30AM (4 children)
Some additional information related to this:
Advanced Vehicle Automation and Computers Aboard the Shuttle [nasa.gov]
Albert Einstein - "Only two things are infinite, the universe and human stupidity, and I'm not sure about the former."
(Score: 2) by MostCynical on Tuesday April 18 2017, @10:33AM (3 children)
I expect most modern software costs far less than $50 per line, is unlikely to be tested beyond simple functional testing (did not break/did not kill anyone), and is unlikely to be documented *at all*.
IoT stuff is likely being coded for $0.50 per line.
A "good" router, to NASA spec, would cost $10,000,000.
Alas, everyone goes and buys the $200 model.
"I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
(Score: 0) by Anonymous Coward on Tuesday April 18 2017, @06:09PM (1 child)
Which makes perfect sense, because the security of your router against attackers simply is not worth $9,999,800. For most people I would say it's not even worth $10. These products have essentially no margin, so any extra development cost will mean a higher price tag on the shelf.
A rational person acting in their own best interest should, all else being equal, choose the cheaper router over the "more secure" one. So it's no wonder that manufacturers don't bother with security, because it represents added cost for little, if any, benefit.
(Score: 0) by Anonymous Coward on Wednesday April 19 2017, @07:45AM
A rational person acting in their own best interest should, all else being equal, choose the cheaper router over the "more secure" one. So it's no wonder that manufacturers don't bother with security, because it represents added cost for little, if any, benefit.
You don't understand what that phrase means, do you?
(Score: 2) by urza9814 on Wednesday April 19 2017, @01:23PM
I suspect it's still $50/line or more, but with $40 of every $50 going to management, $5 for time the coder spent updating Excel sheets of their progress, $4.75 for time the coder spent idle/browsing the web, and $0.25 for the time spent actually writing that line of code.
Documentation does exist IME, but usually the requirements are written after the software is completed so that they match the actual behavior, and then the code is changed six more times before release without any updates to the docs. And testing begins six hours before the code gets deployed to production.