Cindy Cohn writes at EFF that when a criminal started lacing Tylenol capsules with cyanide in 1982, Johnson & Johnson quickly sprang into action to ensure consumer safety. It increased its internal production controls, recalled the capsules, offered an exchange for tablets, and within two months started using triple-seal tamper-resistant packaging. Congress ultimately passed an anti-tampering law, but the focus of the response from both the private and the public sector was on keeping consumers safe and secure rather than on catching the perpetrator. Indeed, the person who did the tampering was never caught.
According to Cohn, the story of the Tylenol murders comes to mind as Congress considers the latest cybersecurity and data breach bills. To folks who understand computer security and networks, it's plain that the key problems are our vulnerable infrastructure and weak computer security, much like the vulnerabilities in Johnson & Johnson's supply chain in the 1980s. As then, the failure to secure our networks, the services we rely upon, and our individual computers makes it easy for bad actors to step in and "poison" our information. The way forward is clear: we need better incentives for companies that store our data to keep it secure. "Yet none of the proposals now in Congress are aimed at actually increasing the safety of our data. Instead, the focus is on 'information sharing,' a euphemism for more surveillance of users and networks," writes Cohn. "These bills are not only wrongheaded, they seem to be a cynical ploy to use the very real problems of cybersecurity to advance a surveillance agenda, rather than to actually take steps to make people safer." Congress could step in and encourage real security for users by creating incentives for greater security, imposing a greater downside on companies that fail to provide it, and rewarding those that make the effort to develop stronger security. "It's as if the answer for Americans after the Tylenol incident was not to put on tamper-evident seals, or increase the security of the supply chain, but only to require Tylenol to 'share' its customer lists with the government and with the folks over at Bayer aspirin," concludes Cohn. "We wouldn't have stood for such a wrongheaded response in 1982, and we shouldn't do so now."
(Score: 2) by c0lo on Tuesday June 02 2015, @06:33AM
I tend to agree.
Tertiary education in the country I came from used to have three degrees that closely relate to computers:
What would slip through the fingers would be the numerical methods (e.g. ODEs, physics modelling); each of the above plus physics would scratch the surface, but only just.
(Score: 2) by MichaelDavidCrawford on Wednesday June 03 2015, @01:20AM
UCSC offers both CS and CE degrees. The Computer Engineers learn stuff like how to write a mouse driver. The CS majors do stuff like learn how to prove that you can't do any better than O(n log n) when sorting by comparing elements. (Well, sorta: there are special cases in which you can do better, as in the sketch below.)
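One such special case (my illustration, not from the comment): counting sort orders small non-negative integer keys in O(n + k) time without ever comparing two elements, so the comparison-sort lower bound simply doesn't apply to it. A minimal sketch in Java:

```java
import java.util.Arrays;

// Counting sort: a non-comparison sort that runs in O(n + k) when the
// keys are non-negative integers bounded by maxKey. The O(n log n)
// lower bound only constrains sorts that compare elements.
public class CountingSort {
    static int[] countingSort(int[] a, int maxKey) {
        int[] count = new int[maxKey + 1];
        for (int x : a) count[x]++;                   // tally each key
        int[] out = new int[a.length];
        int i = 0;
        for (int key = 0; key <= maxKey; key++)
            while (count[key]-- > 0) out[i++] = key;  // emit keys in sorted order
        return out;
    }

    public static void main(String[] args) {
        int[] data = {3, 1, 4, 1, 5, 9, 2, 6};
        System.out.println(Arrays.toString(countingSort(data, 9)));
        // prints [1, 1, 2, 3, 4, 5, 6, 9]
    }
}
```

The catch is the bound k on the keys: the trick only pays off when the key range is not much larger than the input.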
I don't know whether UCSC offers software engineering, but Carnegie Mellon sure does. Watts Humphrey got a medal from Dubya for his contributions to the profession. I find the SEI's methods a little too tedious for my personal style, but I can see the point of them when bugs lead to things exploding.
There are lots of places where CS students can learn stuff like operating system development. However, the way I usually explain the difference is that a Computer Scientist will figure out how to make something work, while a Software Engineer will create something that works correctly.
It upsets me profoundly that the term "complexity" is commonly taken to mean the asymptotic growth in runtime for an algorithm, or sometimes the asymptotic growth in its memory use.
IMHO the term "complexity" should denote the probability that an average coder will implement the algorithm correctly.
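A concrete illustration of that kind of complexity (my example, not from the comment): binary search is asymptotically trivial, yet the obvious midpoint computation (low + high) / 2 overflows for very large arrays, a bug that lurked for years in widely used library implementations, including the JDK's Arrays.binarySearch. A minimal sketch in Java:

```java
public class MidpointDemo {
    // Binary search over a sorted array. The commented-out "obvious"
    // midpoint is the line that was wrong in many real implementations:
    // low + high can exceed Integer.MAX_VALUE on arrays over ~2^30 elements.
    static int binarySearch(int[] a, int key) {
        int low = 0, high = a.length - 1;
        while (low <= high) {
            // int mid = (low + high) / 2;    // looks right, overflows on huge arrays
            int mid = low + (high - low) / 2; // overflow-safe midpoint
            if (a[mid] < key)      low = mid + 1;
            else if (a[mid] > key) high = mid - 1;
            else                   return mid;
        }
        return -1; // key not present
    }

    public static void main(String[] args) {
        int[] sorted = {2, 3, 5, 8, 13};
        System.out.println(binarySearch(sorted, 8)); // prints 3
    }
}
```

By the runtime definition both midpoint lines give the same O(log n) algorithm; by the definition proposed above, the first is far more "complex," because it's the one the average coder writes.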