
posted by n1 on Tuesday June 02 2015, @04:04AM
from the dr.-spin dept.

Cindy Cohn writes at EFF that when a criminal started lacing Tylenol capsules with cyanide in 1982, Johnson & Johnson quickly sprang into action to ensure consumer safety. It increased its internal production controls, recalled the capsules, offered an exchange for tablets, and within two months started using triple-seal tamper-resistant packaging. Congress ultimately passed an anti-tampering law, but the focus of the response from both the private and the public sector was on ensuring that consumers remained safe and secure, rather than on catching the perpetrator. Indeed, the person who did the tampering was never caught.

According to Cohn, the story of the Tylenol murders comes to mind as Congress considers the latest cybersecurity and data breach bills. To folks who understand computer security and networks, it's plain that the key problems are our vulnerable infrastructure and weak computer security, much like the vulnerabilities in Johnson & Johnson's supply chain in the 1980s. As then, the failure to secure our networks, the services we rely upon, and our individual computers makes it easy for bad actors to step in and "poison" our information. The way forward is clear: we need better incentives for companies that store our data to keep it secure. "Yet none of the proposals now in Congress are aimed at actually increasing the safety of our data. Instead, the focus is on 'information sharing,' a euphemism for more surveillance of users and networks," writes Cohn. "These bills are not only wrongheaded, they seem to be a cynical ploy to use the very real problems of cybersecurity to advance a surveillance agenda, rather than to actually take steps to make people safer." Congress could step in and encourage real security for users by creating incentives for greater security, imposing a greater downside on companies that fail to provide it, and rewarding those that make the effort to develop stronger security. "It's as if the answer for Americans after the Tylenol incident was not to put on tamper-evident seals, or increase the security of the supply chain, but only to require Tylenol to 'share' its customer lists with the government and with the folks over at Bayer aspirin," concludes Cohn. "We wouldn't have stood for such a wrongheaded response in 1982, and we shouldn't do so now."


Original Submission

 
  • (Score: 0) by Anonymous Coward on Tuesday June 02 2015, @05:11AM (#191037)

    Dan would later learn that there was a time when anyone could have debugging tools. There were even free debugging tools available on CD or downloadable over the net. But ordinary users started using them to bypass copyright monitors, and eventually a judge ruled that this had become their principal use in actual practice. This meant they were illegal; the debuggers' developers were sent to prison.

    Programmers still needed debugging tools, of course, but debugger vendors in 2047 distributed numbered copies only, and only to officially licensed and bonded programmers. The debugger Dan used in software class was kept behind a special firewall so that it could be used only for class exercises.

    It was also possible to bypass the copyright monitors by installing a modified system kernel. Dan would eventually find out about the free kernels, even entire free operating systems, that had existed around the turn of the century. But not only were they illegal, like debuggers—you could not install one if you had one, without knowing your computer's root password. And neither the FBI nor Microsoft Support would tell you that.

    The Right to Read [gnu.org]

    Why do you hate freedom, MichaelDavidHitler?

  • (Score: 0) by Anonymous Coward on Tuesday June 02 2015, @05:17AM (#191039)

    I'm gonna invade my own Pole-Land with Blackjack and Hookers!

  • (Score: 2) by MichaelDavidCrawford (2339) on Tuesday June 02 2015, @05:34AM (#191042)

    I assert that software engineering should require licensing. While closely related, computer programming and software engineering are distinct practices.

    --
    Yes I Have No Bananas. [gofundme.com]
    • (Score: 2) by c0lo (156) on Tuesday June 02 2015, @06:33AM (#191055)

      I assert that software engineering should require licensing.

      I tend to agree.

      Tertiary education in the country I came from used to have three degrees closely related to computers:

      1. computer science - taught in Uni, Maths faculty. You'll know structures/parsing-lexing/complexity/type theory/symbolic calculus/algo decidability/etc inside out. If you want a master's, prepare yourself to dig into some obscure theorems on the formal equivalence of quantum computing algos; but its graduates would make the worst hires if you wanted to develop a business app/system
      2. computer engineering - a degree in engineering schools, starts with formal logic implemented on gates, passes through data structures with a bit of complexity sprinkled around, some deeper touches on programming languages (procedural, OO, functional, low level such as MIX/ASM/Forth), a good chunk of operating systems/DBMS and related problems (handling concurrency is a deliciously hard course), then you choose to major either toward the electrical part of it (system engineering/automation/CAD/CAM) or software practice (development life cycles, quality systems, maybe a bit of project management)
      3. cyber-economics - a business-school degree, covering everything that touches business optimisation (minmax, linear/nonlinear/discrete optimisation, floor planning, financial planning, risk modelling, etc.)

      What would have slipped through the fingers was numerical methods (e.g. ODEs, physics modelling); each of the above plus physics would scratch the surface, but only just.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 2) by MichaelDavidCrawford (2339) on Wednesday June 03 2015, @01:20AM (#191373)

        UCSC offers both CS and CE degrees. The Computer Engineers learn stuff like how to write a mouse driver. The CS majors do stuff like learn how to prove that you can't do any better than O(n lg n) when sorting. (Well, sorta - there are special cases in which you can do better.)
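
        One concrete special case (my own illustration, not from the comment): when the keys are small integers, counting sort runs in O(n + k) and sidesteps the Omega(n lg n) comparison bound because it never compares elements at all. A minimal Python sketch:

            def counting_sort(xs, k):
                # Sort integers drawn from range(k) in O(n + k) time and space.
                # No element is ever compared against another, so the
                # comparison-sort lower bound simply does not apply.
                counts = [0] * k
                for x in xs:
                    counts[x] += 1
                out = []
                for value, count in enumerate(counts):
                    out.extend([value] * count)
                return out

            print(counting_sort([3, 1, 4, 1, 5, 9, 2, 6], k=10))  # [1, 1, 2, 3, 4, 5, 6, 9]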

        I don't know whether UCSC offers software engineering, but Carnegie Mellon sure does. Watts Humphrey got a medal from Dubya for his contributions to the profession. I find the SEI's methods a little too tedious for my personal style, but I can see the point of them when bugs lead to things exploding.

        There are lots of places where CS students can learn stuff like operating system development. However, the way I usually explain the difference is that a Computer Scientist will figure out how to make something work, while a Software Engineer will create something that works correctly.

        It upsets me profoundly that the term "complexity" is commonly taken to mean the asymptotic growth in runtime for an algorithm, or sometimes the asymptotic growth in its memory use.

        IMHO the term "complexity" should denote the probability that an average coder will implement the algorithm correctly.
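
        By that measure, even a "simple" O(lg n) algorithm can be highly complex: binary search is the classic example, and Jon Bentley famously reported that most professional programmers he tested could not write a correct one. A minimal sketch of the usual traps, in Python (an illustration added here, not from the comment):

            def binary_search(xs, target):
                # Return an index of target in the sorted list xs, or -1 if absent.
                lo, hi = 0, len(xs) - 1      # inclusive bounds
                while lo <= hi:
                    # lo + (hi - lo) // 2 avoids the lo + hi overflow bug that
                    # bites in fixed-width languages (Python ints don't overflow).
                    mid = lo + (hi - lo) // 2
                    if xs[mid] == target:
                        return mid
                    elif xs[mid] < target:
                        lo = mid + 1         # dropping the +1 here loops forever
                    else:
                        hi = mid - 1
                return -1

            assert binary_search([1, 3, 5, 7, 9], 7) == 3
            assert binary_search([1, 3, 5, 7, 9], 4) == -1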

        --
        Yes I Have No Bananas. [gofundme.com]
  • (Score: 2) by MichaelDavidCrawford on Tuesday June 02 2015, @05:48AM

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Tuesday June 02 2015, @05:48AM (#191046) Homepage Journal

    I actually worked in a shop like that once. It was originally set up to keep the internet out, but then the company owner realized it would also keep his employees from stealing the source.

    We each had two Windows boxen. One was connected to the internet; the other was inside a locked rackmount, with Ethernet KVM units.

    I worked remotely for most of that contract. When I delivered my source, the head programmer had to unlock one of the cabinets, then copy my source onto the secured box I used. When I returned home he had to copy my source back onto my floppy.

    I wouldn't dream of working full-time in such an environment, but they are actually quite common.

    --
    Yes I Have No Bananas. [gofundme.com]