
SoylentNews is people

posted by cmn32480 on Thursday October 29 2015, @03:21PM   Printer-friendly
from the need-a-penalty-box dept.

Bruce Schneier's blog discusses the recent hack of CIA director John O. Brennan's AOL account (among others) and says that when it comes to social-engineering attacks:

The problem is a system that makes this possible, and companies that don't care because they don't suffer the losses. It's a classic market failure, and government intervention is how we have to fix the problem.

It's only when the costs of insecurity exceed the costs of doing it right that companies will invest properly in our security. Companies need to be responsible for the personal information they store about us. They need to secure it better, and they need to suffer penalties if they improperly release it. This means regulatory security standards.

Schneier goes on to suggest the government should establish minimum standards for results and let the market figure out the best way to do it. He also partly blames consumers because they demand any security solutions be easy to use, ending with:

It doesn't have to be this way. We should demand better and more usable security from the companies we do business with and whose services we use online. But because we don't have any real visibility into those companies' security, we should demand our government start regulating the security of these companies as a matter of public safety.

Related: WikiLeaks Publishes CIA Chief's Personal Info


Original Submission

 
  • (Score: 2) by skater (4342) on Friday October 30 2015, @02:58PM (#256486) Journal

    I see your point, but as others said: the point is to push companies toward reducing what they store and to make sure a breach does cost them something. Until security breaches start seriously affecting profit, nothing will change.

    > Sorry, but security is so horribly bad right now it's simply unfair to punish anyone specifically. To me it's a much smarter idea for the government to create a budget of billions of dollars to do precisely what I said: security-audit our fucking software! Including the proprietary platforms, because they would be the government and could *demand* secure access.

    Would that be the same government that recently had 25 million records stolen? They're one of the worst offenders.

  • (Score: 2) by edIII (791) on Friday October 30 2015, @09:28PM (#256678)

    I'm not asking the government to audit the code. That would be like Little Red Riding Hood asking the Big Bad Wolf to make her dinner. Yes, he makes dinner, but she won't like it :)

    I very much want companies to change too; I'm only trying to be reasonable and practical about what they could change. It's a lot to ask, and quite frankly there is just too much work for the market to take care of at this point. We painted ourselves into a corner in a room so vast that it's insurmountable for any single entity.

    I take you very seriously, and I work with databases and systems like these. When it comes to CC processing and the like, there is a very big reason why nearly every single payment gateway offers something like a "customer vault", where I hold only a random identifier for the customer file and can order pre-approved payment plans and packages against it. I deal with CC numbers exactly once: they're encrypted in the browser, and nobody ever sees them again. No BINs, last 4, CVV2, magstripe info; nothing is stored on my end. Doing so largely eliminates our own liability and shifts PCI compliance to the payment gateway. *Technically* we're still required to be PCI compliant on our end, but a failure there discloses no financial information about my customers in the event of a breach.
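    [Ed. note: the "customer vault" pattern described above can be sketched in miniature. The gateway class, method names, and in-memory storage below are invented for illustration; real gateways and PCI DSS scoping involve far more than this.]

```python
import secrets

class PaymentGateway:
    """Simulates a gateway-side customer vault: only the gateway sees card data."""

    def __init__(self):
        self._vault = {}  # token -> card number, held only by the gateway

    def tokenize(self, card_number: str) -> str:
        # The token is random, not derived from the card, so it reveals nothing.
        token = secrets.token_hex(16)
        self._vault[token] = card_number
        return token

    def charge(self, token: str, cents: int) -> bool:
        # The gateway resolves the token internally; the merchant never
        # touches the card number again after the initial tokenize call.
        return token in self._vault and cents > 0

class Merchant:
    """Stores only opaque tokens, so a breach of its records leaks no card data."""

    def __init__(self, gateway: PaymentGateway):
        self.gateway = gateway
        self.customers = {}  # customer id -> token

    def add_customer(self, customer_id: str, card_number: str) -> None:
        self.customers[customer_id] = self.gateway.tokenize(card_number)

    def bill(self, customer_id: str, cents: int) -> bool:
        return self.gateway.charge(self.customers[customer_id], cents)

gateway = PaymentGateway()
merchant = Merchant(gateway)
merchant.add_customer("alice", "4111111111111111")
charged = merchant.bill("alice", 999)
```

    The design point is the one the comment makes: even if the merchant's database is breached, the attacker gets only random identifiers that are useless outside the gateway's vault.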

    That's why I mention PCI compliance since they are specifically in the business of auditing specific code and processes to engender security in financial transactions. That program *could* vastly expand, but I *loathe* Big Gubbermint. If it's executed ineptly, as government has a penchant for doing, we won't be any better off at all.

    No, the government would only *pay* recognized experts in the field to operate with complete transparency and begin auditing all of the code bases in use. If Adobe, as an example, wants to throw a shit fit about it, you throw their executives in prison for non-compliance. Security through obscurity is a complete failure, and I only support operating with them in an NDA-like fashion: we'll keep their proprietary secrets for them, but we will still be looking at code and advising them. On that note, Adobe could *never* sue a security researcher again, or have their corrupt friends in the FBI arrest one. Security researchers would receive immunity while working for the government.

    That would never happen for two reasons:

    1) It would be extremely effective. Billions could hire a crew of tens of thousands of experts worldwide, and I believe we would see very impressive results within 12-24 months. FOSS would become very secure, and proprietary platforms could at least be trusted as far as security is concerned.

    2) Nobody in power wants effectiveness at all. That's bad for the insurance industry and the intelligence communities, and it doesn't exactly give the government the kind of storm-trooper control it likes. You make more money in wartime than you do in peacetime.

    --
    Technically, lunchtime is at any moment. It's just a wave function.