SoylentNews is people

SoylentNews is powered by your submissions, so send in your scoop. Only 16 submissions in the queue.
posted by cmn32480 on Monday October 03 2016, @07:29PM   Printer-friendly
from the inherently-broken dept.

Arthur T Knackerbracket has found the following story from Bruce Schneier's blog:

Every few years, a researcher replicates a security study by littering USB sticks around an organization's grounds and waiting to see how many people pick them up and plug them in, causing the autorun function to install innocuous malware on their computers. These studies are great for making security professionals feel superior. The researchers get to demonstrate their security expertise and use the results as "teachable moments" for others. "If only everyone was more security aware and had more security training," they say, "the Internet would be a much safer place."

Enough of that. The problem isn't the users: it's that we've designed our computer systems' security so badly that we demand the user do all of these counterintuitive things. Why can't users choose easy-to-remember passwords? Why can't they click on links in emails with wild abandon? Why can't they plug a USB stick into a computer without facing a myriad of viruses? Why are we trying to fix the user instead of solving the underlying security problem?

Traditionally, we've thought about security and usability as a trade-off: a more secure system is less functional and more annoying, and a more capable, flexible, and powerful system is less secure. This "either/or" thinking results in systems that are neither usable nor secure.

[...] We must stop trying to fix the user to achieve security. We'll never get there, and research toward those goals just obscures the real problems. Usable security does not mean "getting people to do what we want." It means creating security that works, given (or despite) what people do. It means security solutions that deliver on users' security goals without -- as the 19th-century Dutch cryptographer Auguste Kerckhoffs aptly put it -- "stress of mind, or knowledge of a long series of rules."

[...] "Blame the victim" thinking is older than the Internet, of course. But that doesn't make it right. We owe it to our users to make the Information Age a safe place for everyone -- not just those with "security awareness."


Original Submission

  • (Score: 3, Interesting) by Bogsnoticus on Tuesday October 04 2016, @12:25AM

    by Bogsnoticus (3982) on Tuesday October 04 2016, @12:25AM (#409752)

    We're trying to educate the user, but many users prefer not to think for themselves, and complain to the marketing department of companies that things are too hard to do.
    So the marketing departments, instead of wanting stuff designed to withstand attacks from the lowest common denominator in cyber-crims, instead demand products get made to cater to the lowest common denominator in user-land.

    You'll see it in every industry. When a user/customer is confronted with a technician or mechanic saying "XXX cannot do that function", their response is always "but the salesperson said it could".
    They would rather argue with the guy/gal trying to fix the problem and threaten legal action, than admit to themselves that they were duped by a smooth talking shyster whose entire job is to ensure a fool and their money are parted.

    --
    Genius by birth. Evil by choice.
  • (Score: 2) by PocketSizeSUn on Tuesday October 04 2016, @02:59AM

    by PocketSizeSUn (5340) on Tuesday October 04 2016, @02:59AM (#409789)

    They would rather argue with the guy/gal trying to fix the problem and threaten legal action, than admit to themselves that they were duped by a smooth talking shyster whose entire job is to ensure a fool and their money are parted.

    If they were duped by a shyster they should get their money back, no?
    If they get their money back they won't keep arguing, no?
    So it's really the criminally liable companies that insist they don't have to live up to their shyster sales-person's promise that is to blame, no?

    Or perhaps I am misunderstanding the situation.

    • (Score: 2) by Bogsnoticus on Tuesday October 04 2016, @05:30AM

      by Bogsnoticus (3982) on Tuesday October 04 2016, @05:30AM (#409829)

      They "should" get their money back, but the process of doing so means they have to admit to others, and themselves, that they were stupid in making that particular decision. Nobody wants to look stupid in front of others.
      So instead of doing the right thing, they throw tantrums and threats around (sound like anyone we know applying for a certain high level govt position?).

      End result is the marketing folks end up making technical decisions, which all end up the same: "Make it easy for idiots"

      --
      Genius by birth. Evil by choice.