
SoylentNews is people

posted by cmn32480 on Monday October 03 2016, @07:29PM   Printer-friendly
from the inherently-broken dept.

Every few years, a researcher replicates a security study by littering USB sticks around an organization's grounds and waiting to see how many people pick them up and plug them in, causing the autorun function to install innocuous malware on their computers. These studies are great for making security professionals feel superior. The researchers get to demonstrate their security expertise and use the results as "teachable moments" for others. "If only everyone was more security aware and had more security training," they say, "the Internet would be a much safer place."

Enough of that. The problem isn't the users: it's that we've designed our computer systems' security so badly that we demand the user do all of these counterintuitive things. Why can't users choose easy-to-remember passwords? Why can't they click on links in emails with wild abandon? Why can't they plug a USB stick into a computer without facing a myriad of viruses? Why are we trying to fix the user instead of solving the underlying security problem?

Traditionally, we've thought about security and usability as a trade-off: a more secure system is less functional and more annoying, and a more capable, flexible, and powerful system is less secure. This "either/or" thinking results in systems that are neither usable nor secure.

[...] We must stop trying to fix the user to achieve security. We'll never get there, and research toward those goals just obscures the real problems. Usable security does not mean "getting people to do what we want." It means creating security that works, given (or despite) what people do. It means security solutions that deliver on users' security goals without -- as the 19th-century Dutch cryptographer Auguste Kerckhoffs aptly put it -- "stress of mind, or knowledge of a long series of rules."

[...] "Blame the victim" thinking is older than the Internet, of course. But that doesn't make it right. We owe it to our users to make the Information Age a safe place for everyone -- not just those with "security awareness."


Original Submission

 
  • (Score: 0) by Anonymous Coward on Tuesday October 04 2016, @08:56AM (#409885)

    Part of the problem is the entire security model around executables. Or rather, the lack of one. Any program I run as user x automatically has the same privileges as user x, which in practice means it can do whatever it wants in my home folder containing my documents and other data. Who needs admin or sudo privileges when you can simply ransom or steal the user's data?
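    This "ambient authority" problem can be shown in a few lines. The sketch below (using a temp directory as a stand-in for the home folder) launches a child process that, without requesting any permission, reads a "private" file simply because it inherits the parent user's privileges:

```python
import os
import subprocess
import sys
import tempfile

# A scratch directory stands in for the user's home folder.
home = tempfile.mkdtemp()
secret = os.path.join(home, "diary.txt")
with open(secret, "w") as f:
    f.write("private notes")

# Any program launched by the user runs with the user's privileges:
# this child process reads the "private" file without asking anything.
stolen = subprocess.run(
    [sys.executable, "-c", f"print(open({secret!r}).read())"],
    capture_output=True, text=True,
).stdout.strip()

print(stolen)  # the child saw the file's contents unhindered
```

    Nothing in the OS distinguishes "my word processor saving my document" from "random downloaded binary exfiltrating my documents" -- both are just user-x processes.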

    The solution is to not allow anything unless it is explicitly needed. porncodec.exe should not be allowed to do anything other than render porn videos to a framebuffer/texture or whatever a codec does.

    My web browser should not be allowed to do anything besides access the net and place files in select directories for settings, caches, or downloads. The web browser should not be allowed to access my home folder unless I have picked a file to be uploaded from that folder using a file picker dialog supplied by the operating system, and then the operating system should open that file in read-only mode and supply the data stream to the browser. Any browser extensions that need more permissions should ask for them at the moment they are needed, while clearly indicating why each permission is necessary.
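    The picker-mediated upload described above is essentially a capability model: the application never gets a path or filesystem access, only a read-only handle the OS opened on the user's behalf. A minimal sketch (the `os_file_picker` function is a hypothetical stand-in for such an OS service):

```python
import os
import tempfile

def os_file_picker(path):
    """Stand-in for an OS-supplied file picker: the *user* chooses the
    file, and the OS hands the application a read-only stream -- never
    a path, never write access. (Hypothetical API, for illustration.)"""
    fd = os.open(path, os.O_RDONLY)   # the OS opens it read-only
    return os.fdopen(fd, "rb")        # the app receives only this handle

# Simulated upload: the "browser" never touches the filesystem itself.
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(b"resume.pdf contents")
tmp.close()

stream = os_file_picker(tmp.name)
upload_body = stream.read()

# Writing back through the handle fails: it was opened O_RDONLY.
try:
    os.write(stream.fileno(), b"tamper")
    writable = True
except OSError:
    writable = False
stream.close()

print(upload_body, writable)
```

    Because the handle itself carries the permission, there is nothing for a compromised browser to abuse beyond the one file the user already chose to share.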

    Installers should not be able to go hog-wild doing whatever they want just because I gave them permission to make system changes while installing a program which I think might be useful but which secretly contains a copy of Skynet.

    We need to redesign our operating systems from the ground up to include security based on behavior blocking from the start, and in ways that are user friendly. Instead of training users to only run software that they trust, I wish to see systems that assume all code and all data are untrustworthy, and that allow safely running this untrustworthy code in the knowledge that even if it is malicious, it can do only very limited damage.

    Also, the Android model, with its tons of blanket permissions required to even install an app, is a slight improvement, but it is almost as bad as the desktop situation. You still have very little control over what an app might be doing behind your back.
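    The alternative to install-time blanket grants is asking at the point of use, with a stated reason, and degrading gracefully on refusal. A small sketch of that model (all class and method names here are made up for illustration; `prompt` stands in for the OS consent dialog):

```python
class PermissionDenied(Exception):
    pass

class App:
    """Toy runtime-permission model: nothing is granted at install time;
    each capability is requested at first use, with a reason, and the
    user can refuse without breaking the rest of the app."""
    def __init__(self, prompt):
        self._prompt = prompt      # stands in for the OS consent dialog
        self._granted = set()

    def _require(self, permission, reason):
        if permission not in self._granted:
            if not self._prompt(permission, reason):
                raise PermissionDenied(permission)
            self._granted.add(permission)

    def take_photo(self):
        self._require("camera", "to scan a document")
        return "photo"

    def read_contacts(self):
        self._require("contacts", "to find friends")
        return ["alice", "bob"]

# The user grants the camera but refuses the contact list; the camera
# feature works and the contacts grab fails -- instead of the app
# getting everything up front or not installing at all.
app = App(prompt=lambda perm, reason: perm == "camera")
print(app.take_photo())
try:
    app.read_contacts()
except PermissionDenied as denied:
    print("denied:", denied)
```

    Modern mobile platforms have moved partway toward this, but the enforcement point (what the app can do once granted) is still far coarser than the per-behavior blocking argued for above.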