Cory Doctorow at the Guardian posts: Privacy technology everyone can use would make us all more secure.
You don't need to be a technical expert to understand privacy risks anymore, from the Snowden revelations to the daily parade of internet security horrors around the world – like Syrian and Egyptian checkpoints where your Facebook login is demanded in order to weigh your political allegiances (sometimes with fatal consequences), or celebrities having their most intimate photos splashed all over the web.
The time has come to create privacy tools for normal people – people with a normal level of technical competence. That is, all of us, no matter what our level of technical expertise, need privacy. Some privacy measures do require extraordinary technical competence; if you’re Edward Snowden, with the entire NSA bearing down on your communications, you will need to be a real expert to keep your information secure. But the kind of privacy that makes you immune to mass surveillance and attacks-of-opportunity from voyeurs, identity thieves and other bad guys is attainable by anyone.
He then goes on to promote https://simplysecure.org/, an organization he belongs to that is looking for programmers interested in improving security software and/or the interfaces and setup for those tools.
So while I think this is a fantastic idea, I'm confounded that the Guardian would post such an advertisement as an article, and saddened that the foundation has nothing to offer at this point.
I think the public would better embrace security technologies if they were easy to roll out. What have you run into that confounds you?
(Score: 0) by Anonymous Coward on Thursday September 25 2014, @05:08AM
> I'm confounded that the Guardian would post such an advertisement as an article;
Really? Would you have a problem with them running an article about Mozilla?
Because neither one of them is looking to sell anything.
(Score: 3, Interesting) by c0lo on Thursday September 25 2014, @05:24AM
The idea is that security can work as a magic dust a Joe Sixpack user simply sprinkles over their gadgets, and then no evil spirit can touch them.
Blessed are the poor in spirit... for they can believe anything.
Hackers will love it. Why? Because:
"The greatest trick the Devil ever pulled was convincing the world he didn't exist"
https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
(Score: 1, Interesting) by Anonymous Coward on Thursday September 25 2014, @11:27AM
The idea is that security can work as a magic dust a Joe Sixpack user simply sprinkles over their gadgets, and then no evil spirit can touch them.
Blessed are the poor in spirit... for they can believe anything.
More and more it seems like snark is how the ignorant on soylent self-identify.
Good security implementations make it easier for Joe Sixpack to work more securely – like the fingerprint scanner on the newer iPhones, which is a big step up from how people typically use their phones: no PIN at all. Bad security implementations make it harder for Joe Sixpack to work more securely – like excessive password complexity requirements that make passwords so hard for users to remember that they write them down on a post-it.
This project is pretty clearly about the former.
(Score: 3, Insightful) by c0lo on Thursday September 25 2014, @12:56PM
The extremes: the most secure computer is one that can never be switched on (too bad it's inflexibly unusable), and the most flexible computer is one that does everything for everybody (too bad everybody can do everything with it).
The art of staying as secure as possible but still "alive" involves a constant, dynamic act of balancing between these extremes: believing there could be an application that adjusts this balance for you without mental effort is like believing in an application with a single button labeled "Do it!"
As Mencken put it, "there is always a well-known solution to every human problem — neat, plausible, and wrong." My adage: the existence of this solution does not make the problem any less complex.
(meh... I'm past the age I care)
(Score: 2) by Blackmoore on Thursday September 25 2014, @04:35PM
I get this, but some of the software requires people to be Linux admins, when what we have out there are Joe Sixpack end users.
Sure, some of the cruft MS included in Win7/8 helps them (better than none, I guess), but even if they understand they have a problem, there is very little they can purchase and understand how to install properly.
The Linux side is worse. Most tools expect you to have admin access and installation knowledge, and for a Linux hobbyist like myself some of those are a bit daunting. We've (mostly) gotten past the "just make the kernel with these updates" level of frightening. But a good UI is a good UI – we should be able to do this.
(Score: 0) by Anonymous Coward on Thursday September 25 2014, @07:01PM
You seem to be arguing that if you don't do everything you shouldn't do anything.
You are adding negative value to this discussion.
I'll take these guys over your philosophy of making the perfect the enemy of the good any day.
(Score: 2) by c0lo on Thursday September 25 2014, @10:06PM
I'm arguing that if you don't know anything and you want to do something, you'd better start learning. Otherwise, chances are you'll screw yourself beyond recognition: there are many ways a thing can go wrong and relatively few ways to do it properly.
(Score: 2) by edIII on Friday September 26 2014, @04:08AM
This is an appeal to emotion. Yes, a great number of people are not buying into the idea that security needs to be paid for. It's saddening when they state that it shouldn't be so hard.
Simplysecure.org is stepping up, asking people to get involved, and attempting to work at it. That seems to be what you want: someone paying for the lunch. Well, they're trying.
I believe that to be a false dichotomy. The most secure computer is one in which every operation, no matter how small, is subject to a full analysis of its impact everywhere else in the system, and complex logic is used to process these events. The most flexible computer is the one with the most code, or the most understanding of how to accomplish the tasks being asked of it, including being asked.
The knowledge of how to accomplish a task is never mutually exclusive with the knowledge of how to do it securely. Flexibility is divorced from security, and relates to it only in terms of money, or the acknowledgment of a choice: I choose a lower state of security, not from ignorance, but from lack of resources.
There is nothing to balance it against. While I do agree the user needs to choose how they use the computer, it's neither impossible nor unknown how we might create information systems with data diodes – the ability to understand and control the flow of information.
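The data diode idea can be sketched in a few lines. This is a toy illustration only; the class and method names below are invented for this comment, not taken from any real product or library:

```python
# Toy sketch of a software "data diode": data may flow from the low
# (untrusted) side to the high (trusted) side, but never back.
# All names here are illustrative inventions, not a real API.

class DataDiode:
    def __init__(self):
        self._queue = []

    def send_low_to_high(self, item):
        # The low side may only write into the diode.
        self._queue.append(item)

    def receive_on_high(self):
        # The high side may only read; returns None when empty.
        return self._queue.pop(0) if self._queue else None

    def send_high_to_low(self, item):
        # Reverse flow is structurally forbidden, not merely discouraged.
        raise PermissionError("data diode: high-to-low flow is not permitted")
```

The point is that the restriction lives in the structure itself: there is simply no code path for data to travel the wrong way.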
It's not a matter of flexibility for us to do this. It's a matter of CPU resources, distributed shared memory, heuristics, topological analysis, pattern recognition, deep packet inspection, social engineering, social awareness, etc.
To say it's a complete flight of fancy to imagine computer systems in 20 years that can more or less understand what data you are working with, and prevent casual errors, is unsupported.
On this we agree: the problem is complex. It's not as impossible as you make it out to be, though.
This sounds like at least some sort of progress.
Technically, lunchtime is at any moment. It's just a wave function.
(Score: 3, Insightful) by c0lo on Friday September 26 2014, @05:05AM
No, it's not (or at least I didn't intend it to be).
My points, to be clear:
I can't agree with you; I think there is a need to balance between the two, no matter how you implement it (diodes, distributed shared memory, heuristics, topological analysis, pattern recognition, deep packet inspection, social engineering, social awareness, etc.).
Furthermore, Simply Secure must admit that the security it aims to deliver (or just help deliver) needs to include Joe Sixpack's education. "You can't fix stupid" holds true even if by "stupid" you mean not intellectual capability but ignorance: if Joe Sixpack chooses to stay ignorant in security matters, then no matter how simple or wonderful your security applications are, they fail.
Otherwise, best of luck from me: I'm aware I don't know enough to contribute to the project, even if I know enough to see that the security area is complex.
In other words, my level of knowledge makes me think the project is hitting "BS bingo" in two steps by promoting a straight "simple security" oxymoron. But... I admit I'm no master.
(Score: -1, Troll) by Anonymous Coward on Thursday September 25 2014, @05:32AM
If you use Intel, you have a VNC client onboard which can be accessed via the network or out of band via 3G.
It shows whatever's on your screen.
(vPro, VT, whatever they're calling it now)
This is so that they can catch those people who like young girls.
The feminist states of america can't have any old believers around.
Ofcourse you people here should be for it: you are pro-women's rights.
Can't have men who'd like to marry young female children not locked up.
(Score: 2, Interesting) by Gravis on Thursday September 25 2014, @05:57AM
You can't just slap on a fancy app and call your system secure; it has to be secure from the inside out. That means designing the OS AND the software with security in mind. Commercial software rarely considers security to be an issue, so you should assume anything closed-source is exploitable. Open source isn't much better, but at the very least you can check whether it's secretly phoning home or needlessly exposing itself. D-Bus is a great example of software with no notion of security.
Software should be written with the assumption that a hostile program is actively running and monitoring the system.
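One concrete habit that follows from that assumption: never let a secret touch the filesystem with permissive modes, even for an instant. A minimal POSIX sketch in Python (the function name is mine, not from any library):

```python
# Sketch: persist a secret while assuming hostile processes share the machine.
# The file is created owner-only (0o600) from the very first instant, fails
# if it already exists, and refuses to follow a planted symlink.
import os

def write_secret(path, data):
    flags = os.O_WRONLY | os.O_CREAT | os.O_EXCL | getattr(os, "O_NOFOLLOW", 0)
    fd = os.open(path, flags, 0o600)  # no group/other access, ever
    try:
        os.write(fd, data)
    finally:
        os.close(fd)
```

There is no window here where another local user can open the file: the restrictive mode is applied atomically at creation, rather than chmod-ed afterwards.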
(Score: 0) by Anonymous Coward on Thursday September 25 2014, @06:47AM
I'd rephrase that as:
"... designing the OS AND software AND ALL firmware with security in mind ... assume anything with closed-source firmware running underneath it is exploitable. Open source isn't much better, and unfortunately you'll never be able to check if its closed-source network interface firmware is secretly phoning home or needlessly exposing itself."
(Score: 2, Interesting) by MichaelDavidCrawford on Thursday September 25 2014, @06:04AM
It will put the fear of God into you.
I recommend the Forum on Risks to the Public in Computers and Related Systems [ncl.ac.uk].
Security and privacy are difficult problems.
Yes I Have No Bananas. [gofundme.com]
(Score: 1, Insightful) by Anonymous Coward on Thursday September 25 2014, @11:30AM
I've been reading RISKS for about 20 years and recommend it highly. It didn't put the fear of God into me, but it did teach me that a good engineer thinks about how to make a system fail even more than about how to make it work. It's the difference between robust security and brittle security.
(Score: 2) by kaszz on Thursday September 25 2014, @02:07PM
I'll concur on this.
(Score: 4, Interesting) by TheLink on Thursday September 25 2014, @08:05AM
OK so how much privacy would a "normal person" need? What is the scope of the problem?
If a "normal person" is in a situation where he or she really _needs_ something like Tor, either you give them a tool that limits what they can do, or that restricts what they can post to safe "canned responses", or you have to train them quite rigorously.
It doesn't matter if you just tell people how to use Tor and other tools/devices properly. Most people require training AND practice before they can reliably avoid mistakes. You can tell and show them and they go "yeah", etc. But they might accidentally log in to their public webmail in their Tor browser before/while/after doing something else they don't want linked to them. Or try to post something anonymously but sign off with their real name... Or use phrases, writing styles, or spelling mistakes/preferences that can help identify them.
So no matter how good the tools are it may not be enough to teach them how to use the tools. You may still have to teach them to think and behave in a certain sort of way. It's like good surgeons/divers/drivers/pilots have good habits. Those habits don't appear after 5 minutes or even an hour of instruction.
Anyway here are two of my suggestions:
1) https://bugs.launchpad.net/ubuntu/+bug/148440 [launchpad.net]
2) https://bugs.launchpad.net/ubuntu/+bug/156693 [launchpad.net]
1) The best way to have encrypted secrets and not be bothered about them is for everyone to have encrypted containers, whether they know it or not.
2) This should make it easy for admins/users to isolate apps and thus limit the damage. Then you can have one isolated browser instance for banking, one for shopping, one for crap, one for work, etc.
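Suggestion 2 can be roughly approximated today, even without new OS support, by giving each task its own browser profile directory so state from "banking" never mixes with "shopping". A sketch under assumptions: the function name is mine, and it presumes a browser (Firefox here) that accepts --profile and --no-remote:

```python
# Sketch: one browser profile directory per task, so cookies, cache and
# history never cross between tasks. Assumes Firefox's --profile and
# --no-remote command-line options; adjust for your browser.
import pathlib

def isolated_browser_cmd(task, base="~/browser-profiles"):
    profile = pathlib.Path(base).expanduser() / task
    profile.mkdir(parents=True, exist_ok=True)
    return ["firefox", "--no-remote", "--profile", str(profile)]

# e.g. subprocess.Popen(isolated_browser_cmd("banking"))
```

This is weaker than real OS-level isolation (a compromised browser process can still read the other profile directories), but it blocks the everyday cross-site state leakage the comment describes.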
(Score: 0) by Anonymous Coward on Thursday September 25 2014, @09:59AM
Any relation to Cory Doctorow?
(Score: 2) by Geezer on Thursday September 25 2014, @10:20AM
Dreaming the impossible dream again, are we?
Two huge obstacles stand in the way.
The obvious one is that, as a rule, anything "user-friendly" for school kids, grandmothers, and truck drivers is going to be easily cracked. Oh, hello Windows!
The other is more insidious. Just about every nation-state with electricity is either in cahoots with, in competition with, or envious of the NSA spy machine. That's just the nature of power-mad bureaucrats and politicians. It is child's play for any government to drop an NSL (national security letter, or whatever the local regime calls its secret orders) stipulating that, as a condition of doing business, Company X must create or provide surreptitious surveillance capability.
The solution has to be political. It's too easy for governments, left unguarded, to circumvent technology with legal brute force.
(Score: 2) by kaszz on Thursday September 25 2014, @02:18PM
You can always make use of users' self-interest. Any solution that outsources your security to a third party will be vulnerable. That means secrets must be protected by means of the endpoints.
(Score: 2, Insightful) by WillR on Thursday September 25 2014, @05:12PM
Any solution that doesn't outsource your security will also be vulnerable, unless you're a black belt master of sysadmin-fu for every piece of software you're using (including ones you don't even have documentation for, like the baseband firmware in your phone). Security is hard.
(Score: 2) by kaszz on Friday September 26 2014, @02:48AM
Yeah, the baseband firmware is a giant loophole right in front of you. It's a pain to do anything about it however.
(Score: 3, Interesting) by mtrycz on Thursday September 25 2014, @12:34PM
I've tried out uTox with a friend and it just works. No, really, it just works. It's faster to set up than Skype, even.
I was searching for a Skype alternative after Microsoft "clicked" an account-activation link two colleagues exchanged via Skype. Creepy.
We're keeping it for the time being; we'll see how things unwind. I'd welcome it if anybody here has insight into potential problems with its theory or implementation.
In capitalist America, ads view YOU!
(Score: 2) by wonkey_monkey on Thursday September 25 2014, @12:57PM
Yes, yes it is. Why so unsure?
systemd is Roko's Basilisk
(Score: 0) by Anonymous Coward on Thursday September 25 2014, @04:16PM
SpiderOak's UI is easy to use and understand for normal folks, and it offers actual security. So if everyone who wanted to use cloud storage used them, the world would be a better place. I have no affiliation with the company; I just admire it.