Approximately two weeks ago, Open Whisper Systems announced the merger of its two Android apps, RedPhone (secure calling) and TextSecure (encrypted messaging), into one: Signal for Android. This is a counterpart to Signal for iOS, created by the same team. A Chrome extension is forthcoming.
Signal has been getting a lot of love from the security community (Snowden, Schneier, etc.), specifically for its user-friendliness, something whose absence has held back the adoption of other crypto software.
The encrypted messaging protocol appears to be a version of OTR modified for asynchronous mobile environments. Versions of it have also been deployed in CyanogenMod (as WhisperPush) and in WhatsApp.
Their blog has a lot of nerdy crypto detail for those interested; see, for example, the posts on deniability, forward secrecy, and the calling network.
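To give a flavor of what forward secrecy means in practice, here is a minimal, purely illustrative hash-ratchet sketch in Python. This is NOT the actual Signal/Axolotl construction (the real protocol combines a Diffie-Hellman ratchet with a symmetric-key ratchet and its own KDF); the function name and the `0x01`/`0x02` domain-separator bytes are assumptions made up for this example. It only shows the core idea: each message gets a fresh key, and the chain state is advanced through a one-way function, so compromising today's state does not reveal yesterday's keys.

```python
import hmac
import hashlib


def ratchet_step(chain_key: bytes) -> tuple[bytes, bytes]:
    """Derive a one-time message key, then advance the chain key.

    Both outputs come from HMAC-SHA256 over the current chain key with
    different constant inputs (arbitrary domain separators for this
    sketch). Because HMAC is one-way, an attacker who steals the
    *current* chain key cannot run the ratchet backwards to recover
    keys for messages already sent -- that is forward secrecy.
    """
    message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    next_chain_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return message_key, next_chain_key


# Stand-in for a shared secret established by an initial DH handshake.
chain = b"\x00" * 32

keys = []
for _ in range(3):
    mk, chain = ratchet_step(chain)  # fresh key per message, state advances
    keys.append(mk)

# All three message keys are distinct, and none equals the live chain state.
assert len(set(keys)) == 3
assert chain not in keys
```

In the real protocol, periodic Diffie-Hellman exchanges additionally "heal" the session after a compromise; this sketch covers only the symmetric half of that design.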
All of their code is open source, and the project is funded by donations, which can also be made in bitcoin. Accepted pull requests get a payout through another of their projects, Bithub (code).
(Score: 5, Insightful) by melikamp on Friday November 20 2015, @05:53PM
I am getting a similar impression: not a lot of care for either user freedom or user security.
Seriously, let's have a discussion, because sometimes I feel like either I or the whole world is going crazy, and either possibility is really bad :) What can we infer about a security solution provider that never once mentions a giant, by-design security hole in its application? In this case, it's the fact that all the target operating systems are adversarial (because they are non-free). How does the Signal app cope with the operating system, or any of its privileged apps, logging keystrokes? It doesn't. The unstated assumption is that the OS is secure, which would be reasonable for a free GNU/Linux installation, for example. But commercial deployments of Android offer neither security nor privacy: they are rooted by parties other than the user, often in an exploitative manner. The same goes for iOS. These are basic facts of life: non-free systems spy on their users. They do so because it's profitable, it's not illegal, and many users are so out of touch with security issues that they don't quite understand why this is bad, so they consent to it.

The so-called security professionals who gloss over these facts really make me wonder. Don't they owe the user at least a warning? In big, bold, red letters: THIS APP ONLY RUNS ON NON-FREE PHONES, WHERE THE OS MAY BE LOGGING ALL KEYSTROKES AND SCREEN INFORMATION, AND THERE IS NO WAY TO KNOW WHEN IT DOES THAT OR TO MITIGATE THE LOSS OF PRIVACY IN ANY WAY. Because this is what's going on, right? Shouldn't the user be aware? What would be the reason to gloss over that?
I personally can hardly believe Snowden recommended the app to people, although his actual words may have been mangled by the interview process. I myself could sign on to "IF you use Android, THEN Signal is the most secure option," but I would also add "which doesn't mean much on an Android-based cell phone, but hey, everything else is even worse." And Schneier I actually spoke to, and he said he uses an Apple phone. OK, I guess he is being consistent with his own advice, but now I am losing my grip on what Schneier means by privacy.
Am I completely out of touch, people? What is the freaking point of building security and/or privacy solutions on top of non-free platforms? The whole idea of a non-free platform is to strip security, privacy, and control privileges from the user. None of the security features can be guaranteed there. Why are we wasting resources on this crap? Why is there a Windoze build of TOR, for example? Just so that attackers can snoop on TOR communications through the backdoors in Windoze? I am not seriously suggesting that the TOR developers are screwing over their users on purpose, but there is a blindness to the issue here, and I find it scary.
(Score: 0) by Anonymous Coward on Friday November 20 2015, @06:34PM
I agree with you to some extent, but there is a fundamental problem here: most users don't want to give up convenience for privacy and security. Oftentimes non-free proprietary software is more convenient at the moment (not to mention pushed on people by multi-million-dollar ad campaigns, which of course do not inform users of what they're really getting), so people use that. Still, even under these circumstances, there's a chance that things like TOR could help, even if users would be better off with free platforms. Perfection is the enemy of... slightly better.
(Score: 5, Insightful) by melikamp on Friday November 20 2015, @08:05PM
You know, that's great, and probably true: we can reasonably suppose that using TOR on Windows (or Signal on iOS) does increase privacy and security, even though the end result is still really, really bad. What stumps me is: where's the admission? Where's the fair warning? Where's an honest effectiveness assessment from the devs or the security experts? If they want to spend their time developing "security" solutions for spy-phones, I can't complain, but why pretend those solutions work? Just tell users how it is. The sooner users are aware of the basic facts, the sooner they will push legislators to marginalize the whole damn non-free software ecosystem.