Riana Pfefferkorn, a Cryptography Fellow at the Center for Internet and Society at Stanford Law School, has published a whitepaper on the risks of so-called "responsible encryption". This refers to the inclusion of a mechanism giving law enforcement exceptional access to the cleartext content of encrypted messages. It also goes by the names "back door", "key escrow", and "golden key".
Federal law enforcement officials in the United States have recently renewed their periodic demands for legislation to regulate encryption. While they offer few technical specifics, their general proposal—that vendors must retain the ability to decrypt for law enforcement the devices they manufacture or communications their services transmit—presents intractable problems that would-be regulators must not ignore.
However, with all that said, much more is said than done. Others would make the case that active participation in the democratic process is needed from people knowledgeable in the use of actual ICT. As RMS has pointed out many times, much to the chagrin of more than a few geeks, "geeks like to think that they can ignore politics; you can leave politics alone, but politics won't leave you alone." Again, participation is needed rather than ceding the whole process, and thus its outcome, to the loonies.
Source: New Paper on The Risks of "Responsible Encryption"
Related:
EFF : New National Academy of Sciences Report on Encryption Asks the Wrong Questions
Great, Now There's "Responsible Encryption"
(Score: 4, Interesting) by JoeMerchant on Sunday February 18 2018, @02:50PM (13 children)
If you care about keeping a secret, the only real answer is to DIY the encryption - learn as much as you feel you need to about the algorithms, implement them yourself, and try not to copy exactly something that's in mainstream use and likely to be broken.
This is not saying: make up your own stuff from scratch and hope it sticks... this is saying: research the methods that have been proven, roll your own implementation from vetted published solutions, and include enough variation that when a successful attack method for the common implementations inevitably gets released it won't work on your implementation.
Or, pick one of these and pray: https://www.techrepublic.com/blog/five-apps/five-free-and-secure-messaging-tools/ [techrepublic.com]
It's always a tradeoff between convenience and security - rolling your own does come with a high inconvenience cost, and a risk that if you are sloppy you'll be insecure anyway, but if you're not a high value target then the effort required to practically secure your own communications is pretty low.
🌻🌻🌻 [google.com]
(Score: 4, Insightful) by NotSanguine on Sunday February 18 2018, @05:52PM (4 children)
I disagree. How does the old saw go? "Three can keep a secret, if two of them are dead."
I'd add that often, not even that is enough.
Sure, you can roll your own encryption tools and share binaries (via encrypted, out-of-band channels) with trusted parties. Assuming you use sufficiently large key sizes [stackexchange.com], that will almost certainly keep prying eyes from decrypting any messages sent/received while in transit. However, the same can be said for current TLS implementations.
This, IMHO, argues for ubiquitous encryption of *all* network traffic, significantly increasing the complexity of compromising *specific* encrypted communications via wholesale network traffic captures.
That said, in order for such trusted parties to usefully interact with such messages, those parties must at least have the capability to decrypt them. That opens up a raft of potential vectors for compromising the confidentiality of those messages.
What's more, if one of those trusted parties is targeted [xkcd.com], confidentiality is almost certainly suspect.
Are things quite so dire for most of us? Probably not. But given the state of current technology, some form of coercion (warrants, violence/threat of violence, drugs, bribery/extortion, etc., etc.) is almost certainly the weakest link in the chain, not a lack of secure encryption tools.
No, no, you're not thinking; you're just being logical. --Niels Bohr
(Score: 2) by JoeMerchant on Monday February 19 2018, @03:45AM (1 child)
The $5 pipe wrench, artfully applied to the holder of a secret key, is indeed the most efficient method of decoding many secrets.
The real art in secret communication is not letting anyone know who you are communicating with in the first place. Or, not communicating anything secret at all - if no one can tell the difference, then you're doing it right.
🌻🌻🌻 [google.com]
(Score: 2) by NotSanguine on Monday February 19 2018, @04:22AM
An excellent point.
Given the complexity (which I discuss a bit below) of obfuscating the participants in a particular communication in the current environment, especially for folks who are unlikely to be targeted, I submit that a strategy of strongly encrypting *all* communications, whether they communicate sensitive information or not, is more achievable on a large scale. Sadly, that's not very likely, given the state of the software ecosystem enabling such communications.
I imagine I could undertake a survey of Craigslist ads and posts on sites like 4chan, reddit, and a raft of other sites that allow anonymous comments, in an attempt to identify covert (or potentially not-so-covert) communications channels, with a reasonable chance of success.
Even without performing such a survey, I'm certain that such communications, while perhaps not common, are used in the same way that classified ads in newspapers were used for covert communications in previous decades.
In fact, I assume that intelligence gathering agencies already scan all those sites and more in an attempt to identify such communications.
In some cases, that would be *more* secure than using encrypted emails/chat/messaging apps, given the risks associated with local system/app/server related compromises.
However, those covert channels have their own set of issues WRT cipher distribution, mis-identification of messages and timing, among other things.
Unless and until we have protocols and tools that can, relatively seamlessly, ensure confidentiality and integrity, one can either keep sensitive information to oneself, or meet trusted parties in isolated, soundproofed faraday cages to discuss such things.
No, no, you're not thinking; you're just being logical. --Niels Bohr
(Score: 2) by JoeMerchant on Monday February 19 2018, @03:55AM (1 child)
Well, one solution for the "two can keep a secret" problem is for all parties to communicate 1:1 using each others' public keys. If one (or more) parties are sloppy with their keys, only messages addressed to the poor key keeper are compromised. This is just as unavoidable as the sloppy party re-posting the decrypted content in public - you can't stop a bad actor, but you can limit what you share with them.
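A minimal sketch of that 1:1 pattern, assuming Python with the PyNaCl library (my choice for illustration; nothing in the thread prescribes it) - each message is encrypted separately to each recipient's Curve25519 public key, so a leaked private key only exposes the copies addressed to its owner:

# Sketch only: per-recipient public-key encryption with PyNaCl (pip install pynacl).
# The library and all names here are illustrative assumptions, not from the thread.
from nacl.public import PrivateKey, Box

# Each party generates a long-term keypair and publishes the public half.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()
carol_key = PrivateKey.generate()

recipients = {"bob": bob_key.public_key, "carol": carol_key.public_key}
message = b"meet at the usual place"

# Encrypt the same message separately for each recipient (1:1, never a shared group key).
ciphertexts = {
    name: Box(alice_key, pub).encrypt(message)
    for name, pub in recipients.items()
}

# If Carol is sloppy with her private key, only the copy addressed to her is exposed;
# Bob's copy is still protected by his own key.
print(Box(bob_key, alice_key.public_key).decrypt(ciphertexts["bob"]))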
🌻🌻🌻 [google.com]
(Score: 2) by NotSanguine on Monday February 19 2018, @04:33AM
Absolutely. Unfortunately (as I pointed out in my reply [soylentnews.org] to your previous comment), the software ecosystem that would need to support widespread use of asymmetric key encryption is sorely lacking in the features that could engender widespread use.
Choosing the "wrong" (whether they be incompetent, unprincipled, stupid or otherwise "bad actors") folks with whom to communicate sensitive information goes far beyond digital communications, as is evidenced by (I'm sure there's at least one in your circle) that person(s) who can't help but tell everyone the stuff you reveal to them in confidence.
No, no, you're not thinking; you're just being logical. --Niels Bohr
(Score: 0) by Anonymous Coward on Sunday February 18 2018, @08:10PM
I disagree about the inconvenience cost. If you're capable of rolling your own, this is by far the most convenient way to get just about anything done, because you don't depend on arbitrary third parties to address any issues that arise, nor do you have to wade through opaque, messy source code someone else wrote.
(Score: 4, Informative) by frojack on Sunday February 18 2018, @09:14PM (6 children)
The first thing every one of the people I trust in the crypto-sphere says is NEVER ROLL YOUR OWN.
Then along comes Joe.
No, you are mistaken. I've always had this sig.
(Score: 2) by pipedwho on Sunday February 18 2018, @09:56PM (4 children)
This is absolutely true. However, Joe did qualify his advice.
There are various parameters and things you can do with an open source implementation of something that you otherwise trust. For example, at the lowest level, let's say you're paranoid and worry about a weakness in AES (ignore that it has some proven security bounds), so you decide to add some extra rounds to your 'variant'. That change is within the design parameters of AES and will not only add security (at the cost of performance, which may not matter to you), but also very likely 'break' any mass-deployed automated cracking scripts/tools that assume the protocol is just using standard AES.
You might also decide to add an 'obfuscation' protocol (based on otherwise secure crypto) on top of the existing 'trusted' solution. For example, you add some additional fixed encryption using AES (or Twofish, Serpent, etc) on top of the already effective protocol elements (especially the key exchange section). You might even just encode the fixed key constant in your app, creating an 'obfuscation' layer more than anything else. However, if the 'trusted' App falls to some exploit that allows a trojan or worm to 'mass harvest' messages from people communicating with the original App, your variant is still 'safe'. Yes, someone could analyse and easily crack that too. But, it's a lot of manual effort for a single payoff. And the mass deployed harvesting tools can't 'see' inside the encrypted data, so it just looks like garbage (or another 'unknown' protocol).
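To make that concrete, here's a rough sketch of such an outer 'obfuscation' wrapper, assuming Python with PyNaCl's SecretBox (the library and names are my own illustration): the already-encrypted output of the trusted tool is encrypted once more with a fixed key baked into your build, so harvesting tools written for the stock protocol only see opaque bytes.

# Sketch only: a fixed-key 'obfuscation' wrapper around ciphertext that a trusted
# tool (PGP, etc.) has already produced. Library and names are illustrative.
import hashlib
from nacl.secret import SecretBox

# Fixed 32-byte constant compiled into your variant of the app. This is obfuscation,
# not real added secrecy: anyone with your binary can recover it.
APP_OBFUSCATION_KEY = hashlib.sha256(b"constant baked into this build").digest()

def wrap(trusted_ciphertext: bytes) -> bytes:
    # Encrypt the trusted tool's output once more with the app-wide fixed key.
    return bytes(SecretBox(APP_OBFUSCATION_KEY).encrypt(trusted_ciphertext))

def unwrap(wrapped: bytes) -> bytes:
    # Strip the obfuscation layer, leaving the original trusted ciphertext intact.
    return SecretBox(APP_OBFUSCATION_KEY).decrypt(wrapped)

# The inner ciphertext comes from the vetted tool; this layer never touches its keys.
inner = b"...output of the trusted, vetted encryption tool..."
assert unwrap(wrap(inner)) == inner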
All you've done is taken an open source tool such as PGP and made a modification that is described by cryptographic literature as otherwise 'valid' or 'safe'. Whether or not it makes much difference (or even adds any additional 'security' in the classical sense) is irrelevant to what Joe was alluding to.
But, you are right, people with no experience with cryptographic implementation should definitely avoid rolling their own.
(Score: 3, Insightful) by JoeMerchant on Monday February 19 2018, @01:58AM
Thank you - I try not to spell things out too pedantically; it takes too long to read (and write).
IMO, the most powerful security is a combination of the best available algorithms plus obscurity. If the crackers don't know what they're dealing with, it will take actual (expensive) human brain power to try to break it, and unless you're a top priority target that's not likely to happen.
What is likely to happen is the cracking of a widely used tool (or app) and the subsequent mass harvesting of all traffic that passed through it - that will catch all kinds of people who weren't even on anyone's radar, until a mass harvester trips on a few keywords in their archived communications streams from years gone by.
🌻🌻🌻 [google.com]
(Score: 2) by Wootery on Tuesday February 20 2018, @03:34PM (2 children)
So the advice essentially boils down to: become an expert cryptographer, then roll your own.
That advice is... unhelpful.
What's really wrong with just implementing an algo like Curve25519?
Or just use a proper algorithm like Curve25519 and don't pretend that amateur-hour band-aids are the solution.
(Score: 2) by pipedwho on Wednesday February 21 2018, @08:22AM (1 child)
Because you're just as likely to stuff up your implementation of Curve25519 if you're not an experienced cryptography programmer anyway. So it's not particularly useful to use a single protocol element without considering the rest of the cryptographic system design.
If you want to use Curve25519 just find some trusted Open Source project that already uses it.
Or better yet, add it 'in series' with an already trusted/vetted Open Source project, so you have something that is resistant to automated mass data-vacuum scripts that might target the public version of your chosen tool. Even if you end up with a ham-fisted effort that turns out to be insecure, at least you can fall back on the trusted, secure implementation you used as the basis of your modified application.
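As a sketch of what 'in series' could look like (again assuming Python and PyNaCl's Curve25519 SealedBox purely for illustration, with an extra keypair that is completely separate from the trusted tool's own keys): the trusted tool's ciphertext gets wrapped in one more layer before it goes on the wire, so a harvester built for the public tool's format sees only noise, and a botched wrapper still leaves the inner, vetted layer intact.

# Sketch only: an additive Curve25519 layer around a trusted tool's ciphertext,
# using PyNaCl's SealedBox. The wrapper has its own keypair and never touches
# the key material of the inner, vetted implementation. Names are illustrative.
from nacl.public import PrivateKey, SealedBox

# Extra keypair used only for the outer layer; exchanged out of band.
outer_key = PrivateKey.generate()

def add_outer_layer(trusted_ciphertext: bytes) -> bytes:
    # Wrap already-encrypted data in one more anonymous Curve25519 layer.
    return SealedBox(outer_key.public_key).encrypt(trusted_ciphertext)

def strip_outer_layer(wire_bytes: bytes) -> bytes:
    # Remove the outer layer, recovering the trusted tool's original ciphertext.
    return SealedBox(outer_key).decrypt(wire_bytes)

inner = b"...ciphertext produced by the trusted Open Source tool..."
assert strip_outer_layer(add_outer_layer(inner)) == inner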
So, the advice is don't try to roll your own. If you do, play it safe and do it in an additive way that doesn't touch or use any key material from the trusted implementation that you're modifying. If you have done a lot of research on the topic, then feel free to make some more subtle but secure changes to the parameters (if you don't understand what this means, you haven't done enough research, and are not ready to change anything).
I've seen far too many amateur-hour, totally insecure implementations that use otherwise secure algorithms like AES, RSA, Curve25519, SHA, etc., but do something stupid (usually a cascade of equally dumb things) that ends up pretty much voiding any security those algorithms offered. It's better than back in the day, when it was common for people and companies to try to roll their own low-level crypto algorithms. At least these days we have some good building blocks to use from the likes of NIST, and if you don't trust NIST, then Dan Bernstein.
(Score: 2) by Wootery on Wednesday February 21 2018, @09:40AM
Ultimately I agree, of course, but we were putting on our tin-foil hats and pretending we couldn't trust any existing implementations. Perhaps the best answer to that hypothetical is simply that, in that case, all is already lost.
The horror, the horror. A proprietary 4096-bit crypto scheme sounds great on paper, to a clueless pointy-haired boss at least, but as you say, it's like saying you've given your security guards hand-made 11mm handguns.
(Score: 2) by JoeMerchant on Monday February 19 2018, @01:52AM
See, that's where you need to learn to read, and understand, the whole first two sentences before reacting:
Take working copies of good implementations and roll them into your own layered solution, keeping the good bits but staying incompatible with the mainstream released methods that will inevitably be hacked on until they fall.
Or, you could just run mainstream tools, patch regularly, and pray that they release lockouts for the "bad guys" fast enough, and that the "good guys" who have the back-door keys use them responsibly... that's the first thing every one of the people who represent mainstream cybersecurity best practices encourages everyone to do. Watching Adobe Flash play whack-a-mole with Eastern European BBC pirates was enough to convince me that that merry-go-round is not running for my benefit.
🌻🌻🌻 [google.com]