
SoylentNews is people

posted by on Wednesday January 25 2017, @11:22AM   Printer-friendly
from the ROT-13-is-too-secure dept.

Like other politicians and government officials, President Trump's nominee for the position of Attorney General, Jeff Sessions, wants to have it both ways when it comes to encryption:

At his confirmation hearing, Sessions was largely non-committal. In his written responses to questions posed by Sen. Patrick Leahy, however, he took a much clearer position:

Question: Do you agree with NSA Director Rogers, Secretary of Defense Carter, and other national security experts that strong encryption helps protect this country from cyberattack and is beneficial to the American people's digital security?

Response: Encryption serves many valuable and important purposes. It is also critical, however, that national security and criminal investigators be able to overcome encryption, under lawful authority, when necessary to the furtherance of national-security and criminal investigations.

Despite Sessions' "on the one hand, on the other" phrasing, this answer is a clear endorsement of backdooring the security we all rely on. It's simply not feasible for encryption to serve what Sessions concedes are its "many valuable and important purposes" and still be "overcome" when the government wants access to plaintext. As we saw last year with Sens. Burr and Feinstein's draft Compliance with Court Orders Act, the only way to give the government this kind of access is to break the Internet and outlaw industry best practices, and even then it would only reach the minority of encryption products made in the USA.

Related: Presidential Candidates' Tech Stances: Not Great


Original Submission

 
  • (Score: 5, Insightful) by meustrus on Wednesday January 25 2017, @02:09PM

    by meustrus (4961) on Wednesday January 25 2017, @02:09PM (#458483)

    The fundamental problem with treating digital encryption like physical locks - which would lead to law enforcement having the right to break in - is the same as the fundamental problem facing everything digital: scale. The police can, with a lot of time, expense, and visibility, break any safe. But there is no way for the police to break encryption with a lot of time, expense, and visibility. There is only a way for the police to break encryption quickly, cheaply, and invisibly.

    That's just how computers work. Encryption, which must keep information safe wherever it is copied, has to accomplish a task that is impossible in physical space in order to be useful. And achieving that task - remaining difficult to break even when the data sits on high-end hostile computer systems - necessarily prevents the police from gaining access.

    That brings us to the problem of scale. Because breaking any encryption means breaking lots of encryption quickly, cheaply, and invisibly, there is no effective way to limit the backdoor to one lock at a time. It will be used at scale to break into lots of things really quickly. It's like the police want to be able to break into a safe from brand X, but in order to do it, they are going to be able to break into every safe from brand X at the same time. Not only that, but any enterprising criminals who found out how the cops did it would be able to do the same thing. The only thing keeping jewel thieves out of every safe is that breaking into them requires time, can be noisy, and must be done behind physical security. Hackers have all the time they need and can make all the noise they want because they can break through security, make a very portable copy of the information, and run off with it to break in on their own time.
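    The "break one safe, break every safe" point can be sketched in a few lines of Python. This is a toy (XOR key-wrapping and invented names, not a real escrow protocol), purely to show why a single master secret fails at digital scale:

    ```python
    # Toy illustration (not real cryptography) of the scale problem:
    # if every user's key is escrowed under one master key, leaking that
    # single value opens every "safe" at once, quickly and invisibly.
    import os

    MASTER_KEY = os.urandom(16)  # the single law-enforcement secret

    def wrap(user_key: bytes) -> bytes:
        """Escrow a user key by XOR with the master key (toy only)."""
        return bytes(a ^ b for a, b in zip(user_key, MASTER_KEY))

    def unwrap(escrowed: bytes) -> bytes:
        """Anyone holding MASTER_KEY recovers the user key instantly."""
        return bytes(a ^ b for a, b in zip(escrowed, MASTER_KEY))

    # Three users, one escrow database.
    users = {name: os.urandom(16) for name in ("alice", "bob", "carol")}
    escrow_db = {name: wrap(key) for name, key in users.items()}

    # One leaked master key recovers *every* key at the same time -
    # unlike a physical master key, copying it costs nothing and leaves no trace.
    recovered = {name: unwrap(e) for name, e in escrow_db.items()}
    assert recovered == users
    ```

    The physical-world analogue would be a locksmith whose one skeleton key, once photographed, opens every safe of that brand everywhere, simultaneously, with no broken hinges to notice.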

    Unfortunately this fundamental difference between the physical and digital worlds is not something that the average person understands. We live in the physical world. Only the math and software nerds understand the implications of the digital world. And there has been very little effort to explain the fundamental differences in such simple terms.

    We are likely to see this problem keep coming up every time our politicians represent the understanding of the population at large rather than the understanding of the elite. Much as I dislike this administration and assert that they do not represent the majority of the population, politically speaking they must represent the majority understanding on most issues. It's a simple matter of messaging - Republicans are only willing to accept declarations from the elite when they sound like common sense, whereas Democrats are willing to accept declarations that appear to come from experts even when they conflict with common sense. And in this case, the reality definitely conflicts with common sense, because common sense is based on the physical world.

    I don't expect Democrats to fix this problem. Obama made it worse. Law enforcement experts tend to override cryptography experts when it comes to law enforcement, so relying on expert opinion simply isn't a viable strategy. Our only hope is to make Republican voters - and right-leaning movements all over the world - understand intuitively how the digital world works. Because ultimately, if they understood what encryption backdoors meant, they would oppose them as government overreach.

    --
    If there isn't at least one reference or primary source, it's not +1 Informative. Maybe the underused +1 Interesting?
  • (Score: 2) by Non Sequor on Wednesday January 25 2017, @04:00PM

    by Non Sequor (1005) on Wednesday January 25 2017, @04:00PM (#458513) Journal

    I'm going to play devil's advocate here.

    Let's say you have a cryptographic system where there is an algorithm that uses a publicly available constant in the process. This algorithm would be framed so that a private constant can be used to reduce the complexity of cracking a key to something that could be accomplished with a very large supercomputer. Maybe new constants are published at regular intervals to reduce issues with leaks of the private constant and maybe encryption users could regularly decrypt and re-encrypt using the newer constant to prevent older encrypted data from becoming less secure over time.

    Now suppose that use of this scheme was mandated, enforced by a fine, with exemptions for encryption used for health records, sales transactions, and messages under attorney-client privilege. Messages and personal data are under the mandate, but personal encryption users would be subject to at most a fine for violating the mandate and not subject to criminal prosecution. Service providers offering encrypted storage or messaging, on the other hand, likely would not be able to risk the fine.

    This is probably what the middle ground looks like. A pure backdoor system where the authorities can access any encrypted information at any time is a straw man of what the other side wants. They may actually even say they want it, but when you come down to it, they're mistaken on what they want, because they'll definitely want data security for certain pragmatic purposes and they'll grudgingly concede certain explicit civil rights uses such as attorney-client privilege if pressed hard enough.

    If a serious middle ground proposal in this vein is aired, it will get the major tech companies on board and cut the legs out from under the traditional encryption advocates. Of course, that doesn't mean the middle ground proposal will actually work, since there are a lot of other devils in the details. A sloppy execution results in it falling apart, and there is plenty of attack surface in this kind of framework. But the point is that eventually the conversation may change so that rather than arguing from a fundamental position, you're arguing about the expected pragmatic outcomes of a particular plan. That's going to be a much different animal.

    --
    Write your congressman. Tell him he sucks.
    • (Score: 0) by Anonymous Coward on Wednesday January 25 2017, @04:52PM

      by Anonymous Coward on Wednesday January 25 2017, @04:52PM (#458526)

      That sounds exactly like: https://en.wikipedia.org/wiki/Dual_EC_DRBG [wikipedia.org] in that the whole security of the thing depends on two constants being independent. The security of the whole thing goes out the window if there is a valid solution to (Backdoor EC multiplication Secret1) = Secret2. True, solving that inverse operation is hard, but not impossible. And once someone gets it, the whole thing is insecure in that you can predict the random output.
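      For the curious, the shape of that backdoor can be sketched with a toy analogue. Real Dual_EC_DRBG works over elliptic-curve points; this sketch substitutes exponentiation modulo a small prime, and every constant here is invented for illustration - the point is only the algebra, where knowing the secret relation between the two public constants lets you predict all future output from a single observed value:

      ```python
      # Toy multiplicative-group analogue of the Dual_EC_DRBG backdoor.
      # Public constants Q and P, where P = Q^d for a secret trapdoor d.
      p = 2**31 - 1        # small prime modulus (illustration only)
      Q = 7                # first public constant
      d = 123456789        # the secret trapdoor exponent
      P = pow(Q, d, p)     # second public constant, secretly related to Q

      def step(state):
          """One PRNG step: emit r = Q^state, advance state to P^state."""
          output = pow(Q, state, p)
          next_state = pow(P, state, p)
          return output, next_state

      # Honest user generates two outputs from a secret starting state.
      s0 = 42424242
      r1, s1 = step(s0)
      r2, _ = step(s1)

      # Attacker who knows d recovers the next state from one output:
      # r1^d = (Q^s0)^d = (Q^d)^s0 = P^s0 = s1.
      recovered_s1 = pow(r1, d, p)
      predicted_r2 = pow(Q, recovered_s1, p)
      assert recovered_s1 == s1 and predicted_r2 == r2
      ```

      Without d, recovering the state from an output is a discrete-log problem; with d, it is one exponentiation. That asymmetry is exactly what made the suspected NSA constants in Dual_EC_DRBG so dangerous.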

    • (Score: 2) by meustrus on Thursday January 26 2017, @03:42PM

      by meustrus (4961) on Thursday January 26 2017, @03:42PM (#458962)

      Now suppose that use of this scheme was mandated, enforced by a fine, with exemptions for encryption used for health records, sales transactions, and messages under attorney-client privilege. Messages and personal data are under the mandate, but personal encryption users would be subject to at most a fine for violating the mandate and not subject to criminal prosecution. Service providers offering encrypted storage or messaging, on the other hand, likely would not be able to risk the fine.

      That is indeed a middle ground. It's also completely unworkable because of whom it targets: service providers like Facebook or Apple. These service providers are the people with lobbying power. They don't want to submit to this surveillance for two reasons: 1) it incurs unnecessary expenses on their part (the same reason all industries resist regulation of any kind), and 2) it will make their users angry, possibly angry enough to leave the platform. And ultimately, while conservatives should be concerned about reason #1, security buffs are very concerned about #2.

      It comes down to the exact same problem as digital piracy. Right now Facebook is like Napster: operating outside the rules (in this case the unwritten rules or else they'd have met the same fate). When Napster was shut down, users who left the platform didn't stop pirating. They moved to a decentralized platform that was harder to crack down on. Similarly, if Facebook were required to build in backdoors, users would leave in favor of a decentralized platform. This would similarly make it harder to enforce the rules, and much like the difference between DRM-based legal stores and torrents, the legitimate customers would end up with a product that is inferior to what the terrorists get. And the terrorists still win.

      In short, if you make encryption illegal, then only terrorists will have encryption.

      Which is how we get to where we currently are: the NSA gets secret powers because if they got what they needed by law, everyone would know what they are doing and the bad actors would work to prevent it. The NSA probably doesn't want Facebook to have legally required backdoors because that would actually make their existing tools - which rely on people using Facebook without really thinking about their security - less effective.

      What the NSA should really want is specifically to target individual users, not services. The realistic fear is that terrorists will create a real encrypted platform outside of US control. They wouldn't even need to solve hard decentralization problems to keep the platform safe from air strikes; they could use the already-available decentralization solution that the Pirate Bay uses to avoid being shut down everywhere: keep lots of mirrors. Your "devil's advocate" scheme will do nothing to help combat this situation and may even help bring it about.

      --
      If there isn't at least one reference or primary source, it's not +1 Informative. Maybe the underused +1 Interesting?
    • (Score: 2) by urza9814 on Friday January 27 2017, @01:22AM

      by urza9814 (3954) on Friday January 27 2017, @01:22AM (#459244) Journal

      Let's say you have a cryptographic system where there is an algorithm that uses a publicly available constant in the process. This algorithm would be framed so that a private constant can be used to reduce the complexity of cracking a key to something that could be accomplished with a very large supercomputer. Maybe new constants are published at regular intervals to reduce issues with leaks of the private constant and maybe encryption users could regularly decrypt and re-encrypt using the newer constant to prevent older encrypted data from becoming less secure over time.

      You've already got a problem. You can't re-encrypt to make the data more secure, because someone may already have a copy of the data that used the old encryption keys. Or you might just forget to re-encrypt an old copy that's sitting on your backup server. You can't assume that the criminal is only trying to break into live data. They'll take the data and sit on it for a few months or even years, and until they crack it, you may not even know it's been stolen (and even then you still might not know). In fact, *they already do this*. Thankfully, most encryption schemes are designed to last quite a few years, and hopefully the data they protect is useless by the time they can be cracked...but yeah, that's another scheme that just makes things easier for the criminals.

      Any encryption scheme designed to be broken is, well, broken.

  • (Score: 2) by Hyperturtle on Wednesday January 25 2017, @04:29PM

    by Hyperturtle (2824) on Wednesday January 25 2017, @04:29PM (#458519)

    I hope you are right that this can be done, that this can be effectively communicated, and that people are willing to try to understand.

    There is a reason that many tech companies are made up of individuals that are not on the fringes of the political spectrum. A lot of that has to do with the view of what change is and what it represents.

    Certainly, change is a constant. It would seem that some people will go to great lengths to avoid change, though, and willful ignorance (I am not saying stupidity--I am saying the refusal to accept something and carrying on as normal) is an unfortunate response by people that don't have a good alternative to the change.

    Often this makes things worse: policies that are not in the best interests of those resisting change become easier to implement precisely because of a refusal to understand the issues and a desire for nothing to happen. Changes are more easily introduced and pushed through when the populace is ignorant - willfully or otherwise.

    IT is an excellent arena where we can safely conclude that the populace is ignorant of the difficulties and concerns surrounding many concepts within it-- privacy, security -- and the merger of both, encryption-- but we can also conclude that the populace knows how to respond to fear. Consider one-issue voters. They are often compelled to vote out of fear that their one issue will no longer be decided in their favor.

    It is easier to tap into an emotion than it is to relate to an "expert", even worse when they are described as egghead geeks that are so far removed that they don't understand the 'realities'.

    I am hoping for the best, but expect that Big Brother Inside (the Clipper chip, if you recall - something the Clinton administration proposed back in the 1990s) will see its modern equivalent make a resurgence. This may result in a mandate... and the support will be drummed up through various described threats that have only one solution to maintain your safety - complete surrender of privacy in the digital realm in the name of security, because terrorist pedophiles want to take your rights while praying in a different religion as they compete for outsourcing contracts to replace your jobs.