posted by n1 on Sunday August 10 2014, @08:11PM   Printer-friendly
from the sellers-market dept.

Digital Era reports:

To increase the security of the internet and computers, the government should corner the market on zero-day vulnerabilities and exploits, offering top-dollar to force out all other buyers. At least, that's what Dan Geer thinks, and his opinion matters. Geer is chief information security officer at the CIA's venture capital arm In-Q-Tel, which invests in technologies that help the intelligence community.

Geer, an icon in the world of computer security, delivered his controversial stance during a keynote at the Black Hat security conference in Las Vegas today. His talk, entitled "Cybersecurity as Realpolitik," was provocative throughout, including advocating that software companies make their unsupported products open source to keep them secure. He even quoted the Code of Hammurabi (circa 1700 B.C.) while suggesting that product liability be applied to source code. "If a builder builds a house for someone, and does not construct it properly, and the house which he built falls in and kills its owner, then the builder shall be put to death," he said. While the death penalty may be a little severe for software makers who fail to adequately secure their products, criminal and civil liability isn't, he noted.

But the highlight of Geer's talk was definitely his suggestion that the U.S. Government own the zero-day market. Zero-day vulnerabilities are security holes in software that are yet unknown to software makers or to antivirus firms. They're unpatched and unprotected, leaving them open to exploit by spy agencies, criminal hackers, and others. Once the government purchases zero-days, he said, it should burn them by disclosing them.

Related Stories

PHK on Surveillance Which Is Too Cheap to Meter 3 comments

Developer Poul-Henning Kamp (PHK) has written a brief post in the July issue of Communications of the ACM about the cost of surveillance having become negligible. In many cases that surveillance is effectively required, either by large governments or by large corporations, so it is cheaper for programmers and developers to go with the flow and track people's online activities closely than to spend ever-increasing effort trying to avoid doing so.

During his keynote address, risk management specialist Dan Geer asked the 2014 Black Hat audience a question: "What if surveillance is too cheap to meter?"

As is the case with electricity from nuclear power, technology has little to do with it: This is a question about economy, specifically the economy of the path of least resistance.

Surveillance is ridiculously cheap for governments. Many have passed laws that obligate the surveillance industry—most notably, the mobile network operators—to share their take "at cost," and we know law enforcement uses it a lot.

So why is so much cheap surveillance available for purchase?

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by gringer on Sunday August 10 2014, @09:06PM

    by gringer (962) on Sunday August 10 2014, @09:06PM (#79771)

    If a builder builds a house for someone, and does not construct it properly, and the house which he built falls in and kills its owner, then the builder shall be put to death.

    I agree that people who are paid to write code for a specific single purpose should be liable for damages caused by the use of that code for its original purpose, but the more common use case (especially for free software) is that someone will create a program, and someone else will decide to use it.

    However, I can imagine that someone could be sued in the USA for writing code that someone else used for a completely different purpose and caused damage.

    --
    Ask me about Sequencing DNA in front of Linus Torvalds [youtube.com]
    • (Score: 2) by SlimmPickens on Sunday August 10 2014, @09:17PM

      by SlimmPickens (1056) on Sunday August 10 2014, @09:17PM (#79774)

      But what if you change the question a bit? Are there any scenarios where some additional liability is appropriate?

      • (Score: 2) by gringer on Monday August 11 2014, @01:18AM

        by gringer (962) on Monday August 11 2014, @01:18AM (#79863)

        Changing the question? What question?

        In the specific scenario that I've stated, liability is appropriate -- it's the closest that software creation gets to the "builder" example. Otherwise, the client is taking code that was possibly designed for something else, and I think that the liability should fall on the client (because it was their decision to use that specific code).

        If I were to hesitantly extend the building analogy, it would be like saying that a builder of an apartment block would be liable for damages caused by someone installing a 500-tonne polythene pool in the living room of the top storey (and assuming commercial software, charging people who used that pool):

        http://www.news.com.au/finance/real-estate/russian-teenagers-turn-apartment-into-swimming-pool/story-fncq3era-1227011394167 [news.com.au] [just in case you didn't think it was a realistic situation]

        FWIW, New Zealand is currently struggling through liability issues for "leaky buildings" -- bad building design a few decades ago is causing premature rotting and water damage. The local councils, government, builders, and original purchasers all have some role in causing the problem, and it's a fairly tricky problem sorting out who should pay who.

        Of course, if a software creator freely offers liability, then that's a different matter, and I think people should accept liability if they have previously said they will be liable.

        --
        Ask me about Sequencing DNA in front of Linus Torvalds [youtube.com]
  • (Score: 1) by doublerot13 on Sunday August 10 2014, @09:55PM

    by doublerot13 (4497) on Sunday August 10 2014, @09:55PM (#79785)

    Windows will automatically make people admins when they set up their computer. So it is trivial to find a crack in the browser/plugins and then own their box, because the browser is running as admin.

    Microsoft needs to force people to use very strong passwords for the Admin account [which shouldn't be called Administrator; the name should be user-chosen], and strongly suggest, or by default enforce, that people are not Admins after the initial setup.

    Browser makers should also create an unprivileged account on the machine for the browser to run under during install.

    Yes this means that you'll have to find and enter that long nasty password when you intentionally install software or change admin settings on the box. This is a good thing as it makes you at least aware that your actions are making important changes to the OS.
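    A script can at least make the current privilege level visible. The helper below is a minimal cross-platform sketch (the Windows branch relies on `shell32.IsUserAnAdmin`, the POSIX branch on the effective UID; neither catches every elevation mechanism):

```python
import os
import ctypes


def running_as_admin() -> bool:
    """Best-effort check: are we running with admin/root privileges?"""
    if hasattr(os, "geteuid"):
        # POSIX: effective UID 0 means root.
        return os.geteuid() == 0
    try:
        # Windows: ask the shell whether the current token is elevated.
        return bool(ctypes.windll.shell32.IsUserAnAdmin())
    except AttributeError:
        return False


if running_as_admin():
    print("Warning: running privileged -- don't browse the web like this.")
```

    A browser launcher could refuse to start, or drop privileges, whenever this returns True.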

    • (Score: 0) by Anonymous Coward on Sunday August 10 2014, @11:30PM

      by Anonymous Coward on Sunday August 10 2014, @11:30PM (#79829)

      No, when you create your account on Vista and above, it is a reduced-privilege user. The Administrator account is even specifically disabled.

      • (Score: 2, Interesting) by doublerot13 on Sunday August 10 2014, @11:43PM

        by doublerot13 (4497) on Sunday August 10 2014, @11:43PM (#79833)

        I just did a test install of Windows 7 in a VM, and it made my first user an Admin.

    • (Score: 1) by radu on Monday August 11 2014, @01:29PM

      by radu (1919) on Monday August 11 2014, @01:29PM (#80039)

      Windows will automatically make people admins when they set up their computer

      Some Linux distributions do that too (e.g. Debian asks for a root password during installation; it's not obvious that you can leave it blank). And Linux won't ask "are you sure?" if you "rm -rf /etc" as root. The only difference I see is that people who install Linux often have more knowledge about computers, so they don't tend to click on "you won, click here" every time.

      • (Score: 2) by HiThere on Monday August 11 2014, @06:45PM

        by HiThere (866) Subscriber Badge on Monday August 11 2014, @06:45PM (#80159) Journal

        FWIW, I never leave it blank. And I don't install sudo. (Well, I don't use Ubuntu...if I did I wouldn't fight the system, because I don't think it's *that* big a security hole.)

        Having a password for root, which by default can't log in as a user, is not a bad idea, and isn't at all the same as having the default user be root. Ubuntu's use of sudo is much closer to that, though even in Ubuntu the administrator can remove accounts from the sudoers list...including the first account. What I consider slightly scary is the systems that have "automatic logon to the default account". It's probably not as bad as it looks, but it looks pretty bad. Also, however, pretty convenient...which is probably why people sometimes set things up that way.

        P.S.: Even when using Ubuntu I put a password on the root account. This wasn't strictly necessary, as one can always enter "sudo su" into a terminal, and get a root shell. As long as I'm the only user of the machine, things work pretty well either way.

        P.P.S.: I don't know if it's still true, but a while back an MS Windows user who wasn't running as admin would find that a large number of normal programs would refuse to work properly. They needed admin privileges to run. This isn't strictly speaking entirely the fault of MS, but rather of programmers who didn't care about security and worked for a large number of different companies. (Some, admittedly, worked for MS, but not most of them.)

        --
        Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 4, Insightful) by Anonymous Coward on Sunday August 10 2014, @10:03PM

    by Anonymous Coward on Sunday August 10 2014, @10:03PM (#79789)

    ...deliberately code them.

    Either way, you get PAID! :D

    But seriously, this is either a patently bad idea or the Feds WILL
    do this then disclose the 'low hanging fruit' vulnerabilities while
    keeping for themselves the 'crown jewels' they can use in the next
    state-level cyberattack against their enemies....

    How could the general public be ABSOLUTELY SURE that ALL the 0-day
    exploits are disclosed by the Feds?

    • (Score: 3, Informative) by cafebabe on Sunday August 10 2014, @10:51PM

      by cafebabe (894) on Sunday August 10 2014, @10:51PM (#79822) Journal

      I believe this is called the Cobra Effect [freakonomics.com]. It doesn't work with snakes. It doesn't work with rats. And it probably won't work with bugs.

      --
      1702845791×2
      • (Score: 2) by cafebabe on Monday August 11 2014, @12:00AM

        by cafebabe (894) on Monday August 11 2014, @12:00AM (#79839) Journal

        After reading the Cobra Effect [freakonomics.com] in full, I have come to the conclusion that bounties work if and only if supply is finite. For example, putting a bounty on an individual. However, at this point, we have the crowdsourcing problem where zero or more parties work on a problem but zero or one parties are rewarded. Even this assumes that the party offering the reward is honorable enough to pay it.
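        The crowdsourcing problem can be put in expected-value terms: if n equally capable parties chase one winner-takes-all bounty, each party's expected payout is roughly bounty/n, so rational effort dries up once that falls below the cost of the work. A toy sketch (the dollar figures are illustrative assumptions, not from the talk):

```python
def expected_payout(bounty: float, competitors: int) -> float:
    """Expected reward per party when exactly one of `competitors`
    equally-likely parties collects the whole bounty."""
    return bounty / competitors


# A $100,000 bounty chased by 50 hunters is worth $2,000 each in
# expectation -- likely below the cost of weeks of skilled work.
print(expected_payout(100_000, 50))
```

        Which is why winner-takes-all bounties depend so heavily on an honorable payer and a small pool of hunters.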

        After reading the comments for the Cobra Effect [freakonomics.com], I found a quote which raises another issue [terrypratchettbooks.com] and is presumably based on the Hanoi rat problem:-

        Rats had featured largely in the history of Ankh-Morpork. Shortly before the Patrician came to power there was a terrible plague of rats. The city council countered it by offering twenty pence for every rat tail. This did, for a week or two, reduce the number of rats - and then people were suddenly queuing up with tails, the city treasury was being drained, and no-one seemed to be doing much work. And there still seemed to be a lot of rats around. Lord Vetinari had listened carefully while the problem was explained, and had solved the thing with one memorable phrase which said a lot about him, about the folly of bounty offers, and about the natural instinct of Ankh-Morporkians in any situation involving money: 'Tax the rat farms.'

        If we apply this principle to a government bug bounty, perhaps the government should tackle tax dodging before making a welfare program for software companies.

        --
        1702845791×2
        • (Score: 0) by Anonymous Coward on Monday August 11 2014, @12:47AM

          by Anonymous Coward on Monday August 11 2014, @12:47AM (#79852)

          In other words, a better approach would be to fine companies for every vulnerability found in the software they produce and for leaving networks unsecured. To make sure simple obfuscation and threats aren't used to hide vulnerabilities, "hackers" would be employed to regularly search for vulnerabilities and do penetration testing.

          • (Score: 2) by cafebabe on Monday August 11 2014, @01:40AM

            by cafebabe (894) on Monday August 11 2014, @01:40AM (#79866) Journal

            The best approach to software quality is to have a tight, closed loop in which the people who write the bugs are the people who fix the bugs, search for similar bugs and find methods to preclude bugs of similar classes from occurring in the future [fastcompany.com]. If that isn't possible, have an adversarial approach in which programmers try to prevent bugs and testers try to uncover bugs. If that isn't possible, have a consensual beta testing program with close partners. Having an open loop (where the bad guys find it first and disclose/deploy on their own terms) is awful and shouldn't be encouraged - even if there is an established market for such work.

            The knee-jerk reaction is to fine the developer in an attempt to force a closed loop of quality. However, that disadvantages software development which is not financially motivated. And any country which fines developers (if indeed that is possible) will find itself economically disadvantaged. The current head-in-sand approach to flawed deployments is only being corrected through market forces [soylentnews.org]. However, this could be suppressed for national security or somesuch bunkum [soylentnews.org]. So, penetration testing won't be a long-term solution, especially if it requires significant legwork.

            I can see the appeal of bug bounties. You only pay for results and it can be done discreetly. However, how many bugs are there? I believe there is one bug per 250 lines of deployed code. How much is an exploit worth? US$1,000? That means the US Government will underwrite code at the value of US$4 per line of code. Quibble about the figures, if you want, but that's going to be the effect. And if it wants to outbid a market which lacks exclusivity, that cost will increase rather than fall with volume.

            Whatever happens, total tax revenue from software development must cover these schemes otherwise it creates perverse incentives.
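            The arithmetic behind the US$4-per-line figure is straightforward to reproduce. A quick sketch (one bug per 250 lines and US$1,000 per exploit are the assumptions stated above, not measured values):

```python
BUGS_PER_LINE = 1 / 250       # assumed defect density of deployed code
PRICE_PER_EXPLOIT = 1_000.0   # assumed payout per purchased bug, USD


def underwriting_cost_per_line() -> float:
    """Implied cost to the buyer per line of deployed code underwritten."""
    return BUGS_PER_LINE * PRICE_PER_EXPLOIT


print(underwriting_cost_per_line())  # 4.0
```

            Scale either assumption up or down and the per-line figure moves proportionally; the structure of the argument is unchanged.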

            --
            1702845791×2
            • (Score: 2) by Magic Oddball on Monday August 11 2014, @09:36AM

              by Magic Oddball (3847) on Monday August 11 2014, @09:36AM (#79975) Journal

              Interesting article... What particularly struck me is that if the NASA team's more professional, "mature" approach were to become standard, it would actually be a matter of the industry going back to its roots.

              I wasn't around back then, but last winter I read a fascinating non-fiction book called "The Soul of a New Machine", set during the industry's shift from the old guard of mature professionals to the young hotshots. It chronicles the experiences of a brand-new team (people between the two "types") as they designed a minicomputer under the leadership of two 'old guard' men. If you're interested in the 'professionalism' approach as a potential solution, I highly recommend checking it out -- it's painfully obvious, reading it now, where a lot of the problems began. Here's the Wikipedia page [wikipedia.org] if you're remotely interested.

              • (Score: 2) by cafebabe on Tuesday August 12 2014, @11:30PM

                by cafebabe (894) on Tuesday August 12 2014, @11:30PM (#80652) Journal

                The Soul Of A New Machine by Tracy Kidder is a very good book. They made the Last Great Minicomputer and released it after the world moved on. Indeed, the end of the book reminded me of Ender's Game. Regardless, it is an example of doing things properly. For example, staff shrinkage was handled in a manner consistent with the Mythical Man-Month.

                I think it was a matter of scale that killed that line of development. The Data General Eclipse MV/8000 had current-hungry PLAs [wikipedia.org] and 88(?) bits of microcode ROM. Whereas, the microcomputer guys were working with specks of silicon that were barely over the break-even point.

                --
                1702845791×2
            • (Score: 2) by HiThere on Monday August 11 2014, @06:51PM

              by HiThere (866) Subscriber Badge on Monday August 11 2014, @06:51PM (#80164) Journal

              Your estimate is equating bugs with vulnerabilities. They aren't always the same. Most bugs just make things either stop working or produce the wrong result. (I think stop working is usually preferable, but it *does* depend on the application.)

              --
              Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
              • (Score: 2) by cafebabe on Tuesday August 12 2014, @11:37PM

                by cafebabe (894) on Tuesday August 12 2014, @11:37PM (#80656) Journal

                I was specifically taking into account that bugs and bug bounties follow power laws. Most bugs aren't a security problem. Some cannot be triggered reliably. And many occur in obscure programs. However, the bug bounty market pays accordingly. Even if my figures are wrong by a factor of 1,000, are you happy with the US Government underwriting bugs in software? And are you happy about this situation when companies writing bugs are tax dodging?
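                That skew is easy to illustrate with a simulation: draw per-bug payouts from a heavy-tailed Pareto distribution and the mean is dragged far above the median by a few high-value exploits, so an "average" bounty price says little about the typical bug. (The distribution and its shape parameter are illustrative assumptions, not market data.)

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# Pareto-distributed payouts: most bugs sit near the $100 floor,
# a rare few are worth orders of magnitude more.
payouts = [100 * random.paretovariate(1.2) for _ in range(10_000)]

print("median:", round(statistics.median(payouts)),
      "mean:", round(statistics.mean(payouts)))
```

                Any government pricing scheme keyed to the mean would wildly overpay for the bulk of low-value bugs.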

                --
                1702845791×2
                • (Score: 2) by HiThere on Wednesday August 13 2014, @06:08PM

                  by HiThere (866) Subscriber Badge on Wednesday August 13 2014, @06:08PM (#80906) Journal

                  OK. I agree that selling programs with known bugs is fraud unless the bugs are clearly acknowledged ahead of time, and that being paid to fix your own bugs is worse. I don't think it's tax evasion, however.

                  OTOH, it needs to be a civil tort, because it's impossible to avoid shipping software that contains bugs, except by not shipping software. So it shouldn't be criminal.

                  Perhaps shipping software that contains security risks, or not fixing them once they are detected, should be called "maintaining an attractive nuisance". (I'd be tempted to call it conspiracy to maintain an attractive nuisance, but that shoves it over into criminal law.) Or considering that we're talking about security risks rather than bugs now, perhaps it should be criminal.

                  But do be aware that often the people who write the code are unable to see the flaws in it until they are pointed out by someone else. So you don't want the people who wrote the code responsible for detecting the bugs.

                  --
                  Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
    • (Score: 0) by Anonymous Coward on Monday August 11 2014, @03:19AM

      by Anonymous Coward on Monday August 11 2014, @03:19AM (#79889)

      Indeed, this is a perverse incentive [wikipedia.org] waiting to happen. Just as the government bounties on rat tails in Hanoi led to people just farming the rats to get the bounties, this will probably lead to developers intentionally coding security bugs into their software and then selling these "bugs" to the Feds. I can't imagine Microsoft or any other software company being above these kinds of shenanigans.

    • (Score: 2) by mrider on Monday August 11 2014, @07:25PM

      by mrider (3252) on Monday August 11 2014, @07:25PM (#80178)

      I'm surprised nobody has pointed this out:

      O.B. Dilbert [dilbert.com]

      --

      Doctor: "Do you hear voices?"

      Me: "Only when my bluetooth is charged."

  • (Score: 2, Interesting) by Anonymous Coward on Sunday August 10 2014, @10:34PM

    by Anonymous Coward on Sunday August 10 2014, @10:34PM (#79808)

    If a group (any group) is known to be willing to buy all of something, at any price, you can get infinite money by selling your security vulnerability (set the price as high as you want).

    Aside from the obvious cost to buy them, this drives up the market value and increases the incentive to create such vulnerabilities: If I can get a guaranteed high-priced sale to the feds, I can just put holes in my own software (or that of my company) and sell them to get hella rich.

    So A: this costs infinite money to implement, and B: it provides an infinite incentive to make holes in software. It's impossible (too costly) and won't work.

    • (Score: 2) by kaszz on Sunday August 10 2014, @10:40PM

      by kaszz (4211) on Sunday August 10 2014, @10:40PM (#79815) Journal

      Must be why it's attached to the idea of liabilities. Which means lawyers get rich..

      #1 - Bad idea
      #2 - Bad idea
      #3 - The government will screw you after all this is implemented anyway
      #4 - The lawyers will ruin you soon after
      :D

    • (Score: 2) by c0lo on Sunday August 10 2014, @11:56PM

      by c0lo (156) Subscriber Badge on Sunday August 10 2014, @11:56PM (#79838) Journal

      If a group (any group) is known to be willing to buy all of something, at any price, you can get infinite money by selling your security vulnerability (set the price as high as you want).

      Just replace password with vulnerability [xkcd.com]

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 1) by pogostix on Monday August 11 2014, @05:02AM

        by pogostix (1696) on Monday August 11 2014, @05:02AM (#79920)

        I'm not a big fan of XKCD links even when they are relevant :P

        • (Score: 2) by c0lo on Monday August 11 2014, @06:22AM

          by c0lo (156) Subscriber Badge on Monday August 11 2014, @06:22AM (#79936) Journal

          I'm not a big fan of XKCD links even when they are relevant :P

          I'm not a fan of ballet either. But... I reckon there's no drama as long as the intended message gets through to your dialogue partner.

          --
          https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford