posted by mrpg on Tuesday February 26 2019, @09:49AM   Printer-friendly
from the ?¿?!!¡¡ dept.

Submitted via IRC for chromas

Surveillance firm asks Mozilla to be included in Firefox's certificate whitelist

[...] The vendor is named DarkMatter, a cyber-security firm based in the United Arab Emirates that has been known to sell surveillance and hacking services to oppressive regimes in the Middle East.

[...] On one side, Mozilla is being pressured by organizations like the Electronic Frontier Foundation, Amnesty International, and The Intercept to decline DarkMatter's request; on the other, DarkMatter claims it has never abused its TLS certificate issuance powers, and hence sees no reason to be treated any differently from other CAs that have applied in the past.

Fears and paranoia run high because Mozilla's list of trusted root certificates is also used by some Linux distros. Many worry that, once approved for Mozilla's certificate store, DarkMatter could issue TLS certificates capable of intercepting internet traffic without triggering any errors on those Linux systems, which are typically deployed in data centers and at cloud service providers.

In Google Groups and Bugzilla discussions on its request, DarkMatter has denied any wrongdoing or any intention to do so.

The company has already been granted the ability to issue TLS certificates via an intermediary, a company called QuoVadis, now owned by DigiCert.

Also at Electronic Frontier Foundation


Original Submission

Related Stories

Mozilla Fears DarkMatter 'Misuse' Of Browser For Hacking

Firefox browser-maker Mozilla is considering whether to block cybersecurity company DarkMatter from serving as one of its internet security gatekeepers after a Reuters report linked the United Arab Emirates-based firm to a cyber espionage program.

Reuters reported in January that DarkMatter provided staff for a secret hacking operation, codenamed Project Raven, on behalf of an Emirati intelligence agency. The unit was largely comprised of former U.S. intelligence officials who conducted offensive cyber operations for the UAE government.

Former Raven operatives told Reuters that many DarkMatter executives were unaware of the secretive program, which operated from a converted Abu Dhabi mansion away from DarkMatter’s headquarters.

Those operations included hacking into the internet accounts of human rights activists, journalists and officials from rival governments, Reuters found. DarkMatter has denied conducting the operations and says it focuses on protecting computer networks.

[...] DarkMatter has been pushing Mozilla for full authority to grant certifications since 2017, the browser maker told Reuters. That would take it to a new level, making it one of fewer than 60 core gatekeepers for the hundreds of millions of Firefox users around the world.

[Selena] Deckelmann said Mozilla is worried that DarkMatter could use the authority to issue certificates to hackers impersonating real websites, like banks.

As a certification authority, DarkMatter would be partially responsible for encryption between the websites it approves and their users.

In the wrong hands, the certification role could allow the interception of encrypted web traffic, security experts say.

In the past Mozilla has relied exclusively on technical issues when deciding whether to trust a company with certification authority.

The Reuters investigation has led it to reconsider its policy for approving applicants. “You look at the facts of the matter, the sources that came out, it’s a compelling case,” said Deckelmann.

Previously: Surveillance Firm Asks Mozilla to be Included in Firefox's Certificate Whitelist


Original Submission

  • (Score: 2) by NateMich on Tuesday February 26 2019, @11:28AM (4 children)

    by NateMich (6662) on Tuesday February 26 2019, @11:28AM (#806877)

    Just tell them no.
    Don't give a reason.

    • (Score: 2) by DannyB on Tuesday February 26 2019, @03:33PM (3 children)

      by DannyB (5839) Subscriber Badge on Tuesday February 26 2019, @03:33PM (#806968) Journal

      But they have never abused their super powers in any detectable way that you know of or ever will know of. So they have a clean track record and only intend good for all humanity.

      --
      Every performance optimization is a grate wait lifted from my shoulders.
      • (Score: 2) by captain normal on Tuesday February 26 2019, @06:25PM (2 children)

        by captain normal (2205) on Tuesday February 26 2019, @06:25PM (#807134)

        You forgot the "/s" :-))

        --
        Everyone is entitled to his own opinion, but not to his own facts. --Daniel Patrick Moynihan
        • (Score: 2) by DannyB on Tuesday February 26 2019, @06:38PM (1 child)

          by DannyB (5839) Subscriber Badge on Tuesday February 26 2019, @06:38PM (#807145) Journal

          Oh? I thought I had included it: and only intend good for all humanity

          --
          Every performance optimization is a grate wait lifted from my shoulders.
          • (Score: 2) by captain normal on Tuesday February 26 2019, @08:11PM

            by captain normal (2205) on Tuesday February 26 2019, @08:11PM (#807227)

            It was pretty obvious to me, but you know how some of our resident trolls can be.

            --
            Everyone is entitled to his own opinion, but not to his own facts. --Daniel Patrick Moynihan
  • (Score: 5, Interesting) by pTamok on Tuesday February 26 2019, @11:32AM (4 children)

    by pTamok (3042) on Tuesday February 26 2019, @11:32AM (#806878)

    I have never understood why it is not easier to edit the list of trusted root certificates* - personally, I would like to start with an empty list and add CA roots as the browser encounters new ones, via a dialogue that says something like: "New root certificate entry needed - allow permanently, allow once, deny permanently, or deny once?"

    This would allow me to research the CAs I'm trusting before using them, and by default deny any new ones.

    As it is, a large number of organisations I've never heard of ( see Mozilla's list here: https://wiki.mozilla.org/CA/Included_Certificates [mozilla.org] ) say that I should trust that data communicated with a site using one of their certificates is secure against third party eavesdropping whilst in transit.

    Since I'm not planning to transact business with sites secured by, say, Chunghwa Telecom or E-Tugra, it would be nice to be able to flag when I'm unexpectedly relying on them.

    *Mozilla bug is here: https://bugzilla.mozilla.org/show_bug.cgi?id=545498 [mozilla.org] "Provide Capabilities to Detect and Manage Root Certificate Inconsistencies", status is "RESOLVED WONTFIX". You can use the CLI utility 'certutil' to list non-default root certificate settings: https://superuser.com/questions/734208/is-there-a-program-to-check-the-list-of-root-cas-in-firefox [superuser.com] - it is not exactly a 'user friendly' process.
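
    For a rough sense of scale, here is a minimal Python sketch that enumerates the roots you are implicitly trusting. It reads whatever default trust bundle Python's ssl module sees on your system, which is not necessarily Firefox's own NSS store, and is only meant to show how long the list is:

      import ssl

      ctx = ssl.create_default_context()   # loads the platform's default CA bundle
      roots = ctx.get_ca_certs()           # one dict per loaded root certificate
      # Note: on systems that use a CA *directory* rather than a single bundle
      # file, this list may be incomplete or empty.
      print(len(roots), "trusted root certificates")
      for cert in roots:
          # each RDN is assumed to hold a single attribute, which is the common case
          subject = dict(rdn[0] for rdn in cert["subject"])
          print(subject.get("organizationName", "?"), "|", subject.get("commonName", "?"))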

    • (Score: 3, Insightful) by zocalo on Tuesday February 26 2019, @11:53AM (2 children)

      by zocalo (302) on Tuesday February 26 2019, @11:53AM (#806880)
      Probably because Mozilla is catering for regular users who just want their browser to work (and will switch to Chrome if it doesn't), probably won't understand what they are agreeing to, and may live in a region of the world where Chunghwa Telecom, E-Tugra, or whatever is far more prevalent as a CA - although they must have some nous, because they've presumably installed an alternative browser in the first place. It would be nice to have the option of default-deny combined with a "first time seen" confirmation prompt, but that hardly fits what appears to be the current design ethos of throwing in everything including the kitchen sink that most users don't want, while simultaneously removing or burying under "about:config" all the stuff that many users do seem to want. You can always edit the list and delete any unwanted CA entries, of course, but I suspect they'll come straight back on the next update.
      --
      UNIX? They're not even circumcised! Savages!
      • (Score: 3, Interesting) by hoeferbe on Tuesday February 26 2019, @01:31PM (1 child)

        by hoeferbe (4715) on Tuesday February 26 2019, @01:31PM (#806898)
        zocalo (302) [soylentnews.org] wrote [soylentnews.org]:

        You can always edit the list and delete any unwanted CA entries, of course, but I suspect they'll come straight back on the next update.

        This.  Long ago, I tried deleting all my CAs, intending to only accept server certificates that I verified through a secondary route.  Not only was it a tremendous hassle since various huge entities would have different certificates that expired at different times for their ephemeral cloud / content delivery network machines, but yes, a Firefox update undid all that work.

        I tried using Certificate Patrol [psyced.org] for much the same reason. Yet, even with its ability to whitelist by domain name, it was still too much of a pain because clustered sites use several inconsistent certificates.

        • (Score: 3, Interesting) by pTamok on Tuesday February 26 2019, @06:37PM

          by pTamok (3042) on Tuesday February 26 2019, @06:37PM (#807144)

          I used to use the Firefox Extension 'Perspectives' from CMU, but it ran into the sand. It was a nice idea - there is still a zombie website [perspectives-project.org], but little else to show for it.

          It checked whether the certificate* you got for a website was the same one other users got, by querying public notaries that kept track of the certificates seen from multiple different locations. A MITM attack that substituted a certificate which passed browser validation for the real one would be detected, because checking the public notary servers would reveal the discrepancy.
          More details here: "Perspectives: Improving SSH-style Host Authentication with Multi-Path Probing", Dan Wendlandt, David G. Andersen, Adrian Perrig, Carnegie Mellon University [cmu.edu] (pdf)

          It is something that I wish had been taken further.

          *Actually, server public keys.
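
          The core idea can be sketched in a few lines of Python: hash the certificate your own connection sees, ask independent vantage points what they see for the same host, and flag any disagreement. The notary URLs below are purely hypothetical placeholders; the real project ran dedicated notary servers that signed their responses.

            import hashlib
            import ssl
            import urllib.request

            def observed_fingerprint(host, port=443):
                """SHA-256 fingerprint of the certificate this client is served."""
                pem = ssl.get_server_certificate((host, port))
                return hashlib.sha256(ssl.PEM_cert_to_DER_cert(pem)).hexdigest()

            def notary_fingerprint(notary_url, host):
                """Ask a (hypothetical) notary which fingerprint it sees for host."""
                with urllib.request.urlopen(notary_url + "?host=" + host) as resp:
                    return resp.read().decode().strip()

            local = observed_fingerprint("example.com")
            notaries = ["https://notary1.example/query", "https://notary2.example/query"]
            agree = sum(1 for n in notaries if notary_fingerprint(n, "example.com") == local)
            print("local view:", local)
            print(agree, "of", len(notaries), "notaries agree; disagreement may indicate a MITM")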

    • (Score: 0) by Anonymous Coward on Tuesday February 26 2019, @06:54PM

      by Anonymous Coward on Tuesday February 26 2019, @06:54PM (#807163)

      In a sane CA system, sites would have certs signed by as many CAs as possible. If one CA shows the slightest sign of misbehavior, users blacklist them -- and it doesn't break the interwebs, because everybody worth a damn uses multiple CAs. This threat of immediate blacklisting, in turn, gives CAs some motivation to do their fucking jobs, to do them right, and to be seen to be doing them right. And it doesn't rely on all users having the same list of trusted CAs -- if you have 3 or 4 of the top 5 CAs, that's going to give you pretty much complete coverage.
      But our system is broken by design -- each site must choose only one CA. So blacklisting any CA instantly breaks every site that uses that CA, with no fallback. The inevitable result has been that browsers maintain a uniform list of CAs that are trusted, users are discouraged from changing this list, and that even once we have clear proof of a CA's untrustworthiness, there's much delay and hand-wringing before the major browsers will actually remove them from the list.

      It's also broken in another way, though at least there's been some effort to fix this one. Given one CA per site, you might expect some hierarchy or per-site policy defining which CA a given site may use. Otherwise, when one CA is compromised, it can sign certs to MITM any site in the world, not just the ones that use it. But no, every CA you trust at all is trusted globally. You can't say "I trust CA X for sites A, B, C, and D, and CA Y for sites E and F" -- even though site A has always used CA X, and seeing a cert for that site from CA Y would suggest something fishy, your only choices are to not trust CA Y at all (thus breaking sites E and F) or to trust it globally. There are multiple efforts to deal with this (DANE, HPKP, etc.), but none are universally adopted yet.
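
      As an illustration of what per-site trust could look like, here is a rough Python sketch that pins each host to the issuer organisation you expect and flags anything else. The host-to-issuer map is made up for the example, and browsers offer nothing like this natively:

        import socket
        import ssl

        EXPECTED_ISSUER = {              # illustrative per-site policy, not real data
            "example.com": "DigiCert Inc",
            "example.org": "Let's Encrypt",
        }

        def issuer_organization(host, port=443):
            """Return the organizationName of the issuer of the cert the host serves."""
            ctx = ssl.create_default_context()
            with socket.create_connection((host, port)) as sock:
                with ctx.wrap_socket(sock, server_hostname=host) as tls:
                    issuer = dict(rdn[0] for rdn in tls.getpeercert()["issuer"])
                    return issuer.get("organizationName", "")

        for host, expected in EXPECTED_ISSUER.items():
            actual = issuer_organization(host)
            status = "ok" if actual == expected else "UNEXPECTED ISSUER"
            print(host + ": issued under '" + actual + "' (" + status + ")")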

  • (Score: 4, Interesting) by bradley13 on Tuesday February 26 2019, @11:53AM (6 children)

    by bradley13 (3053) on Tuesday February 26 2019, @11:53AM (#806881) Homepage Journal

    There's an easy solution that should have been implemented years ago: tie CA privileges to specific domains or subdomains. The only organization that should be able to issue certificates for a top-level domain is the owner of that top-level domain, unless some CA can make an excellent case for an exception (and receives the approval of the TLD owner).

    So, sure, these guys can be a CA. And they can issue certificates for any domain where they can both (a) make a case that they should, *and* (b) get the approval of the owner.

    --
    Everyone is somebody else's weirdo.
    • (Score: 3, Interesting) by iamjacksusername on Tuesday February 26 2019, @01:10PM (1 child)

      by iamjacksusername (1479) on Tuesday February 26 2019, @01:10PM (#806891)

      From a technical standpoint, I absolutely agree. From a practical standpoint, it will never be implemented. The number of domains with misconfigured SPF and non-existent DKIM records is staggering, and email is something end users actually care about. From the support side, this will result in an endless stream of the same ticket: my browser doesn't work anymore.

      • (Score: 3, Informative) by zocalo on Tuesday February 26 2019, @02:51PM

        by zocalo (302) on Tuesday February 26 2019, @02:51PM (#806935)
        This would apply to the operators of gTLDs and ccTLDs, not to all the myriad owners of random "company.com" type domains, or whatever. I'd like to think that the former group at least know how to set up DNS properly, although realistically all they need to do is run some servers (albeit potentially quite beefy ones), delegate sub-domains, and optionally implement DNSSEC, so not exactly getting into the technical weeds of everything that DNS can do. E.g. if you are the operator of .com (Verisign) you could either opt to be the sole CA for all domains bought within the .com gTLD (ka-CHINGGG!) and/or delegate that out to one or more approved CAs, so they could (for instance) allow Thawte to issue certs for .com domains but deny Comodo the right to do so.

        A nice idea in a way since root CAs are meant (in theory at least) to be partly responsible for downstream CAs using their certs, and it might also encourage more TLD operators to run a cleaner ship (there's probably no helping some of the "random word" gTLD cesspits out there though), but potentially hugely anti-consumer as it's likely to see prices for certificates on some domains spike considerably with the removal of competition.
        --
        UNIX? They're not even circumcised! Savages!
    • (Score: 2, Insightful) by Anonymous Coward on Tuesday February 26 2019, @01:48PM

      by Anonymous Coward on Tuesday February 26 2019, @01:48PM (#806907)

      Is the SSLiverse a safe place? [media.ccc.de](video) Hint: No.

      And that was 8 years ago. It's no better now. 'Nuff said.

    • (Score: 0) by Anonymous Coward on Tuesday February 26 2019, @10:12PM (2 children)

      by Anonymous Coward on Tuesday February 26 2019, @10:12PM (#807294)

      There exists a standard which can be implemented by domain owners themselves to restrict which CAs have the ability to sign certificates for their domains. This standard is called Certificate Authority Authorization (CAA). https://blog.qualys.com/ssllabs/2017/03/13/caa-mandated-by-cabrowser-forum [qualys.com]
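
      For anyone curious what a domain actually publishes, here is a short sketch using the third-party dnspython package (2.x) to read a CAA policy; the domain and output are only examples, and note that it is the CA, not the browser, that is obliged to honour the record at issuance time:

        import dns.resolver  # pip install dnspython

        def caa_policy(domain):
            """Return the domain's CAA records as text, or [] if none are published."""
            try:
                return [str(r) for r in dns.resolver.resolve(domain, "CAA")]
            except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
                return []          # no CAA record published: any CA may issue

        for record in caa_policy("example.com") or ['(no CAA record published)']:
            print(record)          # e.g.  0 issue "digicert.com"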

      • (Score: 0) by Anonymous Coward on Tuesday February 26 2019, @11:06PM (1 child)

        by Anonymous Coward on Tuesday February 26 2019, @11:06PM (#807315)

        That "solution" depends on the assumption that the real problem is blameless CAs being tricked into signing bad certs. If a corrupt or incompetent CA wants to ignore CAA, there's nothing to prevent them or flag that cert as invalid.

        • (Score: 0) by Anonymous Coward on Tuesday February 26 2019, @11:18PM

          by Anonymous Coward on Tuesday February 26 2019, @11:18PM (#807323)

          True. It would be nice if the browser itself could enforce this behavior.

  • (Score: 3, Insightful) by pkrasimirov on Tuesday February 26 2019, @01:04PM

    by pkrasimirov (3358) Subscriber Badge on Tuesday February 26 2019, @01:04PM (#806889)

    If they need one of their own certs signed, they can go to letsencrypt.org. But maybe they need signed certs for domains they don't own (google.com, facebook.com, etc.). Yeah, me too, add my CA to everyone's trusted list as well. And I don't even have a bad public history yet!

  • (Score: 3, Insightful) by fustakrakich on Tuesday February 26 2019, @02:42PM

    by fustakrakich (6150) on Tuesday February 26 2019, @02:42PM (#806927) Journal

    On top of the system already having been hacked, this is just another reason to dump the whole thing. It is worthless. In fact it is so bad, it looks like it was designed more as a tracking and malware delivery system by whatever government than anything else.

    --
    Politics and criminals are the same thing.
  • (Score: -1, Troll) by Anonymous Coward on Tuesday February 26 2019, @11:19PM

    by Anonymous Coward on Tuesday February 26 2019, @11:19PM (#807326)

    No way am I letting a jihadi CA anywhere near my computer. Go take a trip to Mecca, goat fuckers.
