
SoylentNews is people

posted by LaminatorX on Thursday May 29 2014, @04:03AM
from the Another-one-bites-the-dust dept.

The TrueCrypt website has changed: it now shows a big red warning stating "WARNING: Using TrueCrypt is not secure as it may contain unfixed security issues". It recommends using BitLocker for Windows 7/8, FileVault for OS X, or (whatever) for Linux. So, what happened? The TrueCrypt site says:

This page exists only to help migrate existing data encrypted by TrueCrypt. The development of TrueCrypt was ended in 5/2014 after Microsoft terminated support of Windows XP. Windows 8/7/Vista and later offer integrated support for encrypted disks and virtual disk images. Such integrated support is also available on other platforms (click here for more information). You should migrate any data encrypted by TrueCrypt to encrypted disks or virtual disk images supported on your platform.

Did the TrueCrypt devs (or SourceForge?) get an NSL? A "new" version (7.2) is on offer, but apparently the signing key has changed, and a source code diff seems to indicate that a lot of the functionality has been stripped out. What's up?
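The "source code diff" claim is something anyone can check independently with both release tarballs unpacked side by side. A minimal sketch of the comparison (the directories and file contents below are stand-ins for illustration, not the real 7.1a/7.2 trees):

```shell
# Stand-in trees; in practice you would unpack the real release tarballs here.
mkdir -p truecrypt-7.1a truecrypt-7.2
echo 'EncryptDataUnits(units);' > truecrypt-7.1a/Crypto.c
echo 'AbortProcess("insecure_app");' > truecrypt-7.2/Crypto.c

# -r recurses into subdirectories; --brief only names the files that differ.
diff -r --brief truecrypt-7.1a truecrypt-7.2 || true

# A full unified diff then shows exactly what was stripped out.
diff -ru truecrypt-7.1a truecrypt-7.2 || true
```

The `|| true` is only there because diff exits non-zero when the trees differ, which is exactly the interesting case here.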

 
This discussion has been archived. No new comments can be posted.
  • (Score: 1) by pmontra on Thursday May 29 2014, @10:54AM

    by pmontra (1175) on Thursday May 29 2014, @10:54AM (#48667)

    I didn't know TrueCrypt's developers hid their identities. Whatever their reason, I don't think I'd trust a critical piece of software written by someone in disguise, not even if I can read the source code and compile it myself. It's just too easy to slip some nasty piece of code in and have it go unnoticed (some such pieces are called bugs, and they can take years to find). Knowing the authors and their history is important for building trust. Lucky me, I never used TrueCrypt.

  • (Score: 2) by tibman on Thursday May 29 2014, @01:46PM

    by tibman (134) Subscriber Badge on Thursday May 29 2014, @01:46PM (#48728)

    I have always found that true identity is not required for trust. I completely agree with you on history, though. Knowing a true identity does allow you to find the person if they betray the trust, but the trust is still betrayed.

    --
    SN won't survive on lurkers alone. Write comments.
  • (Score: 2) by tangomargarine on Thursday May 29 2014, @02:54PM

    by tangomargarine (667) on Thursday May 29 2014, @02:54PM (#48764)

    Whatever their reason is I don't think I'd trust a critical piece of software written by someone under disguise, not even if I can read the source code and compile it myself.

    So basically, you're impossible to satisfy? Or would you rather we disassemble the end binary to look for compiler backdoors?

    --
    "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
    • (Score: 3) by tangomargarine on Thursday May 29 2014, @02:56PM

      by tangomargarine (667) on Thursday May 29 2014, @02:56PM (#48765)

      If you can compile it yourself using GCC, then either the public official build of GCC is backdoored (in which case we might as well give up anyway), or the program actually does what the source code says.

      You *do* know how programming works, don't you? The source code is *kind of* related to how the end result behaves.

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
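A middle ground between blind trust and disassembling binaries is a deterministic build: if compiling the audited source always yields byte-identical output, anyone can rebuild it and compare digests against the distributed binary. A sketch of the comparison step (the commented command and filenames are hypothetical; the runnable part just demonstrates that identical bytes hash identically):

```shell
# Hypothetical check against a distributed binary:
#   sha256sum my-build/truecrypt dist/truecrypt
# Demonstration with two stand-in "builds" of the same source:
printf 'same source, same toolchain' > build_a.bin
printf 'same source, same toolchain' > build_b.bin

# Matching digests mean the binary you run corresponds to the source you audited.
sha256sum build_a.bin build_b.bin
cmp -s build_a.bin build_b.bin && echo "builds match"
```

This sidesteps part of the compiler-trust problem: you still trust the toolchain, but you no longer have to trust whoever distributed the binary.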
    • (Score: 2, Interesting) by pmontra on Thursday May 29 2014, @03:29PM

      by pmontra (1175) on Thursday May 29 2014, @03:29PM (#48784)

      Auditing a non-trivial piece of software is difficult. Knowing the author gives some extra hints. Given the same code base, I bet we would look at TrueCrypt differently if it turned out that its author:

      A) Is a well-known cryptology researcher
      B) Works for a big company with rumored links to a three-letter agency
      C) Works for a three-letter agency

      It might not be rational (after all, the code is there for inspection), but wouldn't we?
      Would we perform the audit again if it turned out to be case B, just in case we missed something (we always do)? Again and again if it were C?

      • (Score: 2) by tangomargarine on Thursday May 29 2014, @03:53PM

        by tangomargarine (667) on Thursday May 29 2014, @03:53PM (#48795)

        They already did an audit. It was clean.

        If we knew who the devs were, it would be much easier for a TLA to get to them. If nobody knows who they are, it's a lot harder to find them and force them to do anything.

        Computer use is so fundamentally rooted in trust issues that nobody but a computing idiot savant can completely trust their own system. And even then, they'd have to write all their own software (including hardware drivers...) so there's plenty of room for just plain bugs.

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
        • (Score: 2) by isostatic on Thursday May 29 2014, @09:01PM

          by isostatic (365) on Thursday May 29 2014, @09:01PM (#48916) Journal

          They'd have to build their own hardware too, at least down to the chip level. Resistors are probably safe enough; you can test them easily.