
posted by LaminatorX on Saturday March 01 2014, @06:30AM   Printer-friendly
from the applied-steganography dept.

AnonTechie writes:

"The Register is reporting on a new approach to discourage mining of your cloud-stored data. The so-called 'Melbourne Shuffle' should make it harder for cloud operators to mine or sniff your data.

Researchers from Microsoft, the University of California-Irvine, and Brown University have proposed a technology that should make it harder to derive value from data stored in the cloud. In a paper titled The Melbourne Shuffle: Improving Oblivious Storage in the Cloud, authors Olga Ohrimenko, Michael T. Goodrich, Roberto Tamassia and Eli Upfal kick things off with the statement that, 'One of the unmistakable recent trends in networked computation and distributed information management is that of cloud storage, whereby users outsource data to external servers that manage and provide access to their data.'

'Such services also introduce privacy concerns,' the authors write, '[because] it is likely that cloud storage providers will want to perform data mining on user data, and it is also possible that such data will be subject to government searches. Thus, there is a need for algorithmic solutions that preserve the desirable properties of cloud storage while also providing privacy protection for user data.'"

This discussion has been archived. No new comments can be posted.
  • (Score: 5, Insightful) by SMI on Saturday March 01 2014, @06:50AM

    by SMI (333) on Saturday March 01 2014, @06:50AM (#8996)

    Interesting. From the PDF attached to TFA:

    "Such solutions typically work by obfuscating a sequence of data accesses intended by a client by simulating it with the one that appears indistinguishable from a random sequence of data accesses. Often, such a simulation involves mixing the intended (real) accesses with a sequence of random "dummy" accesses. In addition, so as to never access the same address twice (which would reveal a correlation), such obscuring simulations also involve continually moving items around in the server's memory space."

    Sounds like it would serve to obfuscate, but at a very high cost. Why anyone with valuable or sensitive data would even consider using a cloud storage service is beyond me, but that's me, I guess.

    • (Score: 5, Informative) by frojack on Saturday March 01 2014, @07:12AM

      by frojack (1554) Subscriber Badge on Saturday March 01 2014, @07:12AM (#9005) Journal

      Seems overly complex.
      Just go look at Spideroak.
      They couldn't release your data with a warrant in their hands and a gun to their head.
      They never know your encryption key.

      Full disclosure : I'm a satisfied customer.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 3, Informative) by Anonymous Coward on Saturday March 01 2014, @10:38AM

        by Anonymous Coward on Saturday March 01 2014, @10:38AM (#9047)

        (Replying anonymously to preserve moderation)

        I modded you up because SpiderOak deserves more notice. Encryption by default is a good thing, and their service works well enough aside from a shoddy Android client. However, one thing should be clarified. You said:

        They couldn't release your data with a warrant in their hands and a gun to their head.
        They never know your encryption key.

        That's what they claim, but it's currently not verifiable. The client, which generates the key, still isn't fully open source and auditable, so we have to take their word for it. Still, the fact that they provide encryption at all is a step up compared to most, and it's created by the client on your PC, so it's possible their claims are accurate.

        With that said, if you're really paranoid about sensitive data, you should encrypt it independently before putting it on any cloud storage. Something like EncFS [wikipedia.org] is a good choice, since the encryption is per-file. Set up SpiderOak to sync the encrypted directory EncFS creates from the unencrypted files you make. As a bonus, you keep your local copy encrypted as well (unless you do the backups with the encfs --reverse option).
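        To make the encrypt-first workflow concrete, here's a toy Python sketch (mine, not EncFS; the cipher below is NOT real cryptography and only illustrates the per-file idea, so use EncFS, gocryptfs, or similar for real data): each file is encrypted on its own, and only the ciphertext goes into the directory your cloud client syncs.

```python
import hashlib
import hmac
import os

# Toy per-file encryption: HMAC-SHA256 in counter mode as a keystream,
# XORed with the plaintext. Illustrative only -- NOT real cryptography.

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive `length` pseudo-random bytes from the key and a per-file nonce."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt_file(key: bytes, plaintext: bytes) -> bytes:
    """Each file gets a fresh random nonce, stored alongside the ciphertext."""
    nonce = os.urandom(16)
    pad = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, pad))

def decrypt_file(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

key = os.urandom(32)
blob = encrypt_file(key, b"contents of tax-return.pdf")
# `blob` is what lands in the synced directory; the plaintext never does.
```

        The point is the workflow: the cloud client only ever sees `blob`-style ciphertext, so the provider has nothing useful to mine.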

      • (Score: 1) by Main Gauche on Saturday March 01 2014, @07:31PM

        by Main Gauche (2933) on Saturday March 01 2014, @07:31PM (#9200)

        "Just go look at Spideroak.
        They couldn't release your data with a warrant in their hands and a gun to their head."

        As long as you don't mind buggy software deleting files from your hard drive without notice. Or (less devastatingly) an extra copy of a file being added to your hard drive without notice, so when you thought you deleted all copies of it, you actually didn't.

        (Disclaimer: Dissatisfied customer of spideroak.)

        Here's the short story: A few years ago, a bug was discovered in which, if you were syncing a folder across two computers through SpiderOak (SO) and performed certain file copy/rename operations in a synced folder, SO would delete the file. This is quite a bug for backup software! After quite a wait, SO was finally fixed... until the next major release, when that bug snuck back in again!

        That was a year ago. Versions of this bug still remain. (If I right-click-copy a file within a synced folder, and rename the "filename-Copy.ext" file, SO eventually introduces a third copy of the file under the name "filename-Copy.ext".)

        I complained about this (with others) on the forums. I was there for the first iteration of this bug, and after a long wait saw the fix. I then complained when the bug came back, and obviously they haven't fixed it yet, if I can still recreate the bug.

        If you don't believe me, go to the User forums (right-click the SO icon in the system tray), and search "delete files" and read the horror stories. I think you might have to be a customer to see those forums, so I will copy one of the older posts below.

        Admittedly I have been too lazy to switch to another service, but this certainly changes the way I use their "backup" service. It's more like a sync service for me, and I run my own supplemental backups sporadically. I also have to be mindful of how I create file copies within folders. My advice to those who haven't already invested in setting up this software: find another product.

        ------------
        Just one of the posts from a year or two ago (not mine):
        ------------
        Test #2: renaming a file, then changing the name back

        Result: failure.
        Steps to reproduce (computers A and B)

                Create folder on A and add to backup

                Create folder on B and add to backup

                Create a sync between the 2 folders

                Create an empty file in A's folder called "initial_name"

                [SpiderOak syncs the initial_name file from A to B]

                On B, rename initial_name to "renamed"

                [SpiderOak syncs the rename action from B to A; A's file is now called "renamed"]

                On A, rename "renamed" back to "initial_name"

                [SpiderOak deletes "renamed" from B]

                [SpiderOak deletes "initial_name" from A]

        Expected behavior

        Both A and B should have a file on disk called "initial_name"
        Actual behavior

        The files "initial_name" and "renamed" are now both missing from disk -- they are in the deleted items, although no delete operation was ever issued.
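        For the curious, here is a toy Python model of one way such a rename/rename-back cycle can end in deletion (purely illustrative; I have no idea what SpiderOak's actual code does): if a sync engine propagates a rename as delete+create and keeps "tombstones" that suppress re-creating a recently deleted name, the steps above wipe the file from both replicas.

```python
# Hypothetical sync engine, NOT SpiderOak's real implementation:
# a rename becomes delete(old)+create(new), and a tombstone left by a
# deletion silently swallows any later create of the same name.

class Replica:
    def __init__(self, name):
        self.name = name
        self.files = set()
        self.tombstones = set()

    def apply(self, op):
        kind, fname = op
        if kind == "delete":
            self.files.discard(fname)
            self.tombstones.add(fname)   # remember the deletion
        elif kind == "create":
            if fname in self.tombstones:
                return                   # BUG: tombstone suppresses the create
            self.files.add(fname)

    def rename(self, old, new):
        # Apply the delete+create pair locally, and return it for the peer.
        ops = [("delete", old), ("create", new)]
        for op in ops:
            self.apply(op)
        return ops

a, b = Replica("A"), Replica("B")

# Create "initial_name" on A; it syncs to B.
a.apply(("create", "initial_name"))
b.apply(("create", "initial_name"))

# On B: rename initial_name -> renamed; the ops sync to A.
for op in b.rename("initial_name", "renamed"):
    a.apply(op)

# On A: rename back; the old tombstone for "initial_name" swallows the create.
for op in a.rename("renamed", "initial_name"):
    b.apply(op)

print(a.files, b.files)   # both empty: the file has vanished everywhere
```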

        • (Score: 2) by frojack on Saturday March 01 2014, @09:16PM

          by frojack (1554) Subscriber Badge on Saturday March 01 2014, @09:16PM (#9231) Journal

          I've seen this exact same thing in Dropbox, especially when one end is Linux.

          It's basically a timing race in the sync process.
          On Dropbox, if you waited a little while between each of the last three steps, it would get it right.

          I haven't tried this with SpiderOak because I don't use it for sync, just for backup, due to the ability to step back in time (see previous versions of the files I back up).

          --
          No, you are mistaken. I've always had this sig.
    • (Score: 4, Informative) by weilawei on Saturday March 01 2014, @07:14AM

      by weilawei (109) on Saturday March 01 2014, @07:14AM (#9006)
      I skimmed the first bit of the PDF. It looks to be very similar to steganography for remote requests (API calls). I'm guessing they assume that the cloud provider is trusted. One method might be that if there is a random number generator giving out random bits (1 or 0) and a 1 means "perform an access", the client will make an API call in that time slot if they have one available to make. If the client does not have an API call to make in that time slot, they will make a random one anyway. This has the property of obscuring which ones are real API calls, at least with respect to time alone. At the same time, an inner loop of API calls randomly shuffles the elements around within memory to avoid repeat accesses.

      Such solutions typically work by obfuscating a sequence of data accesses intended by a client by simulating it with the one that appears indistinguishable from a random sequence of data accesses. Often, such a simulation involves mixing the intended (real) accesses with a sequence of random "dummy" accesses

      This inner-loop process requires putting items in new locations that are independent of their old locations while hiding the correlations between the two.
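      To make my reading concrete (this is a sketch of the idea, not the paper's actual algorithm), a time-slotted client might look like this in Python: every slot issues exactly one access, real if one is pending and a dummy to a random block otherwise, and a position map is reshuffled so repeated reads of the same logical block land on different server addresses.

```python
import random

random.seed(0)                      # deterministic for the demo only
NUM_BLOCKS = 8
position = list(range(NUM_BLOCKS))  # logical block -> current server address

pending = [3, 1, 3]                 # the real reads the client wants to make
trace = []                          # what the server observes

for slot in range(6):
    if pending:
        logical = pending.pop(0)                 # real access this slot
    else:
        logical = random.randrange(NUM_BLOCKS)   # dummy access this slot
    trace.append(position[logical])              # server sees only an address
    # Re-map the accessed block by swapping it with a random one, so the
    # same logical block does not keep hitting the same server location.
    other = random.randrange(NUM_BLOCKS)
    position[logical], position[other] = position[other], position[logical]

print(trace)   # to the server, just a stream of addresses, one per slot
```

      Note the repeated read of block 3 hides among the dummies, since the server can't tell which slots carried real requests.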

      Personally, I think that this sort of need would be better served by host-proof computing implemented with fully homomorphic encryption [wikipedia.org] (giving you logic circuits which can operate on encrypted data). You might still need some obfuscation for the timing (see the first example).
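      As a toy illustration of the homomorphic idea (a real FHE scheme is enormously more involved): a one-time pad is already XOR-homomorphic, so the server can combine two ciphertexts with XOR without ever seeing a key, and the client decrypts the combined result.

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

k1, k2 = os.urandom(4), os.urandom(4)            # keys stay on the client
a, b = bytes([1, 2, 3, 4]), bytes([5, 6, 7, 8])  # plaintext values

ct1, ct2 = xor(a, k1), xor(b, k2)    # what the server stores
combined = xor(ct1, ct2)             # server computes on ciphertexts only
result = xor(combined, xor(k1, k2))  # client strips the combined pad

assert result == xor(a, b)           # server never saw a, b, k1, or k2
```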

  • (Score: 2, Interesting) by Anonymous Coward on Saturday March 01 2014, @08:05AM

    by Anonymous Coward on Saturday March 01 2014, @08:05AM (#9017)

    I'm a Melbournian. Where'd the name come from? Doesn't seem related to this [wikipedia.org], that's for sure!

  • (Score: 2) by Lagg on Saturday March 01 2014, @09:34PM

    by Lagg (105) on Saturday March 01 2014, @09:34PM (#9236) Homepage Journal

    Use a backup service that doesn't plaster "TEH CLOUD" all over its site trying to distract you from practical concerns like privacy and treat you like an idiot. Spideroak [spideroak.com] is one such service. These sorts of articles kind of assume that spying is a normal thing and that instead of just voting with your wallet you should do convoluted things like this. There are backup hosts out there that make such tactics redundant and they're becoming increasingly common and popular.

    Disclaimer: I got 50GB from them because they liked my testimonial enough to put it on their site. Otherwise I am unaffiliated. They are the only "cloud" backup hosts I trust, not that I have to trust them since client side encryption is used.

    --
    http://lagg.me [lagg.me] 🗿
    • (Score: 2) by everdred on Monday March 03 2014, @06:53PM

      by everdred (110) Subscriber Badge on Monday March 03 2014, @06:53PM (#10156) Homepage Journal

      > not that I have to trust them since client side encryption is used

      Of course you still have to trust them. Is their client open source?

      • (Score: 2) by Lagg on Monday March 03 2014, @10:18PM

        by Lagg (105) on Monday March 03 2014, @10:18PM (#10284) Homepage Journal
        I think the Android code is released, and they claim to be working on open-sourcing everything from the client to the server. But I do not have to trust them in this instance, because one can fairly easily inspect the packets, the key, and the encrypted data.
        --
        http://lagg.me [lagg.me] 🗿