posted by LaminatorX on Saturday March 01 2014, @06:30AM   Printer-friendly
from the applied-steganography dept.

AnonTechie writes:

"The Register is reporting on a new approach to discourage mining of your cloud-stored data. The so-called 'Melbourne Shuffle' should make it harder for cloud operators to mine or sniff your data.

Researchers from Microsoft, the University of California-Irvine, and Brown University have proposed a technology that should make it harder to derive value from data stored in the cloud. In a paper titled The Melbourne Shuffle: Improving Oblivious Storage in the Cloud, authors Olga Ohrimenko, Michael T. Goodrich, Roberto Tamassia and Eli Upfal kick things off with the statement that, 'One of the unmistakable recent trends in networked computation and distributed information management is that of cloud storage, whereby users outsource data to external servers that manage and provide access to their data.'

'Such services also introduce privacy concerns,' the authors write, '[because] it is likely that cloud storage providers will want to perform data mining on user data, and it is also possible that such data will be subject to government searches. Thus, there is a need for algorithmic solutions that preserve the desirable properties of cloud storage while also providing privacy protection for user data.'"
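To make the goal concrete: an oblivious shuffle periodically re-encrypts every stored block with fresh randomness and writes it back under a new random permutation, so the provider cannot link old positions to new ones or learn which items a later access touches. The Python sketch below is only a naive illustration of that re-encrypt-and-permute step, not the paper's algorithm (the point of the Melbourne Shuffle, as the title suggests, is to perform the shuffle obliviously, without the client's own access pattern leaking information); the Server class, shuffle_pass function, and toy hash-based keystream are all invented for the example and are not real cryptography.

import hashlib
import os
import random

def keystream(key, nonce, length):
    # Toy keystream from repeated hashing -- illustration only, not vetted crypto.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key, plaintext):
    # A fresh nonce each time, so re-encrypting the same block yields a new ciphertext.
    nonce = os.urandom(16)
    body = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + body

def decrypt(key, blob):
    nonce, body = blob[:16], blob[16:]
    return bytes(c ^ k for c, k in zip(body, keystream(key, nonce, len(body))))

class Server:
    # Stand-in for the cloud provider: it only ever sees opaque blobs in numbered slots.
    def __init__(self):
        self.slots = {}

def shuffle_pass(server, key, position_map):
    # Download every block, re-encrypt it, and store it back under a fresh random
    # permutation, so the provider cannot correlate old slots with new ones.
    blocks = {name: decrypt(key, server.slots[slot]) for name, slot in position_map.items()}
    new_slots = list(range(len(blocks)))
    random.shuffle(new_slots)
    server.slots = {}
    new_map = {}
    for (name, data), slot in zip(blocks.items(), new_slots):
        server.slots[slot] = encrypt(key, data)
        new_map[name] = slot
    return new_map

# Example: store two blocks, shuffle, and read one back.
key = os.urandom(32)
server = Server()
position_map = {}
for slot, (name, data) in enumerate([("doc1", b"hello"), ("doc2", b"world")]):
    server.slots[slot] = encrypt(key, data)
    position_map[name] = slot
position_map = shuffle_pass(server, key, position_map)
assert decrypt(key, server.slots[position_map["doc1"]]) == b"hello"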

 
  • (Score: 1) by Main Gauche (2933) on Saturday March 01 2014, @07:31PM (#9200)

    "Just go look at Spideroak.
    They couldn't release your data with a warrant in their hands and a gun to their head."

    As long as you don't mind buggy software deleting files from your hard drive without notice. Or (less devastatingly) an extra copy of a file being added to your hard drive without notice, so when you thought you deleted all copies of it, you actually didn't.

    (Disclaimer: Dissatisfied customer of spideroak.)

    Here's the short story: A few years ago, a bug was discovered in which, if you were syncing a folder across two computers through SpiderOak (SO) and performed certain file copy/rename operations in that folder, SO would delete the file. That's quite a bug for backup software! After quite a wait, the bug was finally fixed... until the next major release, when it snuck back in again!

    That was a year ago, and versions of this bug still remain. (If I right-click-copy a file within a synced folder and rename the resulting "filename-Copy.ext" file, SO eventually reintroduces a third copy of the file under the old "filename-Copy.ext" name.)

    I complained about this (with others) on the forums. I was there for the first iteration of this bug and, after a long wait, saw the fix. I complained again when the bug came back, and they evidently haven't fixed it yet, since I can still reproduce it.

    If you don't believe me, go to the user forums (right-click the SO icon in the system tray), search "delete files", and read the horror stories. I think you might have to be a customer to see those forums, so I will copy one of the older posts below.

    Admittedly I have been too lazy to switch to another service, but this certainly changes the way I use their "backup" service. It's more like a sync service for me, and I run my own supplemental backups sporadically. I also have to be mindful of how I create file copies within folders. My advice to those who haven't already invested in setting up this software: find another product.

    ------------
    Just one of the posts from a year or two ago (not mine):
    ------------
    Test #2: renaming a file, then changing the name back

    Result: failure.
    Steps to reproduce (computers A and B)

            Create folder on A and add to backup

            Create folder on B and add to backup

            Create a sync between the 2 folders

            Create an empty file in A's folder called "initial_name"

            [SpiderOak syncs the initial_name file from A to B]

            On B, rename initial_name to "renamed"

            [SpiderOak syncs the rename action from B to A; A's file is now called "renamed"]

            On A, rename "renamed" back to "initial_name"

            [SpiderOak deletes "renamed" from B]

            [SpiderOak deletes "initial_name" from A]

    Expected behavior

    Both A and B should have a file on disk called "initial_name"

    Actual behavior

    The files "initial_name" and "renamed" are now both missing from disk -- they are in the deleted items, although no delete operation was ever issued.

  • (Score: 2) by frojack (1554) on Saturday March 01 2014, @09:16PM (#9231) Journal

    I've seen this exact same thing in Dropbox, especially when one end is Linux.

    It's basically a race condition in the sync process.
    On Dropbox, if you waited a little while between each of the last three steps, it would get it right.

    I haven't tried this with SpiderOak because I don't use it for sync, just for backup, for the ability to step back in time and see previous versions of the files I back up.

    --
    No, you are mistaken. I've always had this sig.
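To illustrate why timing between those last steps could matter, here is a deliberately artificial model in Python. Everything in it is invented for the example (the last-writer-wins server, the delete tombstones, the duplicated delete event); it is not a description of how SpiderOak or Dropbox actually work. It only shows one way a rename that is propagated as delete-plus-add can, when a stale delete arrives after the rename-back, wipe out both names, which is the end state the quoted post reports.

server = {}   # name -> ("exists" | "deleted", sequence number); "deleted" entries are tombstones
seq = 0

def record(name, state):
    # The server keeps whichever event it received last for each name.
    global seq
    seq += 1
    server[name] = (state, seq)

def client_rename(old, new):
    # Hypothetical client behaviour: a rename is reported as delete(old) + add(new).
    record(old, "deleted")
    record(new, "exists")

client_rename("initial_name", "renamed")    # step 1, on B: initial_name -> renamed
client_rename("renamed", "initial_name")    # step 2, on A: rename it straight back

# Step 3: B's client re-sends its earlier delete(initial_name) -- say, a retry that
# was still in flight because the user did not wait between steps. Ordered by
# arrival time, the stale delete now wins over A's re-created file.
record("initial_name", "deleted")

live = [name for name, (state, _) in server.items() if state == "exists"]
print(live)   # [] -- both "initial_name" and "renamed" are gone, as in the report

In this toy model, waiting between the steps lets any stale or retried delete land before the rename-back, where it is then superseded, which matches frojack's observation that spacing out the steps avoids the problem.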