In May, Google made international headlines when it announced that it was going to offer free, unlimited storage for photos and videos. If you read Google's press release, you'll see that the free storage plan limits images to 16 megapixels and videos to 1080p resolution. But if digital images are simply collections of binary data, and if all other files on your computer are also just collections of binary data, then isn't unlimited photo storage simply unlimited storage?
If only something existed that made this easy to do; you know, something that could bitmap all the things....
[ Ed's Comment: This link points to the author's own personal software solution, but I'm sure that others will come up with alternative ideas.]
(Score: -1, Flamebait) by Anonymous Coward on Sunday June 28 2015, @01:07AM
Once you go ALL BLACK BITS, you never go back to white bitmap.
(Score: 2) by kaszz on Sunday June 28 2015, @11:50AM
We will go back to boob bitmaps, they are so much better than black bitmaps :p
(Score: 2, Insightful) by Anonymous Coward on Sunday June 28 2015, @01:16AM
"International headlines" my ass. Nobody heard of the Google PR, and only idiots would rely on a hack on an ephemeral Google service to preserve your data, much less a large amount of it.
(Score: -1, Redundant) by Anonymous Coward on Sunday June 28 2015, @01:19AM
Clearly you never used usenet.
(Score: 0, Disagree) by Anonymous Coward on Sunday June 28 2015, @01:21AM
STFU, Tyler.
(Score: 1) by tripstah on Sunday June 28 2015, @04:43AM
Hi, actually I'm Tyler; nice to meet you. And you are (something other than a coward perhaps)?
(Score: 0) by Anonymous Coward on Monday June 29 2015, @01:09AM
You are not Tyler Durden.
GP was referring to the first rule of Usenet, same as the first rule of Fight Club.
(Score: 0) by Anonymous Coward on Sunday June 28 2015, @02:39AM
You do not remember other past storage hacks:
Audio cassettes: storing data as tones.
Video cassettes: storing data as pictures.
Steganography: storing data in the background "noise" of a picture.
Just one more in a line of great hacks.
(Score: 0) by Anonymous Coward on Sunday June 28 2015, @02:51AM
Great hack my ass. You are comparing this bullshit to those?
(Score: -1, Offtopic) by Anonymous Coward on Sunday June 28 2015, @01:23AM
999999999999 movies in the Cloud
Take one Down, stream it around
999999999999 movies in the Cloud
DMCA, DMCA, DMCA all day
still 999999999999 movies in the Cloud
(Score: 2) by Snotnose on Sunday June 28 2015, @01:24AM
Takes about 3 tar files renamed to jpg or whatever and I've backed up all the stuff I care about.
/ don't take many pictures
// nor videos
/// those 3 tar files have some 30 years of computing in them
//// minus MP3s, pictures, and video of course.
I came. I saw. I forgot why I came.
(Score: 2) by kaszz on Sunday June 28 2015, @11:57AM
How many bytes are those .tar files?
(Score: 0) by Anonymous Coward on Sunday June 28 2015, @12:50PM
I imagine they will validate the images by decoding them in memory before accepting them.
(Score: 3, Interesting) by khchung on Sunday June 28 2015, @01:26AM
Obviously, it is physically impossible for Google to offer real unlimited storage for any fixed price, much less free.
So, it is just like the "unlimited internet" plans from ISPs. The real question is, when would Google throttle your upload? Or tell you that you have violated their TOS by storing "too much"?
(Score: 2, Insightful) by Anonymous Coward on Sunday June 28 2015, @02:33AM
They'll just transcode your "images"
(Score: 2) by maxwell demon on Sunday June 28 2015, @08:33AM
Good point. Applying some mild lossy compression may not visibly alter most images, but it will completely destroy data "images".
Indeed, with so many images, they may also find ways to use efficient "cross-image" compression, by finding similar images and storing only the difference to them, using lossy compression.
The Tao of math: The numbers you can count are not the real numbers.
(Score: 2) by WizardFusion on Tuesday June 30 2015, @10:06AM
just like the "unlimited internet" plans from ISPs
Maybe in the US and other third world countries, but here in the UK, I have truly unlimited internet.
I download on average just over 300GB a month. My ISP doesn't care.
I also have unlimited data on my mobile phone tariff. I could browse and download all day every day over fast, reliable 4G, and my mobile provider doesn't care. They even state this on their website.
(Score: -1, Flamebait) by Anonymous Coward on Sunday June 28 2015, @01:33AM
Only scummy assholes pull hacks like this. Got tired of shoving random shit into URL shorteners, did you, you motherfucking asshole cunt?
(Score: 3, Insightful) by tangomargarine on Sunday June 28 2015, @02:24AM
I'm sure Google won't mind this at all.
"Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
(Score: 0) by Anonymous Coward on Sunday June 28 2015, @02:30AM
I forget where I saw this, but hiding a file in a photo is detectable unless it's encrypted. I think it was something about using Google Drive to hide files.
(Score: 2, Insightful) by Anonymous Coward on Sunday June 28 2015, @02:56AM
The real problem isn't that a steganographically hidden file might be detectable, but rather that online services often take it upon themselves to downsample or compress the image. Upload a glorious photograph to Facegooginstatwit and come back a day later to find a shabby, artifact-laden, compressed, resized image in its place.
Now imagine that image held your data.
You would lose your entire archive or FUSE filesystem or whatever.
(Score: 4, Interesting) by MrGuy on Sunday June 28 2015, @02:45AM
About 15 years ago, when e-mail providers were providing accounts to anyone who'd bother to ask, with more storage space than your average hard drive, I remember a friend of mine setting up what amounted to a file-system front-end backed by e-mail accounts. He'd just break a file up into chunks small enough to stay under attachment size limits, and e-mail them to a few accounts. New version of a file? Just store the new one and deprecate the old pointers. Sure it was slow as molasses, and involved a lot of text being sent in the clear, but it was effectively a cloud-hosted backup of all his important files, before cloud hosting became a "thing." And I'm not even giving my friend a lot of credit here - sure, there was some cleverness in the specifics of his implementation, but he's hardly the only one who had the idea to use e-mail that way. And I'm sure there's older tech than e-mail accounts that offered similar opportunities.
Using a great big storage device meant for one thing to store other things isn't a new idea. 1's and 0's.
(Score: 4, Interesting) by Popeidol on Sunday June 28 2015, @03:44AM
A closely related example is when Gmail launched with a whopping 1GB of storage space. At the time Hotmail offered about 6MB (from memory), and the most generous free hosting you could find was around 100MB.
It did not take long before somebody wrote a program to mount your Gmail account as a drive for file storage [viksoe.dk] (with a guide for using it here [engadget.com]). A surprising number of people used it until services like Dropbox filled the same niche.
(Score: 2) by Geotti on Sunday June 28 2015, @03:48AM
but it was effectively a cloud-hosted backup of all his important files, before cloud hosting became a "thing."
FTFY.
"The Cloud" (or Butt, if you prefer) is essentially just elastic grid computing [wikipedia.org]. The mail server you referred to, however, was most probably either a single machine or a small cluster [wikipedia.org].
(Just nitpicking here, but these concepts are often confused and this is Soylent, so here we go.)
(Score: 2) by Geotti on Sunday June 28 2015, @03:51AM
Pardon me please, I should have pressed preview.
There was supposed to be a link to explain Elasticity [wikipedia.org] in cloud computing.
(Score: 0) by Anonymous Coward on Monday June 29 2015, @04:16PM
I've now read the original and the "fixed" text several times, and I am honestly unable to find any difference.
(Score: 2) by Geotti on Monday June 29 2015, @07:58PM
Was supposed to be:
but it was effectively a cloud-hosted backup of all his important files, before cloud hosting became a "thing."
Dunno what was the matter with me on that day ;)
(Score: 2) by isostatic on Sunday June 28 2015, @09:07AM
15 years ago?
On April 1 2004, only 11 years and 3 months ago, Gmail launched. At the time, Yahoo, Hotmail etc. offered 10 or 20MB. That wasn't exactly a lot in 1994, let alone 2004. Google announced they would give everyone 1GB for free. It was on the front page of the Evening Standard, and I remember laughing at the office about how they'd been fooled by such an April Fools' joke.
A few days later, things like gmail drive (http://www.viksoe.dk/code/gmail.htm) and fuse plugins (http://www.jacobsen.no/anders/blog/archives/2004/09/01/google_gmail_as_a_linux_file_system.html) appeared.
(Score: 3, Interesting) by MrGuy on Sunday June 28 2015, @02:53AM
It seems like a relatively straightforward thing to losslessly decompose a 4k (2160p) video into four 1080p videos: simply divide the original image up into 2x2 pixel blocks, then take the upper-right pixel of every block as video A, the upper-left of each block as video B, etc. A similar approach can decompose an 8k image into 16 1080p videos. All the resulting videos would be playable (and reasonable "downsamples" of the original), but could also be recombined perfectly into the original higher-res video.
I know a compressed video isn't a collection of full images, but I don't see why the same algorithm wouldn't work equally well on an I or B frame. Someone with more video chops can correct me here.
In other words, I don't see why the resolution limit actually changes anything.
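[ Ed's Comment: MrGuy's 2x2-block idea can be sketched in a few lines of Python. This is an illustrative toy operating on raw pixel arrays, not on encoded video; the `demux`/`remux` names are mine. ]

```python
# Illustrative toy of the 2x2-block split: frames are plain lists of rows
# of pixel values, not real video frames.

def demux(frame):
    """Split a frame (even width/height) into 4 quarter-resolution
    sub-frames, one per position inside each 2x2 pixel block."""
    subs = []
    for dy in (0, 1):
        for dx in (0, 1):
            subs.append([row[dx::2] for row in frame[dy::2]])
    return subs

def remux(subs):
    """Losslessly reassemble the original frame from the 4 sub-frames."""
    height, width = 2 * len(subs[0]), 2 * len(subs[0][0])
    frame = [[None] * width for _ in range(height)]
    i = 0
    for dy in (0, 1):
        for dx in (0, 1):
            for y, row in enumerate(subs[i]):
                for x, pixel in enumerate(row):
                    frame[2 * y + dy][2 * x + dx] = pixel
            i += 1
    return frame

# A 4x4 toy "frame" round-trips exactly:
original = [[r * 4 + c for c in range(4)] for r in range(4)]
assert remux(demux(original)) == original
```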
(Score: 3, Informative) by FatPhil on Sunday June 28 2015, @12:16PM
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 0) by Anonymous Coward on Monday June 29 2015, @01:11AM
Bingo!
You are correct sir!
(Score: 3, Insightful) by looorg on Sunday June 28 2015, @03:13AM
Shouldn't this really be filed under "what could possibly go wrong"? Yes, give all your images and videos to kindly uncle Google. They will never ever use them for anything but store them for you, for free!
I assume just renaming the files .jpeg won't work, will it? Alternatively, I guess one could create a 1x1-pixel image to get all the headers and then just attach the rest of your data after the end of the file, OR you might want to have an ordinary image and just inject all your data into the comment field, if the format supports that. There should be quite a few different possibilities. I'm not really sure which version or what this program does (somewhat too lazy to try it out), but it seems from the screenshot that he at least offers RAR compression with password protection.
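[ Ed's Comment: the "attach data after the end of the file" idea can be sketched like so. The cover bytes below are a stand-in, not a real JPEG, and the recovery step is deliberately naive. ]

```python
# A JPEG decoder stops at the end-of-image marker (FF D9), so bytes
# appended after it survive inside a file that still opens as an image.

EOI = b"\xff\xd9"  # JPEG end-of-image marker

def hide(cover_jpeg: bytes, payload: bytes) -> bytes:
    assert cover_jpeg.endswith(EOI), "cover must be a complete JPEG"
    return cover_jpeg + payload

def recover(stego: bytes) -> bytes:
    # Naive: split after the first EOI marker. A real JPEG with an
    # embedded thumbnail (which has its own EOI) would need real parsing.
    return stego[stego.index(EOI) + len(EOI):]

cover = b"\xff\xd8" + b"stand-in image data" + EOI  # not a real JPEG
assert recover(hide(cover, b"secret archive bytes")) == b"secret archive bytes"
```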
If you want free backup storage from Google for all file types, you can still just tarball (or whatever compression you like) your files and email them to your Gmail account. "Free" storage forever, as long as you adhere to the size limitations of attachments. I seriously suggest encrypting all your files before sending them.
(Score: 0) by Anonymous Coward on Monday June 29 2015, @01:13AM
JPEG files are just groups of bitstreams. You could store your data in a non-image bitstream, not unlike the way EXIF streams contain things like exposure, GPS coordinates, etc.
(Score: 2, Funny) by dingus on Sunday June 28 2015, @03:50AM
Someone should set up their botnet to send in randomly generated 16-megapixel photos until they run out of space.
Not encouraging a crime here, but I mean if you're a criminal with a botnet hanging around... go for it.
(Score: 5, Informative) by tripstah on Sunday June 28 2015, @04:19AM
Is this a great idea for backing up important files? No, of course not. I mean you could, but Google's storage is $120/yr and has a working API. Amazon's is even less.
Is this the next great file sharing tool? No, I think not either. I accidentally wrote one of those about 10 years ago (it was still open and called Azureus when I created it), but I don't see that happening again here as this is easily controllable by Google.
Will Google shut this down? Who knows, but they're aware of it and the fix is simple: stop allowing BMP files or transcode them (which they do for other formats or images over 16 megapixels).
As for the technical details, for anyone interested in the "hack," all this does is loop the files through RAR to split them, then place a 54-byte BMP header before each 64,000,000-byte chunk (i.e. 16 megapixels in RGBA BMP format), which makes it a valid BMP file. The client has a lot of extra features, but that's the general gist. It's explained here [linkedin.com] if anyone is interested (kudos to the 0s / 1s comments -- exactly on point).
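[ Ed's Comment: a rough reconstruction of the header-prepending step trip describes. The field values follow the standard BITMAPFILEHEADER/BITMAPINFOHEADER layout, but this is my sketch, not B.A.T.T.'s actual code. ]

```python
# Prepend a 54-byte BMP header so a 64,000,000-byte chunk parses as a
# 4000x4000, 32-bits-per-pixel bitmap (4000 * 4000 * 4 = 64,000,000).
import struct

WIDTH = HEIGHT = 4000
CHUNK = WIDTH * HEIGHT * 4  # 64,000,000 bytes per chunk

def bmp_wrap(chunk_data: bytes) -> bytes:
    assert len(chunk_data) == CHUNK
    # BITMAPFILEHEADER: magic, file size, two reserved words, data offset
    file_header = struct.pack("<2sIHHI", b"BM", 54 + CHUNK, 0, 0, 54)
    # BITMAPINFOHEADER: size, width, height, planes, bpp, compression,
    # image size, x/y pixels-per-metre, palette counts (unused)
    info_header = struct.pack("<IiiHHIIiiII",
                              40, WIDTH, HEIGHT, 1, 32, 0, CHUNK,
                              2835, 2835, 0, 0)
    return file_header + info_header + chunk_data

def bmp_unwrap(bmp: bytes) -> bytes:
    return bmp[54:]  # the payload starts right after the 54-byte header
```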
Lastly, haters gotta hate, I guess; anyone else remember the BBS / 2600 days? Oh well, cheers to Soylent for keeping it real and to the insightful and interesting comments.
Best,
trip (i.e. Tyler Pitchford)
(Score: -1, Flamebait) by Anonymous Coward on Sunday June 28 2015, @04:51AM
Do you enjoy seeing your jizz all over the front page of Soylent, you attention-whoring asshole? Oh look at you, now the whole fucking world knows you have friends at Google. How very nice for you! Do your friends have anal privileges? Do you just love it when a dude reams your asshole until his manly cum squirts inside your bowels?
Haters gonna hate, you're right about that. But consider why we fucking hate you.
1) Your project literally enables the Tragedy of the Commons.
2) Your project has no practical value to anyone.
3) You love attention so very very very much.
Next time you feel like you deserve to have the world acknowledge your pitiful existence, go fucking tweet something, you worthless twit.
(Score: 2) by FatPhil on Monday June 29 2015, @08:10AM
Are there any alternative services that you could use? If so, then you could store the file over multiple sites, using an error-correcting code, such that if one of the sites were to go down, the data would still be available. This is what the internet and the cloud was invented for!
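[ Ed's Comment: the simplest instance of FatPhil's idea is RAID-5-style XOR parity across shards; a sketch, with illustrative shard counts and sizes. ]

```python
# Spread a file across N services plus one parity shard on an (N+1)th;
# losing any single shard is then survivable.
from functools import reduce

def xor_all(blocks):
    return bytes(reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)),
                        blocks))

def make_shards(data: bytes, n: int):
    size = -(-len(data) // n)                  # ceiling division
    padded = data.ljust(n * size, b"\0")
    shards = [padded[i * size:(i + 1) * size] for i in range(n)]
    return shards, xor_all(shards)             # parity = XOR of all shards

def rebuild(shards, parity, lost: int):
    """Recover shard `lost` by XOR-ing the parity with the survivors."""
    return xor_all([s for i, s in enumerate(shards) if i != lost] + [parity])
```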
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 1) by tripstah on Monday June 29 2015, @11:59AM
Amazon also allows bitmaps on their cloud service. The image portion is free for "Prime" members, and they supposedly accept files up to 2GB. That would work out to a 16,000 x 16,000 x RGBA bitmap.
As for error correction, RAR includes parity information (it's one of the reasons I looped through it versus just splitting). By default B.A.T.T. encodes 10% parity information, but that's user configurable. If you want extra assurance, there's always https://en.wikipedia.org/wiki/PAR2 [wikipedia.org].
(Score: 1) by McD on Tuesday June 30 2015, @03:54PM
No kidding?
Well then let me take this off-topic opportunity to say thank you - I used Azureus a bit, right up until it turned south.
Can't buy you a beer, but I can make a donation to SN's upkeep in your honor. Cheers!
(Score: 1) by tripstah on Wednesday July 01 2015, @02:30PM
No kidding, and ++ to you sir!
Best,
Tyler
(Score: 0) by Anonymous Coward on Sunday June 28 2015, @07:27PM
Average internet speed in the USA = 10Mbps. Time it takes to transfer 1TB at 10Mbps = 9 days. So if you were actually creating more than 1TB per day, you wouldn't be storing it with Google especially if you're using some steganographic scheme which will greatly multiply the required storage amount.
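[ Ed's Comment: the parent's arithmetic checks out. ]

```python
# 1 TB over a 10 Mbit/s link, as the parent computes:
bits = 1e12 * 8          # 1 TB in bits
seconds = bits / 10e6    # at 10 Mbit/s
days = seconds / 86400
assert 9.2 < days < 9.3  # ~9.26 days, i.e. roughly "9 days"
```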
If you were creating much less, then you can store it on a few hard drives or maybe even portable flashdrives. And even if you were storing it on Google, you wouldn't have enough data to need to resort to convoluted schemes to get "unlimited" storage.
And if you were somewhere in between, relying on Google probably won't make sense in most scenarios, perhaps more as a last resort backup.
(Score: 1) by tripstah on Monday June 29 2015, @12:04PM
B.A.T.T. isn't as complex as that; there's no steganography. B.A.T.T.'s overhead is 54 bytes per 64,000,000-byte chunk, so it works out to just under a megabyte of overhead per terabyte of data encoded as bitmaps.
(Score: 2) by darkfeline on Monday June 29 2015, @09:45PM
>This link points to the author's own personal software solution, but I'm sure that others will come up with alternative ideas.
Uh, it's not exactly hard to turn arbitrary binary data into a bitmap image. All you need to do is stick the appropriate metadata header at the front of the binary blob; then you can use whatever lossless compression algorithm you want to get PNGs or whatever. I wouldn't be surprised if a handful of fellow Soylentils have already rolled their own 100-SLOC scripts to do this.
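[ Ed's Comment: for the curious, a rough take on such a script: wrap arbitrary bytes into a valid 8-bit grayscale PNG and back, stdlib only. The chunk framing (length/type/data/CRC) follows the PNG spec; the zero-padding convention at the end is my own shortcut, and a real tool would record the payload length somewhere. ]

```python
import struct, zlib

def chunk(ctype: bytes, data: bytes) -> bytes:
    # PNG chunk: big-endian length, 4-byte type, data, CRC-32 of type+data
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

def bytes_to_png(payload: bytes, width: int = 256) -> bytes:
    rows = -(-len(payload) // width)               # ceiling division
    padded = payload.ljust(rows * width, b"\0")
    # Each scanline is prefixed with filter byte 0 (no filtering).
    raster = b"".join(b"\0" + padded[r * width:(r + 1) * width]
                      for r in range(rows))
    ihdr = struct.pack(">IIBBBBB", width, rows, 8, 0, 0, 0, 0)  # 8-bit gray
    return (b"\x89PNG\r\n\x1a\n"
            + chunk(b"IHDR", ihdr)
            + chunk(b"IDAT", zlib.compress(raster))
            + chunk(b"IEND", b""))

def png_to_bytes(png: bytes) -> bytes:
    # Minimal decoder for PNGs produced above (single IDAT, filter 0).
    width, rows = struct.unpack(">II", png[16:24])
    idat_len = struct.unpack(">I", png[33:37])[0]
    raster = zlib.decompress(png[41:41 + idat_len])
    return b"".join(raster[r * (width + 1) + 1:(r + 1) * (width + 1)]
                    for r in range(rows))
```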
Join the SDF Public Access UNIX System today!