Web developer Ukiah Smith wrote a blog post about which compression format to use when archiving. Obviously the algorithm must be lossless, but beyond that he sets out some criteria and then evaluates how some of the more common formats measure up.
After some brainstorming, I have arrived at a set of criteria that I believe will help ensure my data is safe while using compression.
- The compression tool must be open source.
- The compression format must be open.
- The tool must be popular enough to be supported by the community.
- Ideally there would be multiple implementations.
- The format must be resilient to data loss.
Some formats I am looking at are zip, 7zip, rar, xz, bzip2, and tar.
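Two of the shortlisted formats, tar and xz, are directly scriptable from Python's standard library, so the combination is easy to experiment with. A minimal sketch (the file names here are made up for illustration):

```python
import lzma
import pathlib
import tarfile

# Create some sample data to archive (hypothetical file names).
src = pathlib.Path("photos")
src.mkdir(exist_ok=True)
(src / "pic1.txt").write_bytes(b"sample data")

# tar provides the container, xz (LZMA) the compression -- the .tar.xz combo.
with tarfile.open("photos.tar.xz", "w:xz") as tar:
    tar.add(src)

# xz embeds a check value, so decompressing also verifies integrity:
# a flipped bit raises an LZMAError instead of silently returning garbage.
with lzma.open("photos.tar.xz") as f:
    f.read()
print("archive intact")

# List the archive's contents without extracting.
with tarfile.open("photos.tar.xz", "r:xz") as tar:
    print(tar.getnames())
```

Note that xz's built-in check only *detects* corruption; it cannot repair it, which is where the error-correction point below comes in.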
He closes by mentioning error correction. That has become more important than most people acknowledge, given the large size of data files, the density of storage, and the propensity for bits to flip.
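The distinction between detecting and correcting a flipped bit is easy to demonstrate. The sketch below pairs a cryptographic hash (detection) with a toy repetition code (correction); real archival tools such as par2 use Reed-Solomon parity rather than whole-file copies, so this is illustration only:

```python
import hashlib
from collections import Counter

def checksum(data: bytes) -> str:
    """Detection: store this alongside the archive; any flipped bit changes it."""
    return hashlib.sha256(data).hexdigest()

def encode_repetition(data: bytes, copies: int = 3) -> list[bytes]:
    """Toy correction scheme: keep N identical copies of the data."""
    return [bytes(data) for _ in range(copies)]

def decode_repetition(copies: list[bytes]) -> bytes:
    """Majority vote per byte recovers from a flip in any single copy."""
    return bytes(
        Counter(copy[i] for copy in copies).most_common(1)[0][0]
        for i in range(len(copies[0]))
    )

original = b"archive payload"
stored = encode_repetition(original)

# Simulate a single flipped bit in one stored copy.
corrupted = bytearray(stored[0])
corrupted[0] ^= 0x01
stored[0] = bytes(corrupted)

assert checksum(stored[0]) != checksum(original)  # the flip is detected
assert decode_repetition(stored) == original      # and corrected by majority vote
print("bit flip detected and repaired")
```

The point of the repetition code is only to show why redundancy is the ingredient that turns detection into repair; parity-based schemes achieve the same effect with far less overhead.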
(Score: 0) by Anonymous Coward on Friday September 13 2019, @01:33PM
You're conflating "end of the world" and "good/want to have".
Surely it's not the end of the world if they're gone, yet many people want their pics and emails, and do read some of them, even if you do not.
And... the parent poster seems to be concerned, which may indicate historical annoyance at having something go MIA. So it's not about 'end of the world', but 'I can do this, I want to do this, so why on Earth not do it correctly, and well'.
As far as I'm concerned, most people spend 1 hour on a 2-hour task, and then that hour ends up fruitless because the job was not done correctly. Do it right, or don't bother...