Web developer Ukiah Smith wrote a blog post about which compression format to use when archiving. Obviously the algorithm must be lossless, but beyond that he sets out some criteria and then evaluates how some of the more common formats stack up.
After some brainstorming I have arrived at a set of criteria that I believe will help ensure my data is safe while using compression.
- The compression tool must be open source.
- The compression format must be open.
- The tool must be popular enough to be supported by the community.
- Ideally there would be multiple implementations.
- The format must be resilient to data loss.
Some formats I am looking at are zip, 7zip, rar, xz, bzip2, and tar.
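Several of those candidates need nothing beyond Python's standard library, which is itself a sign the formats are open and widely implemented. A minimal sketch of creating and restoring a tar.xz archive (the file paths here are hypothetical examples):

```python
# Minimal sketch: create a tar.xz archive using only Python's
# standard library. The paths below are hypothetical examples.
import tarfile

# "w:xz" wraps the tar stream in xz/LZMA compression;
# "w:bz2" and "w:gz" select bzip2 and gzip instead.
with tarfile.open("photos-2019.tar.xz", "w:xz") as archive:
    archive.add("photos/2019")

# Reading back works the same way; "r:*" auto-detects the compression.
with tarfile.open("photos-2019.tar.xz", "r:*") as archive:
    archive.extractall("restored")
```

Between tarfile and zipfile, the standard library covers tar, gzip, bzip2, xz, and zip; 7zip and rar require third-party tools.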
He closes by mentioning error correction. That has become more important than most acknowledge due to the large size of data files, the density of storage, and the propensity for bits to flip.
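A checksum manifest is the simplest first step: it can detect a flipped bit even though it cannot repair one; actual repair requires parity data from a tool such as par2. A minimal detection sketch in Python (directory and file names are hypothetical):

```python
# Minimal sketch: record SHA-256 checksums for a set of archives so a
# flipped bit can at least be *detected* on a later scrub. Detection
# is not correction; a parity tool such as par2 is needed to repair
# damage. Paths and names below are hypothetical.
import hashlib
import pathlib

def sha256(path: pathlib.Path) -> str:
    """Hash a file in chunks so large archives don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Write a manifest next to the archives ...
archives = sorted(pathlib.Path("archives").glob("*.tar.xz"))
with open("archives/MANIFEST.sha256", "w") as manifest:
    for path in archives:
        manifest.write(f"{sha256(path)}  {path.name}\n")

# ... and verify it whenever the archives are scrubbed.
with open("archives/MANIFEST.sha256") as manifest:
    for line in manifest:
        recorded, name = line.split()
        if sha256(pathlib.Path("archives") / name) != recorded:
            print(f"CORRUPTED: {name}")
```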
(Score: 2) by FatPhil on Friday September 13 2019, @06:13PM
I'm using zpaq for archiving a continually changing part of my website, which is append-only.
In parallel, however, I'm using git to simply store each new version, and at the moment git (after a git gc) seems to be winning; the highest compression level in zpaq would probably beat it, but it's way slower. And of course the git repository can be cloned for redundancy.
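A rough sketch of that snapshot-plus-gc workflow, assuming git is installed and the directory is already a repository (the directory name and message are hypothetical):

```python
# Sketch of the snapshot workflow described above: commit the current
# state of an append-only directory, then let "git gc" repack it with
# delta compression. Assumes git is installed and "site" is already a
# git repository; names here are hypothetical.
import subprocess

def snapshot(repo: str, message: str) -> None:
    subprocess.run(["git", "-C", repo, "add", "--all"], check=True)
    # "git diff --cached --quiet" exits non-zero exactly when
    # something is staged, so we only commit when there are changes.
    staged = subprocess.run(["git", "-C", repo, "diff", "--cached", "--quiet"])
    if staged.returncode != 0:
        subprocess.run(["git", "-C", repo, "commit", "-m", message], check=True)
    # Repack aggressively; this is where git's delta compression wins.
    subprocess.run(["git", "-C", repo, "gc", "--aggressive"], check=True)

snapshot("site", "nightly archive snapshot")
```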
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves