Web developer Ukiah Smith wrote a blog post about which compression format to use when archiving. Obviously the algorithm must be lossless, but beyond that he sets out some criteria and then evaluates how some of the more common formats line up.
After some brainstorming I have arrived at a set of criteria that I believe will help ensure my data is safe while using compression.
- The compression tool must be open source.
- The compression format must be open.
- The tool must be popular enough to be supported by the community.
- Ideally there would be multiple implementations.
- The format must be resilient to data loss.
Some formats I am looking at are zip, 7zip, rar, xz, bzip2, and tar.
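As an aside, several of the formats on that list can be exercised directly from Python's standard library (`bz2`, `lzma` for the xz container, `zipfile`; rar and 7zip need third-party tools, so they're omitted). A minimal sketch of a lossless round-trip check, with an arbitrary made-up payload:

```python
import bz2
import io
import lzma
import zipfile

# hypothetical payload; repetition makes it compress well
data = b"the same payload, compressed three ways " * 100

# bzip2 and xz are plain stream compressors in the stdlib
bz2_size = len(bz2.compress(data))
xz_size = len(lzma.compress(data))  # lzma emits the .xz container by default

# zip is an archive container; build one in memory
zip_buf = io.BytesIO()
with zipfile.ZipFile(zip_buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("payload.bin", data)
zip_size = len(zip_buf.getvalue())

print(f"original: {len(data)}  bzip2: {bz2_size}  xz: {xz_size}  zip: {zip_size}")

# lossless means we get the exact original bytes back
assert bz2.decompress(bz2.compress(data)) == data
assert lzma.decompress(lzma.compress(data)) == data
```

The relative sizes printed here say nothing general about the formats; real archival payloads would need their own benchmarks.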
He closes by mentioning error correction, which has become more important than most acknowledge due to the large size of data files, the density of storage, and the propensity for bits to flip.
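The cheapest half of that problem is detection: record a cryptographic digest at archive time and recompute it on restore. A minimal sketch (the archive bytes here are a stand-in, not any real format):

```python
import hashlib

# stand-in for a compressed archive sitting on disk
archive = bytearray(b"pretend this is a compressed archive on disk" * 10)
checksum = hashlib.sha256(archive).hexdigest()  # recorded at archive time

# simulate a single bit flipping in storage
archive[7] ^= 0x01

# on restore, recompute and compare: any flipped bit changes the digest
assert hashlib.sha256(archive).hexdigest() != checksum
print("corruption detected: checksum mismatch")
```

A checksum only tells you the data went bad; actually repairing it requires stored redundancy, which is what tools like par2 (Reed-Solomon parity files) provide on top of the archive.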
(Score: 2) by barbara hudson on Monday September 16 2019, @05:30PM (1 child)
1. More experience.
2. The nature of the problem at hand has changed, or the computing environment has (from DOS to Windows to *nix, from 16 to 32 to 64 bit, from monochrome to CGA to VGA to true colour).
Old code might never die, but it can become obsolete.
SoylentNews is social media. Says so right in the slogan. Soylentnews is people, not tech.
(Score: 2) by Immerman on Tuesday September 17 2019, @03:12PM
Oh absolutely.
However, I've encountered numerous situations where recreating the source code isn't actually worth the time and effort required, so I make something "good enough" to do what I need and do without all the "nice to have" features I would have gotten for free if the old code were still around.