Web developer Ukiah Smith wrote a blog post about which compression format to use when archiving. Obviously the algorithm must be lossless, but beyond that he sets some criteria and then evaluates how some of the more common formats line up.
After some brainstorming I have arrived at a set of criteria that I believe will help ensure my data is safe while using compression.
- The compression tool must be open source.
- The compression format must be open.
- The tool must be popular enough to be supported by the community.
- Ideally there would be multiple implementations.
- The format must be resilient to data loss.
Some formats I am looking at are zip, 7zip, rar, xz, bzip2, tar.
He closes by mentioning error correction, which has become more important than most acknowledge given the large size of data files, the density of modern storage, and the propensity for bits to flip.
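The bit-flip concern is easy to demonstrate. The sketch below (a minimal Python illustration, not from the original post) compresses some data with gzip, flips a single bit in the compressed stream, and attempts decompression. Formats like gzip carry a CRC-32 checksum, so corruption is usually detected rather than silently returned, but detection is not repair; recovering the data needs separate parity, which is what tools such as par2 provide.

```python
import gzip
import zlib

# Compress some sample data.
original = b"Important archival data " * 1000
archive = bytearray(gzip.compress(original))

# Simulate a single flipped bit somewhere in the middle of the
# compressed stream (well inside the deflate payload).
archive[len(archive) // 2] ^= 0x01

# gzip stores a CRC-32 of the uncompressed data in its trailer, so a
# flipped bit either breaks the deflate stream outright or is caught
# by the checksum on decompression.
try:
    gzip.decompress(bytes(archive))
    print("decompressed without error")
except (zlib.error, gzip.BadGzipFile, EOFError) as exc:
    print("corruption detected:", type(exc).__name__)
```

One flipped bit out of a few hundred bytes is enough to make the whole archive unrecoverable without external parity data, which is the point the post is making about choosing formats that are resilient to loss.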
(Score: 2) by maxwell demon on Friday September 13 2019, @05:58AM (2 children)
Disagree. HTML is totally overkill for simple writing. Markdown is absolutely sufficient for most writing tasks.
On the other hand, if you do need advanced features, HTML is too limited. Use LaTeX in that case.
The Tao of math: The numbers you can count are not the real numbers.
(Score: 0) by Anonymous Coward on Friday September 13 2019, @02:14PM
LaTeX is too complex; it's more or less code. Plain text is where it's at. Borrow the simplest formatting (bold/italic/headings) from Markdown and call it a day.
(Score: 2) by The Mighty Buzzard on Friday September 13 2019, @02:16PM
My personal preference would be for all things TeX to die in a fire. It's almost, but not quite, as enjoyable as the PDF format. I'll stick with HTML and do anything especially funky in an image file.
My rights don't end where your fear begins.