A long time ago, before I had backups and before btrfs was in wide use, I tried putting everything on the drive into a source control system.
Some source control systems create lotsa duplicate copies to make it easier to detect changes. For example, SVN places an extra copy of the source tree in .svn/
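The idea behind those extra copies can be sketched in a few lines of shell. This is not real SVN internals (old SVN working copies kept pristine files under .svn/text-base/; the paths and names here are made up for illustration) — it just shows how keeping a duplicate lets a tool detect local edits without talking to a server:

```shell
# Hypothetical sketch of the pristine-copy idea, not actual SVN layout.
mkdir -p demo/.pristine
echo "hello" > demo/file.txt
cp demo/file.txt demo/.pristine/file.txt   # the "extra copy" on checkout
echo "changed" > demo/file.txt             # a local edit
# Detect the edit by comparing against the pristine copy:
if ! cmp -s demo/file.txt demo/.pristine/file.txt; then
  echo "M demo/file.txt"                   # modified, like 'svn status'
fi
rm -rf demo
```

The cost is the one this story is about: every checked-out file exists twice on disk.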
Anyway, I figured that I didn't need duplicate copies like that; removing them immediately would just free up some disk space at the cost of a tiny loss of performance.
Ran something like
opencm blah blah; opencm delete
to insert the root directory into source control, and immediately delete the unnecessary duplicate copy that would be created.
Something went wrong, and I lost about half the data on the drive, about 5 gig (this was very long ago, when drives were tiny), before I noticed the problem and killed the deletion process.
Thus it was deeply burned into my psyche that reliability is more important than saving disk space, and to never, ever dedup.
(Score: 3, Interesting) by throwaway28 on Friday October 30 2015, @02:35AM