Web developer Trevor Morris has a short post on the attrition of web sites over the years.
I have run the Laravel Artisan command I built to get statistics on my outgoing links section. Exactly one year later, it doesn't make good reading.
[...] The percentage of total broken links has increased from 32.8% last year to 35.7% this year. Links from over a decade ago have a fifty per cent chance of no longer working. Thankfully, only three of the 550-plus links from the last few years have gone missing, but only time will tell how long they'll stick around.
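Morris doesn't publish the command itself, but a minimal sketch of such an Artisan command might look like the following. The class name, the links:check signature, and the hard-coded link array are assumptions for illustration; a real version would pull URLs from wherever the links section is stored.

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Http;

class CheckLinks extends Command
{
    // Invoked as: php artisan links:check
    protected $signature = 'links:check';

    protected $description = 'Report how many outgoing links still resolve';

    public function handle(): int
    {
        // Hypothetical source of links; a real command would read them
        // from the database or a content directory.
        $links = [
            'https://example.com/',
            'https://example.org/some-old-post/',
        ];

        $broken = 0;

        foreach ($links as $url) {
            try {
                // A HEAD request keeps the check cheap; any 4xx/5xx
                // response counts as broken here.
                if (Http::timeout(10)->head($url)->failed()) {
                    $broken++;
                }
            } catch (\Exception $e) {
                // DNS failures, timeouts, TLS errors, and the like.
                $broken++;
            }
        }

        $total = count($links);
        $this->info(sprintf(
            '%d of %d links broken (%.1f%%)',
            $broken,
            $total,
            $total ? 100 * $broken / $total : 0
        ));

        return self::SUCCESS;
    }
}

Run yearly, the printed percentage gives exactly the kind of trend line Morris describes above.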
As was pointed out as far back as the early and mid 1990s, the inherent centralization of sites, and later web sites, is the root of this weakness: a single copy exists under the control of the publisher / maintainer, and when that one copy goes, it is gone.
(Score: 1, Informative) by Anonymous Coward on Wednesday January 31 2024, @07:29AM
Because it respects robots.txt, etc. So when a domain squatter takes over, the previously archived site can vanish if the new robots.txt or similar tells the IA not to archive it:
https://help.archive.org/help/using-the-wayback-machine/
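For readers who want to check a given domain, a rough standalone PHP sketch of that robots.txt lookup follows. The crawler token ia_archiver is the one the Wayback Machine has historically honored, but treat it, and the example domain, as assumptions; the Archive's actual exclusion handling is more involved than a simple robots.txt scan.

<?php

// Fetch a domain's robots.txt and naively check whether it blanket-blocks
// the Internet Archive's crawler. Not a spec-complete robots.txt parser;
// "ia_archiver" and the example domain are assumptions for illustration.

$robots = @file_get_contents('https://example.com/robots.txt');

if ($robots === false) {
    exit("No robots.txt reachable.\n");
}

$blocksArchive = false;
$agents = [];      // user-agents the current record applies to
$inRules = false;  // whether we are past the record's user-agent lines

foreach (preg_split('/\r?\n/', $robots) as $line) {
    $line = strtolower(trim(preg_replace('/#.*$/', '', $line)));

    if (str_starts_with($line, 'user-agent:')) {
        if ($inRules) {        // a new record is starting
            $agents = [];
            $inRules = false;
        }
        $agents[] = trim(substr($line, 11));
    } elseif (str_starts_with($line, 'disallow:')) {
        $inRules = true;
        $path = trim(substr($line, 9));
        // "Disallow: /" aimed at ia_archiver (or at everyone via *)
        // shuts the crawler out of the whole site.
        if ($path === '/' && array_intersect(['ia_archiver', '*'], $agents)) {
            $blocksArchive = true;
        }
    }
}

echo $blocksArchive
    ? "Blanket block found; archived copies may be hidden.\n"
    : "No blanket block on the archive's crawler.\n";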