Web developer Trevor Morris has a short post on the attrition of web sites over the years.
I have run the Laravel Artisan command I built to get statistics on my outgoing links section. Exactly one year later, it doesn't make good reading.
[...] The percentage of total broken links has increased from 32.8% last year to 35.7% this year. Links from over a decade ago have a fifty per cent chance of no longer working. Thankfully, only three of the more than 550 links from the last few years have gone missing, but only time will tell how long they'll stick around.
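Morris doesn't publish the command itself, but a minimal sketch of such an Artisan link checker might look like the following. It assumes a Link Eloquent model with a url column; the command name links:check and the model are hypothetical stand-ins, not his actual code.

```php
<?php

namespace App\Console\Commands;

use App\Models\Link;
use Illuminate\Console\Command;
use Illuminate\Http\Client\ConnectionException;
use Illuminate\Support\Facades\Http;

class CheckLinks extends Command
{
    protected $signature = 'links:check';

    protected $description = 'Count how many stored outgoing links still resolve';

    public function handle(): int
    {
        $links = Link::all(); // hypothetical model: one row per outgoing link
        $broken = 0;

        foreach ($links as $link) {
            try {
                // HEAD keeps traffic light; some servers reject it, so retry with GET.
                $response = Http::timeout(10)->head($link->url);
                if ($response->failed()) {
                    $response = Http::timeout(10)->get($link->url);
                }
                if ($response->failed()) {
                    $broken++;
                }
            } catch (ConnectionException) {
                // DNS failure, refused connection, or timeout: count it as broken.
                $broken++;
            }
        }

        $total = $links->count();
        $rate = $total > 0 ? round(100 * $broken / $total, 1) : 0.0;
        $this->info("{$broken} of {$total} links broken ({$rate}%).");

        return self::SUCCESS;
    }
}
```

Running php artisan links:check would then print the same kind of broken-link percentage Morris quotes above.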
As was pointed out in the early and mid 1990s, the inherent centralization of sites, and later web sites, is the basis for this weakness. That is to say, a single copy exists under the control of the publisher or maintainer, and when that one copy goes, it is gone.
(Score: 4, Interesting) by darkfeline on Tuesday January 30 2024, @09:46PM
Copyright plays a role methinks.
> the inherent centralization of sites, later web sites, is the basis for this weakness
That should not be a problem, because with federation anyone can copy and re-host what they deem worth preserving. With digital technology, it has never been easier to copy and preserve information.
Except that, as a society, we decided that was a bad idea and imposed artificial legal restrictions on copying, in the hope that the pros outweigh the cons, and we may do something similar again with AI.