On 2008.10.18 01:29:32 +0300, nsk <nsk@karastathis.org> scribbled 3.3K characters: ....
> Perhaps the best solution would be to build a web archiving platform in Wikipedia itself, so that all referenced webpages are stored for later retrieval.
>
> --
> Thanks, NSK
> Nikolaos S. Karastathis, http://nsk.karastathis.org/
I actually once wrote a bot* which processed a dump for external links and submitted them to webcitation.org. I stopped running it because the submissions didn't seem to result in the URLs actually being archived, but that was back in May. (Perhaps things have changed since then.) How much of the solution would such a bot represent? Could the solution be as cheap as a post-page-save hook which submits all http:// links in the wikitext to webcitation.org?
* https://secure.wikimedia.org/wikipedia/en/wiki/User:Gwern/Archive-bot.hs
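
To make the shape of such a bot concrete, here is a minimal Haskell sketch of the core loop: scan a page's wikitext for http:// links and submit each one to WebCite. The GET-style archive endpoint (www.webcitation.org/archive?url=...&email=...) and the contact address are assumptions for illustration, not necessarily what Archive-bot.hs itself used:

    import Data.List (isPrefixOf, nub, tails)
    import Network.HTTP (simpleHTTP, getRequest, urlEncode)

    -- Every position in the wikitext where an http:// URL begins,
    -- truncated at the first character that can't be part of a URL.
    extractLinks :: String -> [String]
    extractLinks = nub . map (takeWhile urlChar) . filter ("http://" `isPrefixOf`) . tails
      where
        urlChar c = c `notElem` " \t\n|]}<>\""

    -- Ask WebCite to archive one URL. Assumes a GET endpoint taking
    -- url= and email= parameters; the address below is a placeholder.
    archiveUrl :: String -> String -> IO ()
    archiveUrl email url = do
        _ <- simpleHTTP (getRequest endpoint)
        return ()
      where
        endpoint = "http://www.webcitation.org/archive?url=" ++ urlEncode url
                   ++ "&email=" ++ urlEncode email

    -- Read wikitext on stdin and submit every external link found in it.
    main :: IO ()
    main = do
        wikitext <- getContents
        mapM_ (archiveUrl "bot@example.com") (extractLinks wikitext)

A post-page-save hook would be the same two steps (extract, submit) run on just the saved page's wikitext instead of a whole dump, so the marginal cost per edit is one small HTTP request per new external link.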
--
gwern