We (the greater WP community) know people at the Internet Archive.
One could imagine a bot which submitted a list of WP reference URLs to the Archive so that they could be preferentially added to the archive library, via a process worked out with Archive people...
Alternatively, another online citation-archiving service could be set up as a new WMF project, specifically to support the various WMF projects.
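A submission bot along these lines could be quite small. Here is a minimal Python sketch; the `web.archive.org/save/` endpoint and the bot's User-Agent string are assumptions for illustration (the real intake mechanism would be whatever gets worked out with the Archive people):

```python
import urllib.request

# Assumed intake endpoint -- the actual mechanism would be negotiated
# with Internet Archive staff.
SAVE_ENDPOINT = "https://web.archive.org/save/"

def archive_request_url(url):
    """Build the request URL that asks the Archive to capture `url`."""
    return SAVE_ENDPOINT + url

def submit_to_archive(url):
    """Ask the Internet Archive to preferentially capture `url`."""
    req = urllib.request.Request(
        archive_request_url(url),
        headers={"User-Agent": "wp-ref-archiver-bot/0.1"},  # hypothetical bot name
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    # Feed it a list of reference URLs pulled from articles:
    for ref in ["http://example.com/cited-page"]:
        print(archive_request_url(ref))
```

The bot would iterate over a list of reference URLs extracted from articles and submit each one; rate limiting and deduplication would be essential at Wikipedia scale.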
-george william herbert george.herbert@gmail.com
On Sat, Oct 18, 2008 at 6:34 PM, Gwern Branwen gwern0@gmail.com wrote:
On 2008.10.18 01:29:32 +0300, nsk nsk@karastathis.org scribbled 3.3K characters: ....
Perhaps the best solution would be to build a web archiving platform in Wikipedia itself, so that all referenced webpages are stored for later retrieval.
-- Thanks, NSK Nikolaos S. Karastathis, http://nsk.karastathis.org/
I actually once wrote a bot* which processed a dump for external links and submitted them to webcitation.org. I stopped running it because the link requests didn't seem to result in the URLs actually being archived, but that was back in May. (Perhaps things have changed since then.) How much of the solution would such a bot represent? Could the solution be as cheap as a post-page-save hook which submits all http:// links in the wikitext to webcitation.org?
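The post-save hook described above could be sketched roughly as follows: extract every external http:// link from the saved wikitext, then build an archiving request for each. This is an illustrative assumption, not gwern's actual bot; the link regex is a simplification, and the WebCite query parameters and the email address are placeholders:

```python
import re
import urllib.parse

# Simplified pattern for external links in wikitext; real link syntax
# has more edge cases than this.
LINK_RE = re.compile(r'https?://[^\s\]<>|"]+')

def external_links(wikitext):
    """Return the external http(s):// links found in a page's wikitext."""
    return LINK_RE.findall(wikitext)

def webcite_request(url, email="bot-operator@example.org"):
    """Build a webcitation.org archiving request for `url`.
    (Parameter names and the notification email are assumed for illustration.)"""
    return ("http://www.webcitation.org/archive?"
            + urllib.parse.urlencode({"url": url, "email": email}))

if __name__ == "__main__":
    wikitext = "A claim.<ref>[http://example.com/study The study]</ref>"
    for link in external_links(wikitext):
        print(webcite_request(link))
```

Hooked into page saves, this would submit each cited URL for archiving as soon as an editor adds it, rather than waiting for a periodic dump pass.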
-- gwern
WikiEN-l mailing list WikiEN-l@lists.wikimedia.org To unsubscribe from this mailing list, visit: https://lists.wikimedia.org/mailman/listinfo/wikien-l