On 27 January 2014 00:17, Kevin Gorman <kgorman(a)gmail.com> wrote:

Funny you ask... there are not currently any solid ones afaik, but
I've been talking with the Internet Archive about building out a bot
and trying to achieve community consensus on ENWP to auto-replace
dead links with archive.org ones. The IA has been crawling all new
external links on all Wikimedia projects at least once every couple
of hours for months, and has a strong interest in killing off
literally all of our dead links. Unless something falls through, I
should be bringing a more detailed plan up within maybe five or six
weeks.

Yes, I knew you were cooking up something :-) I was just surprised it
wasn't the sort of task that people had already automated, or written
a nice toolserver bot for, or something.
The ones that use {{cite web}} and variants are pretty simple: you
just whack in archiveurl= and archivedate= (preferably as close as
possible to any cited accessdate=) ... then double-check by eye, of
course. It just gets very tedious and error-prone doing it by hand,
cut'n'pasting URLs into the middle of the computer guacamole we
lovingly euphemise as "wikitext". VE isn't a much happier method.
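
To make that concrete, on a made-up citation (URLs and dates invented)
the edit is just:

  {{cite web |url=http://example.com/report.html |title=Example report
   |accessdate=2012-03-15}}

becoming

  {{cite web |url=http://example.com/report.html |title=Example report
   |accessdate=2012-03-15
   |archiveurl=https://web.archive.org/web/20120316000000/http://example.com/report.html
   |archivedate=2012-03-16}}

Trivial for one reference; tedious for fifty.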

Concur that it's a great idea... but perhaps a WMF Tool Labs tool,
instead of Toolserver? Running battle, I know - but so many of the
tools I have greatly valued over the years are now pretty much
useless, or at least unreliable.
In any case - it would be great to have a bot that did a fair bit of that,
but it should probably be manually run to ensure proper matching, kind of
like AWB.
Risker/Anne
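
For reference, the snapshot lookup such a bot would need is fairly
small. A rough, untested sketch against the Internet Archive's public
Wayback availability API (https://archive.org/wayback/available);
the function name and error handling here are invented for
illustration:

  import json
  import urllib.parse
  import urllib.request

  def closest_snapshot(dead_url, accessdate=""):
      """Return (archive_url, YYYYMMDD date) for the snapshot nearest
      accessdate, or None if nothing is archived."""
      query = urllib.parse.urlencode({"url": dead_url,
                                      "timestamp": accessdate})
      with urllib.request.urlopen(
              "https://archive.org/wayback/available?" + query) as resp:
          data = json.loads(resp.read().decode("utf-8"))
      closest = data.get("archived_snapshots", {}).get("closest")
      if not closest or not closest.get("available"):
          return None  # nothing archived; leave the link for a human
      return closest["url"], closest["timestamp"][:8]

  # e.g. closest_snapshot("http://example.com/report.html", "20120315")

Everything after that - picking the reference apart, writing the
archiveurl=/archivedate= values back into the template, and checking
the result - is the part that wants the AWB-style manual run.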