[Foundation-l] excluding Wikipedia clones from searching
????
wiki-list at phizz.demon.co.uk
Sat Dec 11 09:15:19 UTC 2010
On 10/12/2010 23:51, John Doe wrote:
> I'm in the process of creating a cleanup tool that checks archive.org and
> webcitation.org. If a URL is not archived, it checks to see whether it is
> live; if it is, I request that webcitation.org archive it on demand and
> fill in the archiveurl parameter of cite templates.
>
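For what it's worth, the workflow you describe boils down to something like
the rough sketch below. The Wayback availability endpoint is real; the
WebCite archive-on-demand URL and its parameters are my guess, and the
cite-template update is only a stub.

import requests

WAYBACK_API = "https://archive.org/wayback/available"
WEBCITE_ARCHIVE = "https://www.webcitation.org/archive"  # assumed endpoint

def find_archived_copy(url):
    """Return an archived snapshot URL from archive.org, or None."""
    resp = requests.get(WAYBACK_API, params={"url": url}, timeout=30)
    snap = resp.json().get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap and snap.get("available") else None

def is_live(url):
    """Check whether the original URL still responds."""
    try:
        return requests.head(url, allow_redirects=True,
                             timeout=30).status_code < 400
    except requests.RequestException:
        return False

def request_webcite_archive(url, email):
    """Ask WebCite to archive the URL on demand (endpoint/params assumed)."""
    resp = requests.get(WEBCITE_ARCHIVE,
                        params={"url": url, "email": email}, timeout=60)
    return resp.ok

def fill_archiveurl(citation, archive_url):
    """Stub: set the archiveurl parameter on a parsed cite template."""
    citation["archiveurl"] = archive_url
    return citation
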
What is the point of doing that? If a URL goes missing, the information
should be refound from another source. If it can't be re-referenced, then
perhaps it wasn't quite as reliable as one first thought; and if URLs
aren't stable on any particular site, then maybe one should re-examine
the reliability of the originating source.
Most dead URLs that I see, the ones that can't be refound, come from
references to online articles about minor events in BLPs. Simply put, the
event was recorded on Monday and was fish-and-chip wrapping by Thursday;
or, to put it another way, non-notable in the grand scheme of things. In
some cases the original source may also have removed the content because
it was untrue and could not be substantiated.
Stuffing URLs across to archive.org or webcitation.org simply
perpetuates unsubstantiated gossip. One really ought to examine one's
motives for doing that.