As a result of a robot run on the nl: wikipedia, I am now left with
25840 missing or incorrect links in other wikipedias. I wanted to put
these missing links on my user pages so that people can help get them
in. I have done this on a smaller scale before, but this time it took
more than a minute to upload each of the 5 segments of the list to the
en: wikipedia. Even retrieving them feels like it brings the server to
its knees. Did something change in that part of the software that
slowed it down dramatically? The only thing I can imagine being
"unique" to these pages is the number of international links.
See the 5 pages at:
http://en.wikipedia.org/wiki/User_talk:Rob_Hooft
(but only try that if you want to investigate, because it takes about
a minute to generate each page from the database!)
Should I take these pages offline again while awaiting a fix?
Rob
--
Rob W.W. Hooft || rob(a)hooft.net ||
http://www.hooft.net/people/rob/