As a result of a robot run on the nl: wikipedia, I am now left with 25840 missing or incorrect links in other wikipedias. I wanted to put these missing links on my user pages to let people help get them in. I have done this on a smaller scale before, but this time it took me more than 1 minute to upload each of 5 segments of the list to the en: wikipedia. Even retrieving them feels like it is bringing the server to its knees. Did something change in that aspect of the software that slowed it down dramatically? The only thing I can imagine that is "unique" to these pages is the number of international links.
See the 5 pages at: http://en.wikipedia.org/wiki/User_talk:Rob_Hooft (but only try that if you want to investigate, because it takes about a minute to generate each page from the database!)
Should I take these offline again while this is being investigated?
Rob
On Tue, 21 Oct 2003, Rob Hooft wrote:

> As a result of a robot run on the nl: wikipedia, I am now left with 25840 missing or incorrect links in other wikipedias. I wanted to put these missing links on my user pages to let people help get them in. I have done this on a smaller scale before, but this time it took me more than 1 minute to upload each of 5 segments of the list to the en: wikipedia. Even retrieving them feels like it is bringing the server down to its knees. Did something in that aspect of the software change that slowed it down dramatically?? The only thing I can imagine that is "unique" to these pages is the number of international links.
What is making these pages slow is not the large number of international links, but the large number of internal links. Basically, when you load a page, the software has to check, for each internal link, whether the target page exists (and, if you have stub detection switched on, also its size). For pages with many links, this can get quite slow.
Andre Engels
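The cost Andre describes can be sketched in a few lines. This is an illustrative Python sketch, not MediaWiki's actual (PHP) code; the function names and the idea of a batched lookup are assumptions made for the example. The slow pattern is one existence lookup per link; the alternative resolves all links with a single batched query.

```python
# Illustrative sketch only -- not MediaWiki's real implementation.
# All names here (render_naive, render_batched, page_exists, ...) are
# hypothetical, introduced just to show the per-link lookup pattern.

def render_naive(links, page_exists):
    """One existence check per internal link: with 250 links this
    means 250 separate lookups per page view (the slow pattern)."""
    html = []
    for title in links:
        cls = "existing" if page_exists(title) else "new"
        html.append(f'<a class="{cls}">{title}</a>')
    return "".join(html)

def render_batched(links, pages_existing):
    """Resolve all links with a single batched lookup, e.g. one
    SQL query with an IN (...) clause, then render from the result."""
    existing = pages_existing(links)  # one round trip instead of 250
    html = []
    for title in links:
        cls = "existing" if title in existing else "new"
        html.append(f'<a class="{cls}">{title}</a>')
    return "".join(html)
```

Both functions produce the same markup; the difference is only in how many lookups reach the database.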
Andre Engels wrote:
> On Tue, 21 Oct 2003, Rob Hooft wrote:
> > As a result of a robot run on the nl: wikipedia, I am now left with 25840 missing or incorrect links in other wikipedias. [...]

> What is making these pages slow is not the large number of international links, but the large number of internal links. Basically, when you load a page, it has to check for each page whether the page exists (and if you have stub detection switched on, also its size). For pages with many links, this can get quite slow.
I still have the impression that, at this point, it is excessively slow. At ~250 internal links per page, there is nothing really special about those pages, but 1 minute of loading time is way too long, and much longer than I am experiencing anywhere else. Calculate with me: rendering 250 links in one minute means about 4 per second. No way; other pages are much faster than that. That is why I still suspect the interwiki links: they are what set these pages apart from other pages in the wikis.
Rob
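Rob's back-of-the-envelope figure checks out: 250 links rendered in 60 seconds is just over 4 links per second.

```python
# Verifying the arithmetic from the message above.
links = 250
seconds = 60
rate = links / seconds
print(f"{rate:.2f} links/second")  # prints 4.17
```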
wikitech-l@lists.wikimedia.org