Over the last 48 hours, I've sampled the response times of a number of Wikipedia URLs every 20 minutes, for a total of 144 samples each.
From this data I cannot tell the duration of the slowdown periods (I
sample too seldom), but I can tell how many samples of each URL had "absurdly" long response times (60+ seconds). Not all URLs are equal:
Count  URL
-----  ----------------------------------------------------
   17  http://de.wikipedia.com/wiki.jpg
    9  http://www.wikipedia.com/wiki/1458
    7  http://www.wikipedia.com/wiki/Sweden
    6  http://de.wikipedia.com/wiki.cgi?Letzte_%c4nderungen
    6  http://de.wikipedia.com/
    5  http://www.wikipedia.com/wiki/Chemistry
    4  http://de.wikipedia.com/wiki.cgi?Schweden
    3  http://de.wikipedia.com/wiki.cgi?action=rc&days=7
    2  http://www.wikipedia.com/wiki/special:RecentChanges
    2  http://www.wikipedia.com/
    2  http://eo.wikipedia.com/wiki/Svedio
    2  http://eo.wikipedia.com/wiki/Lastaj_Sxangxoj
    1  http://eo.wikipedia.com/wiki.cgi?action=rc&days=3
    1  http://eo.wikipedia.com/vikio.png
    1  http://eo.wikipedia.com/
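For anyone who wants to reproduce this kind of measurement, a loop along the following lines would do it. This is only a minimal Python sketch, not the actual script I ran; the 60-second threshold and the 20-minute interval match the figures above, while the timeout value and error handling are assumptions.

```python
import time
import urllib.request

THRESHOLD = 60.0     # seconds; slower responses count as "absurdly" long
INTERVAL = 20 * 60   # sample every 20 minutes (assumed constant name)

def sample(url, timeout=120):
    """Fetch url once; return elapsed wall-clock time in seconds.

    Errors and timeouts are treated as maximally slow (assumption).
    """
    start = time.monotonic()
    try:
        urllib.request.urlopen(url, timeout=timeout).read()
    except OSError:
        return float(timeout)
    return time.monotonic() - start

def count_absurd(durations, threshold=THRESHOLD):
    """Count samples whose response time reached the threshold."""
    return sum(1 for d in durations if d >= threshold)
```

The main loop would call `sample()` for each URL, sleep for `INTERVAL` seconds, and after 144 rounds tally the per-URL counts with `count_absurd()`.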
(Is the Esperanto Wikipedia running Perl again? Didn't it already use the new PHP software?)
wikitech-l@lists.wikimedia.org