I'm getting incredibly slow responses from the main Wikipedia site, and have been for the last few days. Why is this? Who is working to fix it? How can I help?
The request
    GET /wiki.png HTTP/1.1
    Host: www.wikipedia.com

is lightning fast. But

    GET / HTTP/1.1
    Host: www.wikipedia.com

can take well over a minute to return. This is strange, since the start page is seldom updated and could easily be cached as pre-rendered HTML, ready to send to any client.
Simple but useful measurements can be done with the Linux commands:
    time lynx -dump http://www.wikipedia.com/wiki.png >/dev/null
    time lynx -dump http://www.wikipedia.com/ >/dev/null
I can run these at regular intervals and report statistics to you, if that would be useful.
I have experience finding and removing transaction response-time bottlenecks in C++, Java, and Perl applications, but never in PHP, so I don't know how hard it would be. My general approach is to read the system clock when a new request comes in, and again at various points along the execution path, to find the point where more than 3 seconds have elapsed. If the code already has a function for writing a debug or trace log message, that function is the natural place to read the system clock. Who can do this? Can I assist you in any way?
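As a sketch of that approach, here is a minimal checkpoint timer in Python (the class and method names are my own invention, not anything in the MediaWiki code):

```python
import time


class Checkpoints:
    """Record elapsed time at named points along a request's execution path."""

    def __init__(self):
        self.start = time.monotonic()
        self.marks = []

    def mark(self, label):
        """Record how much time has passed since the request started."""
        self.marks.append((label, time.monotonic() - self.start))

    def first_over(self, threshold=3.0):
        """Return the first checkpoint where cumulative time exceeds threshold,
        or None if the request stayed under it throughout."""
        for label, elapsed in self.marks:
            if elapsed > threshold:
                return label, elapsed
        return None
```

In use, a `Checkpoints` object would be created when the request arrives, `mark()` called after each major step ("parsed wikitext", "ran database query", and so on), and `first_over()` consulted at the end to see where the 3 seconds went.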
wikitech-l@lists.wikimedia.org