Anyway, currently the 'pedia seems to run fast and stable.
I have been seeing mostly fast accesses, with an extremely slow one every once in a while. It could be that one of the special functions (or the search?) behaves badly.
Is the "really big process" bug back??
What is the "really big process" bug?
I have wondered before whether it's possible to have memory leaks in PHP scripts: is everything automatically released once the page is served? Somebody somewhere could be eating memory, and that would slow everything down. If memory is indeed the bottleneck, then caching might not be a good idea, since we would in effect transport twice as much data between the database and the PHP script, and I would expect the data to be buffered in memory on both ends.
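One cheap way to test the leak hypothesis would be to log the peak memory of each request from within PHP itself. A minimal sketch (memory_get_peak_usage and register_shutdown_function are standard PHP functions; the 32 MB threshold and the use of the web server's error log are arbitrary example choices, not anything in our code):

  <?php
  // Sketch: log the peak memory used by a request, so that a leaking
  // or memory-hungry code path shows up as a pattern in the error log.
  // The 32 MB threshold is an arbitrary example value.
  function logPeakMemory() {
      $peak = memory_get_peak_usage();
      if ( $peak > 32 * 1024 * 1024 ) {
          error_log( 'high-memory request: '
              . ( $_SERVER['REQUEST_URI'] ?? '(cli)' )
              . ' peaked at ' . round( $peak / 1048576, 1 ) . ' MB' );
      }
  }
  register_shutdown_function( 'logPeakMemory' );

If the slow requests also turn out to be the high-memory ones, that would point at a specific code path rather than at the database.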
Some top/ps outputs would really help.
Axel