Hi!
With all the talk going on, I wanted to give you my thoughts. I am more on the wiki user side, so keep that in mind (I do know PHP, so I am not a complete dork).
1) How about a query cache (for edits and searches)? vBulletin (http://www.vbulletin.com/, not open source) does something like that for search queries, see e.g. http://forum.doom9.org/search.php: you instantly get an HTML page telling you that the search is in progress, which looks good for the user. The page reload can probably be configured in intervals, so an optimal server load (e.g. 4 searches/second) is reached and the search machinery is never pushed beyond what you specify. I think phpBB (http://www.phpbb.com/downloads.php, GPL'ed) can do the same, but I am not sure. The thing is, when users see that the server is doing something, they will not hit refresh every few seconds because they think it helps (which it doesn't). User feedback is and always should be top priority. A rough sketch of what I mean is below.
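Just to illustrate, here is a minimal PHP sketch of such a holding page. It assumes a hypothetical search_queue table (id, query, status, result_html) and a separate worker script that drains pending rows at whatever rate you configure; it is not vBulletin's or MediaWiki's actual code, just the shape of the idea:

<?php
// Rough sketch only, assuming a hypothetical search_queue table
// (id, query, status, result_html) and a worker script that processes
// pending rows at a fixed rate (say 4 searches/second).

$pdo  = new PDO('mysql:host=localhost;dbname=wiki', 'wikiuser', 'secret');
$term = isset($_GET['search']) ? trim($_GET['search']) : '';
$id   = isset($_GET['id']) ? (int) $_GET['id'] : 0;

if ($id === 0 && $term !== '') {
    // New search: queue it and hand the user a ticket number.
    $stmt = $pdo->prepare(
        "INSERT INTO search_queue (query, status) VALUES (?, 'pending')");
    $stmt->execute(array($term));
    $id = (int) $pdo->lastInsertId();
}

$stmt = $pdo->prepare(
    "SELECT status, result_html FROM search_queue WHERE id = ?");
$stmt->execute(array($id));
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if ($row && $row['status'] === 'done') {
    // Worker has finished; show the stored result page.
    echo $row['result_html'];
} else {
    // Instant feedback: a "search in progress" page that reloads itself.
    // The refresh interval is the knob that caps the load.
    $url = htmlspecialchars('search.php?id=' . $id);
    echo "<html><head><meta http-equiv=\"refresh\" content=\"5;url=$url\"></head>";
    echo "<body><p>Your search is queued; this page will update automatically.</p></body></html>";
}

The point is only the user-facing half: queue, give a ticket, show a self-reloading "in progress" page. The worker on the other side is where the rate limit actually lives.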
2) What is really slowing down Wikipedia? As far as I can tell, not everyone agrees on that. I am also guessing it's the DB, and not the PHP code that does the rendering. Maybe someone can put real numbers together, e.g. searches (7/second avg., server load: 30%), edits (1/second, 5% load), recentchanges (5/second, 25% load), etc.; those numbers are made up, but real ones would help focus on the problem. So if I am guessing right and the SQL query for the searches takes the most time, building up a queue for searches like above might satisfy users (at least they know something is happening) and it will reduce server load. Getting the numbers could be as simple as the sketch below.
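Again just a sketch, with made-up table, file, and function names (this is not MediaWiki's profiler): wrap each DB query in a timer and log the request type plus elapsed time, then a small script can turn the log into requests/second and load percentages per request type:

<?php
// Hypothetical measurement hook: wraps a DB query and logs
// "type elapsed-seconds" to a file. Aggregating the log offline
// gives numbers of the sort "search: 7/s, 30% of DB time".

function timed_query(PDO $pdo, $sql, array $params, $reqType)
{
    $start = microtime(true);
    $stmt  = $pdo->prepare($sql);
    $stmt->execute($params);
    $rows  = $stmt->fetchAll(PDO::FETCH_ASSOC);
    $elapsed = microtime(true) - $start;

    // One line per query, e.g. "search 0.8431".
    error_log(sprintf("%s %.4f\n", $reqType, $elapsed), 3, '/tmp/query-times.log');

    return $rows;
}

// Usage (hypothetical query): tag the fulltext search separately from
// cheap page lookups so the expensive one shows up clearly in the log.
// $results = timed_query($pdo, "SELECT ... WHERE MATCH(text) AGAINST (?)",
//                        array($term), 'search');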
I hope this helps the discussion somewhat.
Cheers,
Leo