After looking at tools like Apache's Latka and Canoo webtest, I concluded that they were a bit too high-level and underpowered for what I had in mind for the wiki software test suite. So I started a basic test suite in Java using the HttpUnit classes, which seem to work quite well and give me all I need.
I checked in a first pass to CVS in the "testsuite" directory -- right now the suite just loads the main page, follows a couple of links, fetches an edit form, and extracts the source of a page to a file. Eventually, of course, this will be a full suite that loads up an empty database with web actions, does lots of modifications, exercises all the functions, and validates all the output.
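For anyone curious what an HttpUnit-based test looks like, here is a rough sketch along the lines of that first checkin. The base URL, page title, and edit-box parameter name are placeholders for whatever the local install actually uses, not the real values from the checked-in code:

    import com.meterware.httpunit.*;
    import java.io.FileWriter;

    public class WikiSmokeTest {
        public static void main(String[] args) throws Exception {
            // Hypothetical base URL for a local wiki install; adjust to taste.
            String base = "http://localhost/wiki/wiki.phtml";

            WebConversation wc = new WebConversation();

            // Load the main page and follow its first link
            // (assumes the main page has at least one link).
            WebResponse main = wc.getResponse(base + "?title=Main_Page");
            WebLink link = main.getLinks()[0];
            WebResponse linked = wc.getResponse(link.getRequest());
            System.out.println("Followed link to: " + linked.getTitle());

            // Fetch the edit form for a page.
            WebResponse editPage = wc.getResponse(base + "?title=Main_Page&action=edit");
            WebForm editForm = editPage.getForms()[0];

            // Pull the raw wiki source out of the edit textarea and save it;
            // "wpTextbox1" is a guess at the textarea's name.
            String source = editForm.getParameterValue("wpTextbox1");
            FileWriter out = new FileWriter("Main_Page.txt");
            out.write(source == null ? "" : source);
            out.close();
        }
    }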
Let me now ask for guidance on the next pressing step. I believe we are already in a performance crunch, and we already have some ideas about how to fix parts of it, so I think performance should be the short-term driver for further development of the suite. For example, I have the Wikipedia code installed on a machine on my home network, and I can use the suite to time it, make a database change or something, then time it again to get us good numbers about what to tweak.
What would be a good mix of functions for timing? I'm thinking heavy on regular page loads, heavy on recent changes, lighter on things like edits. Any other ideas?
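To make that concrete, here is the kind of weighted timing loop I have in mind, again sketched with HttpUnit. The URLs and the request counts are just guesses to illustrate the shape of the mix, not measured numbers:

    import com.meterware.httpunit.*;

    public class WikiTimingMix {
        // Hypothetical base URL; point this at the install being timed.
        static final String BASE = "http://localhost/wiki/wiki.phtml";

        public static void main(String[] args) throws Exception {
            WebConversation wc = new WebConversation();

            // Rough mix: heavy on ordinary page views and RecentChanges,
            // lighter on edit-form fetches.  Tune the counts as needed.
            time(wc, "page view",      BASE + "?title=Main_Page",             50);
            time(wc, "recent changes", BASE + "?title=Special:RecentChanges", 30);
            time(wc, "edit form",      BASE + "?title=Main_Page&action=edit", 10);
        }

        // Fetch the given URL n times and report total and per-request time.
        static void time(WebConversation wc, String label, String url, int n)
                throws Exception {
            long start = System.currentTimeMillis();
            for (int i = 0; i < n; i++) {
                wc.getResponse(url);
            }
            long elapsed = System.currentTimeMillis() - start;
            System.out.println(label + ": " + n + " requests in " + elapsed
                    + " ms (" + (elapsed / (double) n) + " ms/request)");
        }
    }

Running something like this before and after a database change should give us comparable per-function numbers rather than just a gut feeling.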