After looking at tools like Apache's Latka and Canoo WebTest, I concluded that they were a bit too high-level and underpowered for what I had in mind for the wiki software test suite. So I started a basic test suite in Java using the HttpUnit classes, which seem to work quite well and give me all I need.
I checked a first pass into CVS in the "testsuite" directory -- right now the suite just loads the main page, follows a couple of links, fetches an edit form, and extracts the source of a page to a file. Eventually, of course, this will be a full suite that loads up an empty database with web actions, makes lots of modifications, exercises all the functions, and validates all the output.
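For the curious, the skeleton of such a check with HttpUnit looks roughly like this; the URL, link text, and form field name are placeholders for whatever a local install actually uses:

    import com.meterware.httpunit.WebConversation;
    import com.meterware.httpunit.WebResponse;
    import com.meterware.httpunit.WebLink;
    import com.meterware.httpunit.WebForm;

    public class WikiSmokeTest {
        public static void main(String[] args) throws Exception {
            // Adjust to wherever your test install lives.
            String base = "http://localhost/wiki/wiki.phtml";
            WebConversation wc = new WebConversation();

            // Load the main page and follow a link off it.
            WebResponse main = wc.getResponse(base + "?title=Main_Page");
            WebLink link = main.getLinkWith("Recent changes"); // link text may differ
            WebResponse rc = wc.getResponse(link.getRequest());

            // Fetch the edit form and pull the raw page source out of it.
            WebResponse edit = wc.getResponse(base + "?title=Main_Page&action=edit");
            WebForm form = edit.getForms()[0];                    // assume one main form
            String src = form.getParameterValue("wpTextbox1");    // field name may differ
            System.out.println(src);
        }
    }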
Now let me ask for guidance on the most pressing next step: I believe we're already in a performance crunch, and we have some ideas about how to fix some of those problems, so I think that should be the short-term driver for further development of the suite. For example, I have the Wikipedia code installed on a machine on my home network; I can use the suite to time it, make a database change or something, then time it again to get good numbers on what to tweak.
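The timing loop itself is easy enough -- something like this, run once before and once after the change (the URL and iteration count are arbitrary):

    import com.meterware.httpunit.WebConversation;

    public class TimeFetch {
        public static void main(String[] args) throws Exception {
            String url = args.length > 0 ? args[0]
                : "http://localhost/wiki/wiki.phtml?title=Main_Page";
            int runs = 100;
            WebConversation wc = new WebConversation();

            // Fetch the same page repeatedly and report the average latency.
            long start = System.currentTimeMillis();
            for (int i = 0; i < runs; i++) {
                wc.getResponse(url);
            }
            long elapsed = System.currentTimeMillis() - start;
            System.out.println(runs + " fetches in " + elapsed + " ms ("
                + (elapsed / (double) runs) + " ms each)");
        }
    }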
What would be a good mix of functions for timing? I'm thinking heavy on regular page loads, heavy on recent changes, lighter on things like edits. Any other ideas?
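Whatever weights we settle on, the driver would be trivial; here's a sketch with the percentages pulled out of thin air and the actions left as stubs for the HttpUnit calls above:

    import java.util.Random;

    public class MixDriver {
        // Rough weights -- the actual split is exactly what I'm asking about.
        static final int LOADS = 50, RECENT = 35, EDITS = 15;

        public static void main(String[] args) {
            Random rnd = new Random();
            for (int i = 0; i < 1000; i++) {
                int roll = rnd.nextInt(LOADS + RECENT + EDITS);
                if (roll < LOADS)               doPageLoad();
                else if (roll < LOADS + RECENT) doRecentChanges();
                else                            doEdit();
            }
        }

        // Stubs standing in for real requests against the test install.
        static void doPageLoad()      { /* fetch a random article */ }
        static void doRecentChanges() { /* fetch RecentChanges */ }
        static void doEdit()          { /* fetch edit form, post a change */ }
    }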
On Tue, 2003-03-04 at 17:41, Lee Daniel Crocker wrote:
> What would be a good mix of functions for timing? I'm thinking heavy on regular page loads, heavy on recent changes, lighter on things like edits. Any other ideas?
I suspect you'd also want some frequent edits on large pages -- I'm thinking of pages like Current events, and talk pages that get long. Maybe model the edit rate as roughly proportional to page size, until the page hits the 32K limit, at which point the edit rate slows down or the page size goes down.
In other words, I'd model the Current events page and a heated discussion on a random talk page.
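To make that concrete, here's one invented way to express the model -- edit probability scaling with page size up to 32K, then dropping off sharply. All the constants are guesses, not measurements:

    public class EditModel {
        // Probability of editing a page, as a function of its size.
        static double editProbability(int pageSizeBytes) {
            final int LIMIT = 32 * 1024;
            if (pageSizeBytes >= LIMIT)
                return 0.05;   // over the limit: edits slow down, or the page gets split
            return 0.05 + 0.95 * pageSizeBytes / (double) LIMIT;
        }

        public static void main(String[] args) {
            int[] sizes = { 1024, 8 * 1024, 24 * 1024, 40 * 1024 };
            for (int s : sizes)
                System.out.println(s + " bytes -> p(edit) = " + editProbability(s));
        }
    }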