Here's a little side project idea for anyone interested: I often see people wanting dumps of one or two articles from a Wikipedia wiki (most commonly en). They usually have to resort to screen scraping via third-party tools, which isn't very nice and is sometimes even blocked (at the squid level). There is the API, but it isn't exactly the easiest system for anyone to use. What might be nice is a little tool where people can enter a few article names in a box, click a button, and have it produce static HTML dumps of the desired article(s).
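
Just to sketch the idea, the core of such a tool could be little more than a loop over the standard MediaWiki action=parse endpoint. This is only a rough illustration, not a design: the wiki URL, user agent string, and output filenames below are all placeholders, and a real tool would want error handling, rate limiting, and proper CSS.

    import json
    import urllib.parse
    import urllib.request

    API = "https://en.wikipedia.org/w/api.php"  # assumed target wiki

    def dump_article(title):
        # Ask the MediaWiki API for the rendered HTML of one article.
        params = urllib.parse.urlencode({
            "action": "parse",
            "page": title,
            "prop": "text",
            "format": "json",
            "formatversion": "2",
        })
        req = urllib.request.Request(
            f"{API}?{params}",
            headers={"User-Agent": "article-dump-sketch/0.1"},  # placeholder UA
        )
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        html = data["parse"]["text"]
        # Wrap it in a bare page and write one static file per article.
        filename = title.replace("/", "_") + ".html"
        with open(filename, "w", encoding="utf-8") as f:
            f.write(f"<html><head><meta charset='utf-8'><title>{title}</title>"
                    f"</head><body>{html}</body></html>")
        return filename

    if __name__ == "__main__":
        for name in ["Example", "Wikipedia"]:
            print("wrote", dump_article(name))

The web frontend would then just be a text box that feeds the titles into something like the above and hands back the generated files.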