> By the way, to generate the names of all the pages of a site, one
> needs some kind of recursive generation:
RK> Currently, no. It's probably technically possible, but you'd still get
RK> your pages ordered by namespace first, then title (i.e. the same order
RK> as with separate requests), and paging through multiple namespaces screws
RK> up all kinds of wonderful features like apprefix= and apfrom=.
That would be fine, since we would only be feeding the result into,
e.g., Special:Import (where order doesn't matter) on another computer.
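To make the idea concrete, here is a minimal sketch of paging through
list=allpages one namespace at a time, following the API's continuation
values until every title has been seen. The parameter names (apnamespace,
aplimit, apcontinue) are from the allpages module; the endpoint URL is a
placeholder, and the helper names are mine, not anything from the API.

```python
import json
from urllib.request import urlopen
from urllib.parse import urlencode

# Hypothetical wiki endpoint -- substitute the real api.php URL.
API = "https://example.org/w/api.php"

def all_titles(fetch, namespace=0, limit=500):
    """Yield every page title in one namespace, following continuation.

    `fetch` takes a dict of query parameters and returns the decoded
    JSON response, so the paging logic can be tested without a network.
    """
    params = {"action": "query", "list": "allpages",
              "apnamespace": namespace, "aplimit": limit,
              "format": "json"}
    while True:
        data = fetch(params)
        for page in data["query"]["allpages"]:
            yield page["title"]
        cont = data.get("continue")
        if cont is None:
            break
        params.update(cont)  # carries apcontinue into the next request

def http_fetch(params):
    """Fetch one batch from the live API (not used in the test below)."""
    with urlopen(API + "?" + urlencode(params)) as resp:
        return json.load(resp)
```

The resulting title list could then be pasted into Special:Export (or a
script could loop over namespaces and concatenate the batches), which is
exactly the "same order as with separate requests" the quoted reply
describes.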
Anyway, the whole impetus here is that one sees all the free-content
copyright banners and thinks, "OK, if it is so free, let me have a hunk of
the raw data so I can try making a wiki like yours too," etc.
Only to find Special:Export and its tiny ticket window of data export:
"Yes sir, which page would you like to export?" "What, you want more
than one page? Well, you'll have to list them one by one. No, you can't
just 'have them all'."
Also, for small wikis that might drop dead one day, loyal followers could
keep a backup without anybody having to set up anything beyond plain
vanilla MediaWiki.
Anyway, an import feature would also be an answer to an earlier post:
Michael Dale: api file uploading
P.S. Of course I favor limits. Any "Export this whole site!" button
needs to note "must change &limit= by hand, max=... sorry".