Max Semenik writes:
> I've got some bot-related questions:
> 1) What is the optimal request size for getting the content of a large
> list? E.g., is calling api.php?action=query&list=categorymembers 10
> times with cmlimit=10 more server-hoggy than calling it once with
> cmlimit=100?
Use cmlimit=max: one large request is cheaper for the servers than many small ones.
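As a rough sketch of what limit=max plus result continuation looks like in practice (Python; the fetch callable is injected so the pagination logic can be shown without a live wiki — it would normally wrap an HTTP GET against api.php):

```python
def category_members(fetch, title):
    """Yield all members of a category, following API continuation.

    `fetch` is a callable taking a params dict and returning the
    decoded JSON response; injecting it keeps this sketch testable
    offline. Parameter names follow the categorymembers API.
    """
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": title,
        "cmlimit": "max",   # let the server pick the largest allowed batch
        "format": "json",
    }
    while True:
        data = fetch(params)
        yield from data["query"]["categorymembers"]
        cont = data.get("continue")
        if not cont:
            break
        params.update(cont)  # carry the continuation token into the next call
```

With cmlimit=max the server caps the batch size itself (e.g. 500 for normal users, 5000 for bots), so the loop makes as few requests as the wiki allows.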
> 2) Should bots make delays between such calls?
I don't think so.
> 3) How CPU-expensive is editing via https://secure.wikimedia.org ?
> Would it be wise to allow people to use AWB via HTTPS?
Why do you need it? You can log in over it, but of course this would be
very CPU-expensive for both client and server.
> 4) How severe can the strain put on the servers be when retrieving
> the contents of a huge category with its subcategories?
Not very severe, I think. But you can test it yourself: install
MediaWiki, create many pages and categories with a bot, and then
retrieve their contents with profiling enabled.
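A minimal sketch of such a test client for question 4, walking a category and its subcategories recursively (Python; the endpoint, category names, and the injected fetch callable are illustrative assumptions, so the traversal can be exercised without a live wiki):

```python
def walk_category(fetch, title, seen=None):
    """Depth-first walk: yield page titles in `title` and in all
    of its subcategories.

    `fetch(params)` returns decoded JSON for one api.php request.
    The `seen` set guards against category cycles, which do occur
    on real wikis.
    """
    seen = set() if seen is None else seen
    if title in seen:
        return
    seen.add(title)
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": title,
        "cmlimit": "max",
        "format": "json",
    }
    while True:
        data = fetch(params)
        for m in data["query"]["categorymembers"]:
            if m["title"].startswith("Category:"):
                # recurse into the subcategory
                yield from walk_category(fetch, m["title"], seen)
            else:
                yield m["title"]
        cont = data.get("continue")
        if not cont:
            break
        params.update(cont)  # follow continuation within one category
```

Running this against a test wiki with MediaWiki's profiling enabled would show how the cost grows with the number of pages and nesting depth.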
--VasilievVV