Hi!
> 1) What is the optimal request size for getting the contents of a large
> list, e.g. is calling api.php?action=query&list=categorymembers 10
> times with cmlimit=10 more server-hoggy than calling it once with
> cmlimit=100?
If you really are going to fetch them all, it's better to make a few big
calls (like, 1000 or even 10000, avoid 100000 though ;-)
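
For illustration, a minimal sketch of batched retrieval with
continuation, assuming the standard MediaWiki api.php and the modern
"continue" protocol (the endpoint URL, category name, and helper name
here are my own, not from this mail):

    import requests

    API = "https://en.wikipedia.org/w/api.php"  # example endpoint

    def category_members(category, limit=500):
        # One big call per batch instead of many tiny ones.
        params = {
            "action": "query",
            "list": "categorymembers",
            "cmtitle": category,
            "cmlimit": limit,
            "format": "json",
        }
        while True:
            data = requests.get(API, params=params).json()
            for member in data["query"]["categorymembers"]:
                yield member["title"]
            if "continue" not in data:
                break
            params.update(data["continue"])  # carry cmcontinue forward

    for title in category_members("Category:Physics"):
        print(title)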
> 2) Should bots make delays between such calls?
If they would otherwise fire one call right after another, then sure,
they should. But do note that you have to process the data too. And if
you don't do anything with it, don't call at all. And for really huge
batch jobs, there might be table dumps somewhere out there.
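
A minimal sketch of a polite request loop; maxlag is a standard
MediaWiki API parameter, while the delay value and function name are
arbitrary choices of mine:

    import time
    import requests

    API = "https://en.wikipedia.org/w/api.php"  # example endpoint

    def polite_get(params, delay=1.0):
        # Ask the servers to refuse us when replication lag is high,
        # and pause between successive calls either way.
        params = dict(params, maxlag=5, format="json")
        while True:
            data = requests.get(API, params=params).json()
            if "error" in data and data["error"].get("code") == "maxlag":
                time.sleep(5)      # servers are lagged; back off and retry
                continue
            time.sleep(delay)      # fixed pause before the next call
            return data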
> 3) How CPU-expensive is editing via https://secure.wikimedia.org ?
> Would it be wise to allow people to use AWB via HTTPS?
secure.wikimedia.org is handled by a single box and is generally a
hack. It should not ever be treated as a stable, well-maintained and
supported entry point into Wikipedia.
> 4) How severe a strain can retrieving the contents of a huge category
> with its subcategories put on the servers?
You might be killed at this point.
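
For the curious, a minimal sketch of what such a traversal looks like,
with a depth cap and a visited set so the recursion can't run away
(building on the hypothetical category_members() generator sketched
above):

    def walk_category(category, depth=2, seen=None):
        # Depth-capped, deduplicated walk over a category tree; without
        # both limits, this is exactly the kind of load that gets killed.
        seen = set() if seen is None else seen
        if depth < 0 or category in seen:
            return
        seen.add(category)
        for title in category_members(category):
            if title.startswith("Category:"):
                walk_category(title, depth - 1, seen)
            else:
                print(title)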
--
Domas Mituzas -- http://dammit.lt/ -- [[user:midom]]