I've got some bot-related questions:
1) What is the optimal request size for getting the contents of a large list? E.g., does calling api.php?action=query&list=categorymembers 10 times with cmlimit=10 put more load on the servers than calling it once with cmlimit=100?
2) Should bots pause between such calls?
3) How CPU-expensive is editing via https://secure.wikimedia.org ? Would it be wise to allow people to use AWB via HTTPS?
4) How severe is the strain put on the servers by retrieving the contents of a huge category along with its subcategories?
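For context on questions 1 and 2, here is a minimal sketch of the kind of loop a bot would run: it requests category members in one large batch and follows the API's continuation tokens, sleeping between continuation requests. The endpoint URL, the limit of 500, and the 1-second delay are illustrative assumptions, not recommendations from anyone; the fetch step is a pluggable callable so the network layer can be swapped or stubbed.

```python
import json
import time
from urllib.parse import urlencode
from urllib.request import urlopen

# Assumed endpoint for illustration only.
API = "https://en.wikipedia.org/w/api.php"

def category_members(category, fetch, limit=500, delay=1.0):
    """Yield member titles of `category`, following API continuation.

    `fetch` is any callable taking a URL and returning the decoded JSON
    dict, so the HTTP layer can be replaced in tests.  The idea behind
    the question above: one request with cmlimit=500 should cost the
    servers less than many small ones, and sleeping `delay` seconds
    between continuation requests keeps the bot polite.
    """
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmlimit": limit,
        "format": "json",
    }
    while True:
        data = fetch(API + "?" + urlencode(params))
        for member in data["query"]["categorymembers"]:
            yield member["title"]
        cont = data.get("continue")  # modern-style continuation block
        if not cont:
            break
        params.update(cont)  # carry cmcontinue into the next request
        time.sleep(delay)    # pause between continuation calls

def http_fetch(url):
    """A minimal real fetcher using the standard library."""
    with urlopen(url) as resp:
        return json.load(resp)
```

Walking a huge category tree (question 4) would repeat this loop once per subcategory, which is exactly why the cost of many sequential requests matters.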
wikitech-l@lists.wikimedia.org