Mostly for the sake of the archives, I ended up with this, which is pretty much straight out of the example in the Python library manual:
from concurrent.futures import ThreadPoolExecutor, as_completed

def _is_approved(nom):
    return nom.is_approved()

approved_noms = []
unapproved_noms = []

with ThreadPoolExecutor(max_workers=10) as executor:
    future_to_nom = {
        executor.submit(_is_approved, nom): nom for nom in nom_list.nominations()
    }
    for future in as_completed(future_to_nom):
        nom = future_to_nom[future]
        if future.result():
            approved_noms.append(nom)
        else:
            unapproved_noms.append(nom)
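The same pattern can be exercised without pywikibot by substituting a dummy nomination class; FakeNom below is purely hypothetical, standing in for whatever nom_list.nominations() actually returns:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Hypothetical stand-in for a pywikibot nomination object; the only
# contract assumed is an is_approved() method.
class FakeNom:
    def __init__(self, name, approved):
        self.name = name
        self._approved = approved

    def is_approved(self):
        return self._approved

noms = [FakeNom("a", True), FakeNom("b", False), FakeNom("c", True)]
approved_noms, unapproved_noms = [], []

with ThreadPoolExecutor(max_workers=10) as executor:
    # Map each future back to the nomination it was submitted for,
    # so results can be matched up as they complete (in any order).
    future_to_nom = {executor.submit(n.is_approved): n for n in noms}
    for future in as_completed(future_to_nom):
        nom = future_to_nom[future]
        if future.result():
            approved_noms.append(nom)
        else:
            unapproved_noms.append(nom)

print(sorted(n.name for n in approved_noms))  # prints ['a', 'c']
```

Since as_completed yields futures in completion order, the lists come out in a nondeterministic order; sort them if order matters.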
One of the nagging questions in my mind as I was exploring this was whether APISite is thread-safe. I haven't found anything in the pywikibot docs that says one way or the other, but apparently it is.
I don't have a good feel for what max_workers should be. For what I'm doing, 10 seems to work well, taking about 2-3 seconds to process 70 nominations; the largest number of nominations I would ever expect to see is around 200. According to the library docs, max_workers defaults to 5 * number_of_processors. I don't actually have a clue what that works out to on a Toolforge k8s instance, nor do I have any idea how the production enwiki would like it if I threw 100 parallel API requests at it all at once. So for now, I'll just hardwire it to 10.
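One way to hedge that guess is to cap the pool size by both a fixed ceiling and the actual task count, so a short nomination list doesn't spin up idle threads. worker_count is a hypothetical helper of my own, not anything from concurrent.futures:

```python
# Sketch (my own helper, not from the library): bound the worker count
# by a fixed cap and by the number of tasks actually queued.
def worker_count(n_tasks, cap=10):
    return max(1, min(cap, n_tasks))

worker_count(70)   # 10 -- matches the hardwired value for ~70 nominations
worker_count(3)    # 3  -- no point starting 10 threads for 3 tasks
```

The cap of 10 here is just the same hardwired guess as above; the helper only keeps it from being an overestimate for small batches.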