Hi, in the new rewrite branch, what is the best way to do the following?

* Get all links from a set of pages; if some of the pages are redirects, resolve them to their targets first. Only the link ns + titles are needed; there is no need to check existence, pageid, etc. All of this is one API call:

http://en.wikipedia.org/w/api.php?action=query&prop=links&titles=Archiver|Abstract%20(law)&pllimit=300&redirects
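For reference, here is a minimal plain-Python sketch of that call: building the parameter dict and pulling the link ns + titles out of the response. This is not pywikibot rewrite code, and the sample response below is hand-made for illustration (the real API keys `query`, `redirects`, `pages`, `links` are as documented, but the page data is invented):

```python
def build_links_query(titles):
    """Parameters for one prop=links query with redirect resolution."""
    return {
        "action": "query",
        "prop": "links",
        "titles": "|".join(titles),
        "pllimit": "300",
        "redirects": "",  # the mere presence of this key enables redirect resolution
        "format": "json",
    }

def parse_links(response):
    """Return {resolved page title: [(ns, link title), ...]} from a query response."""
    result = {}
    for page in response["query"]["pages"].values():
        result[page["title"]] = [(l["ns"], l["title"])
                                 for l in page.get("links", [])]
    return result

# Hand-made sample: "Archiver" redirects to "Archive", which has two links.
sample = {
    "query": {
        "redirects": [{"from": "Archiver", "to": "Archive"}],
        "pages": {
            "1234": {
                "pageid": 1234, "ns": 0, "title": "Archive",
                "links": [{"ns": 0, "title": "Backup"},
                          {"ns": 14, "title": "Category:Records"}],
            }
        },
    }
}

links = parse_links(sample)
print(links["Archive"])  # -> [(0, 'Backup'), (14, 'Category:Records')]
```

Note that only `page["title"]` and the `links` list are read; pageid and existence are ignored, matching the "titles only" requirement.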

* Get all links and categories from the result of a generator or from a list of titles (read from a file). Similar to the above, except that the titles may come from an optional generator and there is no redirects param. Once a bad page is found, I will load its content and fix it. The link and category names are needed only as text strings.
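A sketch of the second scenario in the same plain-Python style, again hedged: the parameter names (`prop=links|categories`, `cllimit`, `generator`) are standard MediaWiki API, but the helper names and the sample response are invented for illustration:

```python
def build_links_cats_query(titles=None, generator=None):
    """Parameters for one query fetching links and categories; no redirects key."""
    params = {
        "action": "query",
        "prop": "links|categories",
        "pllimit": "300",
        "cllimit": "300",
        "format": "json",
    }
    if generator:
        params["generator"] = generator  # e.g. "allpages"
    if titles:
        params["titles"] = "|".join(titles)  # e.g. titles read from a file
    return params

def extract_names(response):
    """Return {page title: {"links": [...], "categories": [...]}} as plain strings."""
    out = {}
    for page in response["query"]["pages"].values():
        out[page["title"]] = {
            "links": [l["title"] for l in page.get("links", [])],
            "categories": [c["title"] for c in page.get("categories", [])],
        }
    return out

# Hand-made sample response for one page.
sample = {
    "query": {
        "pages": {
            "10": {
                "pageid": 10, "ns": 0, "title": "Example",
                "links": [{"ns": 0, "title": "Other page"}],
                "categories": [{"ns": 14, "title": "Category:Examples"}],
            }
        }
    }
}

names = extract_names(sample)
print(names["Example"]["categories"])  # -> ['Category:Examples']
```

Scanning the returned strings is then enough to spot a "bad" page before loading its wikitext for the fix.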

Thanks!

P.S. I have started http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Recipes, which should list all encountered bot scenarios and how the various versions of pywiki can be used to solve them. Please help by adding your bot's core workflow to that list, so that core developers can either suggest better code or alter the pywiki framework to better handle such cases.