Hi pywikibot developers,

As recommended at https://www.mediawiki.org/wiki/Manual:Pywikibot/Development#Development, I wanted to write to say hello. I'm a graduate student at UC Berkeley working on Internet privacy.

I'm interested in using pywikibot to automatically make archives of certain external URLs that are included in a private MediaWiki instance we're using as a repository of educational resources. I'd like to automatically check the external URLs provided by users/curators of our wiki and create permanent external archives (through perma.cc or similar services), so that these annotated resources stay useful even as URLs age and expire. To start with, I'm looking at weblinkchecker.py to see how I might modify or extend it; a very rough sketch of what I have in mind is below. If some of this functionality already exists, or I should look elsewhere, I would be very interested in pointers or recommendations. Similarly, if there are others who would find a link-archiving script useful, I'd love to collaborate or understand others' use cases.
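
To make that concrete, here is a minimal sketch of the kind of script I'm imagining. It only uses pywikibot calls I believe exist (Site, allpages, extlinks, output), and it assumes the Internet Archive's "Save Page Now" endpoint (https://web.archive.org/save/) as a placeholder for perma.cc, which would need an API key. It's meant to illustrate the workflow, not to be a finished implementation:

"""Sketch: archive external links found on wiki pages.

Assumes pywikibot is already configured for our private wiki
(user-config.py) and uses the Internet Archive's Save Page Now
endpoint as a stand-in for perma.cc.
"""
import requests
import pywikibot

# Assumption: Wayback Machine "Save Page Now"; perma.cc would differ.
SAVE_ENDPOINT = 'https://web.archive.org/save/'


def archive_url(url):
    """Ask the archive service to capture a snapshot of url."""
    response = requests.get(SAVE_ENDPOINT + url, timeout=60)
    response.raise_for_status()
    # The Wayback Machine redirects to the snapshot location.
    return response.url


def main():
    site = pywikibot.Site()  # the wiki configured in user-config.py
    for page in site.allpages(namespace=0):
        for url in page.extlinks():
            try:
                archived = archive_url(url)
                pywikibot.output('%s -> %s' % (url, archived))
            except requests.RequestException as error:
                pywikibot.output('Could not archive %s: %s' % (url, error))


if __name__ == '__main__':
    main()

A real version would also need to record the archived URL back onto the wiki page (or in a log) and avoid re-archiving links it has already handled, which is where weblinkchecker.py's existing bookkeeping looks relevant.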

Thanks,
Nick Doty
UC Berkeley, School of Information
https://npdoty.name