The only bot I know of that does this today is InternetArchiveBot (https://github.com/cyberpower678/Cyberbot_II/tree/master/IABot), which is written in PHP but might be a good source of ideas.
In general, a reference-archiving bot based on pywikibot feels like something that could definitely be useful.
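The core loop of such a bot could look roughly like the sketch below. It's only a sketch: it assumes a working user-config.py for the target wiki, and it uses the Wayback Machine's Save Page Now endpoint as a stand-in for whatever archiving backend you actually pick.

    # Rough sketch of the core loop of a pywikibot-based reference-archiving
    # bot.  Assumes a configured user-config.py; the Wayback Machine's
    # "Save Page Now" endpoint is only an example archiving backend.
    import requests

    import pywikibot
    from pywikibot import pagegenerators


    def archive_url(url, timeout=60):
        """Ask the Wayback Machine to snapshot one URL; return the snapshot URL."""
        resp = requests.get('https://web.archive.org/save/' + url, timeout=timeout)
        resp.raise_for_status()
        # After following redirects this is normally the snapshot location.
        return resp.url


    def main():
        site = pywikibot.Site()
        for page in pagegenerators.AllpagesPageGenerator(site=site):
            for url in page.extlinks():
                try:
                    pywikibot.output('%s -> %s' % (url, archive_url(url)))
                except requests.RequestException as err:
                    pywikibot.warning('Could not archive %s: %s' % (url, err))


    if __name__ == '__main__':
        main()

In practice you would also want to throttle the archiving requests and record the resulting snapshot URLs somewhere, either back on the wiki page or in a local database.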
Cheers,
André
On 8 Aug 2017 00:19, "Nick Doty" <npdoty@ischool.berkeley.edu> wrote:
Hi pywikibot developers,
As recommended at https://www.mediawiki.org/wiki/Manual:Pywikibot/Development#Development, I wanted to write to say hello. I'm a graduate student at UC Berkeley working on Internet privacy.
I'm interested in using pywikibot to automatically make archives of certain external URLs that are included in a private MediaWiki instance we're using as a repository of educational resources. I'd like to automatically check the external URLs provided by the users/curators of our wiki and create permanent external archives (through perma.cc or similar services), so that these annotated resources remain useful even as URLs age and expire. To start, I'm looking at weblinkchecker.py to see how I might modify or extend it.

If some of this functionality already exists, or if I should look elsewhere, I would be very interested in pointers or recommendations. Similarly, if there are others who would find a link-archiving script useful, I'd love to collaborate or to understand others' use cases.
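To make this concrete, below is a rough sketch of the kind of script I have in mind. Please treat it as a sketch only: the API key is a placeholder, and the perma.cc endpoint and payload are just my reading of their developer docs, not something I have tested.

    # Sketch: archive every external link on one wiki page via perma.cc.
    # PERMA_API_KEY is a placeholder; the endpoint and payload reflect my
    # reading of the perma.cc developer docs and are untested.
    import requests

    import pywikibot

    PERMA_API_KEY = 'your-api-key-here'
    PERMA_ENDPOINT = 'https://api.perma.cc/v1/archives/'


    def archive_with_perma(url):
        """Request a permanent archive of one URL; return perma.cc's JSON record."""
        resp = requests.post(
            PERMA_ENDPOINT,
            params={'api_key': PERMA_API_KEY},
            json={'url': url},
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()


    def archive_page_links(page):
        """Archive each external link found on a wiki page."""
        for url in page.extlinks():
            try:
                record = archive_with_perma(url)
                pywikibot.output('%s archived as %s' % (url, record.get('guid')))
            except requests.RequestException as err:
                pywikibot.warning('Failed to archive %s: %s' % (url, err))


    if __name__ == '__main__':
        site = pywikibot.Site()
        archive_page_links(pywikibot.Page(site, 'Main Page'))

The idea would then be to replace the single hard-coded page with the kind of page and link traversal that weblinkchecker.py already does.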
Thanks,
Nick Doty
UC Berkeley, School of Information
https://npdoty.name