Hi, I've recently taken an interest in the Wikipedia data dumps. I'd like to download a subset of the files whenever they are updated. The Data dumps page[1] mentions a monitoring file[2], but the file doesn't contain any data (except the "wiki" object).
I did some research and found the monitor.py script, plus some info in the relevant README[3]. If I've understood it correctly, a server periodically runs monitor.py, which creates the index.json file.
Is this deployed now? Since the file exists and has (very little) content, I'd guess that monitor.py has been run at least once.
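For context, this is roughly how I was planning to poll the file once it carries real data. This is just a sketch under the assumption that index.json stays valid JSON; I don't know yet what keys monitor.py will emit, so it only reports the top-level sections:

```python
import json
import urllib.request

INDEX_URL = "https://dumps.wikimedia.org/index.json"


def top_level_sections(index_text: str) -> list[str]:
    """Return the sorted top-level keys of the index file.

    The key names (e.g. the "wiki" object currently present) depend on
    whatever monitor.py writes, so nothing more specific is assumed.
    """
    return sorted(json.loads(index_text).keys())


def fetch_index(url: str = INDEX_URL) -> str:
    """Download the raw index.json text."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return resp.read().decode("utf-8")


if __name__ == "__main__":
    print(top_level_sections(fetch_index()))
```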
[1] https://meta.wikimedia.org/wiki/Data_dumps#Monitoring_dump_generation
[2] https://dumps.wikimedia.org/index.json
[3] https://phabricator.wikimedia.org/source/operations-dumps/browse/master/xmld...
Regards,
Aron Bergman