I needed to clean up a bunch of tech debt before reworking the page content dump 'divvy up into small pieces and rerun if necessary' mechanism. Unfortunately, I cleaned up a bit too much and broke the stub and article recombine dumps in the process.
The fix has been deployed. I shot all the dump processes, marked the relevant jobs (big and huge wikis only) as failed, and tossed the bad files.
The dumps should resume from the failed steps in about an hour.
Follow along at https://phabricator.wikimedia.org/T160507 for all the gory details.
Ariel
xmldatadumps-l@lists.wikimedia.org