Russ Blau <russblau@...> writes:
Are the dump processes stuck/dead again, or is the web page just not updating?
I'm guessing it's not just the web page, since http://download.wikipedia.org/enwiki/latest/ shows a pagelinks file that isn't updating, and wget --spider http://download.wikipedia.org/enwiki/latest/enwiki-latest-pagelinks.sql.gz confirms the same.
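For anyone wanting to script this kind of check rather than eyeball the listing, here is a minimal sketch of the staleness test. The `is_stale` helper and the seven-day threshold are my own assumptions, not anything the dump infrastructure provides; the Last-Modified value would come from a HEAD request such as the `wget --spider` call above.

```python
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime

def is_stale(last_modified: str, max_age_days: int = 7) -> bool:
    """Return True if an HTTP Last-Modified timestamp is older than max_age_days.

    `last_modified` is the raw header value, e.g. from a
    `wget --spider` or HEAD request against the dump file.
    (Hypothetical helper; the threshold is an arbitrary assumption.)
    """
    modified = parsedate_to_datetime(last_modified)
    age = datetime.now(timezone.utc) - modified
    return age > timedelta(days=max_age_days)

# An old timestamp reads as stale; a future one does not.
print(is_stale("Mon, 01 Jan 2001 00:00:00 GMT"))  # → True
```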
So firstly, any chance someone could give the enwiki dump (and whichever others) a poke, restart it, or do whatever else might be required?
Secondly, would it be worth reordering the dumps within each language to run pagelinks slightly later? It appears to be the slowest of the "small" dumps (that is, everything before abstract.xml), and also the most likely to fail or hang. Perhaps it could be run last of that group, so that its delays and failures get less in the way of the others.
Thanks, Alai.
Russ Blau wrote:
Are the dump processes stuck/dead again, or is the web page just not updating?
The NFS mounts didn't quite survive the maintenance reboot of the server, so the dump processes got stuck waiting for the share to come back up.
I've remounted it; things should return to normal shortly.
-- brion
wikitech-l@lists.wikimedia.org