Dear ones,
Where might I get or mirror a dump of Commons media files?
> It seems worth mentioning on the front page of
> https://dumps.wikimedia.org/
> It looks like the compressed XML of the ~50M description pages is ~25GB.
> It looks like WikiTeam set up a dump script that posted monthly dumps to
> the Internet Archive; in 2013 it stopped including the month+year in the
> title; in 2016 it stopped altogether.
https://archive.org/details/wikimediacommons
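(In case it helps anyone trying to mirror the existing items: the archive.org advanced-search API can list everything in that collection. A minimal Python sketch below builds the query URL; the collection name is taken from the link above, and the endpoint and parameter names are my assumptions about the archive.org API, not anything from Wikimedia's dump infrastructure.)

```python
from urllib.parse import urlencode

def commons_dump_search_url(collection="wikimediacommons", rows=100, page=1):
    """Build an archive.org advancedsearch URL listing item identifiers
    in a collection (assumed API shape; verify against archive.org docs)."""
    params = {
        "q": f"collection:{collection}",  # collection from the URL above
        "fl[]": "identifier",             # only return item identifiers
        "rows": rows,
        "page": page,
        "output": "json",
    }
    return "https://archive.org/advancedsearch.php?" + urlencode(params)

print(commons_dump_search_url())
```

Fetching that URL (e.g. with urllib or requests) should return a JSON page of item identifiers, which could then be downloaded one by one.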
Hi all,
Due to low disk space on the WCQS beta instance, and possible data
corruption that may have resulted from it, we need to take down WCQS
beta and rebuild its journal. The operation should take about two days,
starting now. I'm sorry for any inconvenience this may cause.
This is a recurring issue, but we have a permanent solution in the pipeline,
which we plan to start working on soon: [1].
[1] https://phabricator.wikimedia.org/T262265
--
Zbyszko Papierski (He/Him)
Senior Software Engineer
Wikimedia Foundation <https://wikimediafoundation.org/>