Dear ones,
Where might I get or mirror a dump of Commons media files?
> It seems worth mentioning on the front page of
> https://dumps.wikimedia.org/
> It looks like the compressed XML of the ~50M description pages is ~25GB.
> It looks like WikiTeam set up a dump script that posted monthly dumps to
> the Internet Archive; in 2013 it stopped including the month+year in the
> title; in 2016 it stopped altogether.
> https://archive.org/details/wikimediacommons
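
For anyone who mainly wants those description pages, here is a rough
sketch of mirroring that XML dump from Python. The "latest" directory
and file name below are my assumptions from the usual
dumps.wikimedia.org layout, so check the index page first:

# Sketch: mirror the latest Commons description-page dump (pages-articles XML).
# The URL is an assumption based on the usual dumps.wikimedia.org layout;
# verify it on https://dumps.wikimedia.org/commonswiki/ before relying on it.
import shutil
import urllib.request

DUMP_URL = (
    "https://dumps.wikimedia.org/commonswiki/latest/"
    "commonswiki-latest-pages-articles.xml.bz2"  # assumed name, ~25GB compressed
)

def mirror_dump(dest_path: str) -> None:
    """Stream the compressed dump to disk without loading it all into memory."""
    with urllib.request.urlopen(DUMP_URL) as response, open(dest_path, "wb") as out:
        shutil.copyfileobj(response, out, length=1024 * 1024)  # 1 MiB chunks

if __name__ == "__main__":
    mirror_dump("commonswiki-latest-pages-articles.xml.bz2")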
Hi,
We have an issue with data missing from WCQS. We are investigating it
right now.
Regards,
Zbyszko
--
Zbyszko Papierski (He/Him)
Senior Software Engineer
Wikimedia Foundation <https://wikimediafoundation.org/>
Hi there!
We have a small update on the state of the WCQS beta:
1) We added support for SDoC-based prefixes. You can use prefixes like
sdc: in queries, and they are also used when displaying results. Note
that at the moment it is not possible to pin them to the query directly
from the GUI. For more details see [1]. Also note that autocomplete for
non-Wikidata items is still not working. A small sketch of using these
prefixes from a script follows the list below.
2) Data reload has been automated and will happen around 9 AM UTC every
Tuesday - the actual time of the update depends on the preceding data
munging steps, which do not block the service. The reload itself
currently takes about 3-4 hours, during which the service is taken down
and a maintenance page is shown. As we mentioned before, this kind of
data reload is temporary and will be replaced when we go into
production.
3) We want to focus next on tasks that will help us decide how to move the
service into production.
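
Here is a minimal sketch of running such a query from a script. The
endpoint URL is my assumption for the beta, and the beta sits behind a
Wikimedia login, so a real client also needs a valid session; the query
just lists a few files with a "depicts" (P180) statement, with ?file
coming back as sdc: (Commons MediaInfo) entities:

# Minimal sketch. Assumptions: the beta SPARQL endpoint lives at
# https://wcqs-beta.wmflabs.org/sparql and the client is already
# authenticated, since the beta sits behind a Wikimedia login.
import json
import urllib.parse
import urllib.request

WCQS_ENDPOINT = "https://wcqs-beta.wmflabs.org/sparql"  # assumed beta endpoint

# A few files carrying a "depicts" (P180) statement; ?file comes back as
# sdc: (Commons MediaInfo) entities, which the GUI now also displays with
# the sdc: prefix. A specific item could be written directly in the query
# as e.g. sdc:M12345 (hypothetical id).
QUERY = """
SELECT ?file ?depicts WHERE {
  ?file wdt:P180 ?depicts .
}
LIMIT 10
"""

def run_query(query: str) -> dict:
    """POST a SPARQL query and return the parsed JSON results."""
    data = urllib.parse.urlencode({"query": query}).encode()
    request = urllib.request.Request(
        WCQS_ENDPOINT,
        data=data,
        headers={
            "Accept": "application/sparql-results+json",
            "User-Agent": "wcqs-prefix-example/0.1 (sketch)",
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

if __name__ == "__main__":
    for row in run_query(QUERY)["results"]["bindings"]:
        print(row["file"]["value"], row["depicts"]["value"])
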
A reminder - if you find any bugs or issues, don't hesitate to file a
ticket on Phabricator with the wikidata-query-service tag.
Have fun!
Zbyszko
[1] https://phabricator.wikimedia.org/T258625
--
Zbyszko Papierski (He/Him)
Senior Software Engineer
Wikimedia Foundation <https://wikimediafoundation.org/>