Hi folks,
is anyone using the Wikidata entity dump dcatap.rdf at https://dumps.wikimedia.org/wikidatawiki/entities/dcatap.rdf?
It is very rarely used and is thus probably causing us an undue maintenance burden, which is why we plan to remove it.
If anyone is making use of it, please speak up so that we can keep it or find a viable alternative.
Cheers, Marius
If it is used (although rarely) and you are not sure whether it is actually causing you an undue burden, why remove a metadata description for linked data that is recommended by the EU?
Best regards, Jan Ainali http://ainali.com
2017-09-27 12:04 GMT+02:00 Marius Hoch marius.hoch@wikimedia.de:
Hi folks,
is anyone using the Wikidata entity dump dcatap.rdf at https://dumps.wikimedia.org/wikidatawiki/entities/dcatap.rdf?
It is very rarely used and is thus probably causing us an undue maintenance burden, which is why we plan to remove it.
If anyone is making use of it, please speak up so that we can keep it or find a viable alternative.
Cheers, Marius
Hi!
is anyone using the Wikidata entity dump dcatap.rdf at https://dumps.wikimedia.org/wikidatawiki/entities/dcatap.rdf?
It is very rarely used and is thus probably causing us an undue maintenance burden, which is why we plan to remove it.
What's the issue with it? I don't use it, but it seems to be part of a standard for dataset descriptions, so I wonder if the issues can be fixed. I don't know too much about it, but from the description it seems to be very automatable.
Marius Hoch asked last month:
is anyone using the Wikidata entity dump dcatap.rdf at https://dumps.wikimedia.org/wikidatawiki/entities/dcatap.rdf?
It is very rarely used and is thus probably causing us an undue maintenance burden, which is why we plan to remove it.
If anyone is making use of it, please speak up so that we can keep it or find a viable alternative.
Well, it is poorly documented; maybe this is a reason for its low use? It should be mentioned at:
https://www.wikidata.org/wiki/Wikidata:Database_download
I just extended the DCAT-AP item and did some digging into the DCAT-AP community: http://www.wikidata.org/entity/Q28600460
DCAT-AP is an application profile (aka "best practice") of the W3C Data Catalog Vocabulary, so it is not there only to make European data catalog registries happy. The main use case is going to be search across multiple open data catalogs (similar to https://datahub.io/ / CKAN, but metadata only), but for now it seems to be used mostly to submit data to local registries in more than 15 European countries (see this presentation from last year: https://www.w3.org/2016/11/sdsvoc/brecht).
Anyway, it's a good idea to describe Wikidata dumps in machine-readable form, so why not stick with the current standard?
How about adding the RDF to query.wikidata.org so we can get a current list?
PREFIX dcat: <http://www.w3.org/ns/dcat#>
PREFIX dct: <http://purl.org/dc/terms/>

SELECT ?url ?date ?size WHERE {
  <https://www.wikidata.org/about#catalog> dcat:dataset ?dump .
  ?dump dcat:distribution [
    dct:format "application/json" ;
    dcat:downloadURL ?url ;
    dcat:issued ?date ;
    dcat:byteSize ?size
  ] .
}
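Even without loading the file into a SPARQL endpoint, the same metadata can be pulled out client-side. Here is a minimal sketch using only the Python standard library; the embedded RDF/XML sample is a hypothetical, trimmed-down illustration of the DCAT structure, not the actual contents of dcatap.rdf (which may use rdf:Description nodes and additional properties):

```python
# Sketch: extract download URLs for a given media type from a DCAT-AP
# RDF/XML document, using only the Python standard library.
# SAMPLE is an illustrative fragment, not the real dcatap.rdf.
import xml.etree.ElementTree as ET

NS = {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "dcat": "http://www.w3.org/ns/dcat#",
    "dct": "http://purl.org/dc/terms/",
}

SAMPLE = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dcat="http://www.w3.org/ns/dcat#"
         xmlns:dct="http://purl.org/dc/terms/">
  <dcat:Distribution>
    <dct:format>application/json</dct:format>
    <dcat:downloadURL rdf:resource="https://dumps.wikimedia.org/example-all.json.gz"/>
    <dcat:byteSize>123456789</dcat:byteSize>
  </dcat:Distribution>
</rdf:RDF>
"""

def download_urls(rdf_xml, fmt="application/json"):
    """Return dcat:downloadURL values of distributions matching fmt."""
    root = ET.fromstring(rdf_xml)
    urls = []
    # Iterate over all dcat:Distribution nodes at any depth.
    for dist in root.iter("{http://www.w3.org/ns/dcat#}Distribution"):
        format_el = dist.find("dct:format", NS)
        if format_el is not None and format_el.text == fmt:
            url_el = dist.find("dcat:downloadURL", NS)
            if url_el is not None:
                # The URL lives in the rdf:resource attribute.
                urls.append(url_el.get(
                    "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}resource"))
    return urls

print(download_urls(SAMPLE))
```

In practice one would fetch https://dumps.wikimedia.org/wikidatawiki/entities/dcatap.rdf and feed its text to the same function.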
I created a Phabricator issue for this request: https://phabricator.wikimedia.org/T178978
Cheers, Jakob
Hi!
How about adding the RDF to query.wikidata.org so we can get a current list?
We could probably load the RDF we have now into Blazegraph relatively easily. Updating may be a bit tricky (should we delete historical items?), but it's possible to figure out. I'll look into it.