Dear Laura, others,
If somebody points me to the RDF data dump of Wikidata, I can deliver an
HDT version of it, no problem. (Given the current cost of memory, I do
not believe that the memory consumption of HDT creation is a blocker.)
This would be awesome! Thanks Wouter. To the best of my knowledge, the most up-to-date
dump is this one [1]. Let me know if you need any help with anything. Thank you again!
[1]
https://dumps.wikimedia.org/wikidatawiki/entities/latest-all.ttl.gz
---
Cheers,
Wouter Beek.
Email: wouter(a)triply.cc
WWW:
http://triply.cc
Tel: +31647674624
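
(For anyone following along who wants to attempt the conversion themselves, here is a
minimal sketch of the step Wouter is offering to run, assuming the hdt-java library
(org.rdfhdt.hdt), that the Turtle dump has been decompressed locally first, and that the
build supports Turtle input via RDFNotation.TURTLE; the class name, file names, and base
URI are illustrative only, not a confirmed recipe:

    import org.rdfhdt.hdt.enums.RDFNotation;
    import org.rdfhdt.hdt.hdt.HDT;
    import org.rdfhdt.hdt.hdt.HDTManager;
    import org.rdfhdt.hdt.options.HDTSpecification;

    public class WikidataToHdt {  // illustrative class name
        public static void main(String[] args) throws Exception {
            // One-off conversion; this in-memory build is the step that
            // needs the large amount of RAM discussed in this thread.
            try (HDT hdt = HDTManager.generateHDT(
                    "latest-all.ttl",            // decompressed Turtle dump (assumed path)
                    "http://www.wikidata.org/",  // base URI for any relative IRIs
                    RDFNotation.TURTLE,
                    new HDTSpecification(),      // default build options
                    null)) {                     // no progress listener
                hdt.saveToHDT("wikidata.hdt", null);  // write the result to disk
            }
        }
    }

The same library ships a rdf2hdt command-line wrapper around this API, which is the tool
Laura mentions below.)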
On Fri, Oct 27, 2017 at 5:08 PM, Laura Morales <lauretas(a)mail.com> wrote:
Hello everyone,
I'd like to ask if Wikidata could please offer an HDT [1] dump along with the already
available Turtle dump [2]. HDT is a binary format for storing RDF data, which is pretty
useful because it can be queried from the command line, it can be used as a Jena/Fuseki
source, and it also uses orders of magnitude less space to store the same data. The
problem is that generating an HDT file is very impractical, because the current
implementation requires a lot of RAM to convert a file. For Wikidata it will probably
require a machine with 100-200 GB of RAM. This is unfeasible for me because I don't
have such a machine, but if you guys have one to share, I can help set up the rdf2hdt
software required to convert the Wikidata Turtle dump to HDT.
Thank you.
[1]
http://www.rdfhdt.org/
[2]
https://dumps.wikimedia.org/wikidatawiki/entities/
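
(A minimal sketch of the usage Laura describes, i.e. triple-pattern queries against the
file and exposing it as a Jena model that a Fuseki endpoint could serve, assuming the
hdt-java and hdt-jena libraries and an already converted wikidata.hdt file; the class
name, file name, and predicate are illustrative only:

    import org.apache.jena.rdf.model.Model;
    import org.apache.jena.rdf.model.ModelFactory;
    import org.rdfhdt.hdt.hdt.HDT;
    import org.rdfhdt.hdt.hdt.HDTManager;
    import org.rdfhdt.hdt.triples.IteratorTripleString;
    import org.rdfhdt.hdtjena.HDTGraph;

    public class QueryHdt {  // illustrative class name
        public static void main(String[] args) throws Exception {
            // Memory-map the file instead of loading it fully, so
            // querying needs far less RAM than creating it did.
            HDT hdt = HDTManager.mapHDT("wikidata.hdt", null);

            // Triple-pattern lookup; empty strings act as wildcards.
            IteratorTripleString it =
                    hdt.search("", "http://schema.org/description", "");
            for (int i = 0; i < 10 && it.hasNext(); i++) {
                System.out.println(it.next());  // print a few sample triples
            }

            // Wrap the same file as a read-only Jena model; this is the
            // kind of graph a Fuseki instance can serve for SPARQL.
            Model model = ModelFactory.createModelForGraph(new HDTGraph(hdt));
            System.out.println(model.size() + " triples");
        }
    }

This is what makes the format attractive: the expensive step is the one-time
conversion, after which the single .hdt file is directly queryable.)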
_______________________________________________
Wikidata mailing list
Wikidata(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata