* T H A N K Y O U *
On 7 Nov I created an HDT file based on the then-current download link from
https://dumps.wikimedia.org/wikidatawiki/entities/latest-all.ttl.gz
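(For context on the conversion itself: it is typically done with the rdf2hdt
tool from hdt-cpp. Below is a minimal sketch driving it from Python; the file
names and flags are assumptions for illustration, not necessarily what was
actually run.)

    # Hedged sketch of the dump-to-HDT conversion step. Assumes the rdf2hdt
    # binary from hdt-cpp is on PATH and that the dump has already been
    # downloaded and gunzipped; names and flags are illustrative.
    import subprocess

    DUMP_TTL = "latest-all.ttl"   # gunzipped copy of the Wikidata dump
    OUTPUT_HDT = "wikidata.hdt"

    # rdf2hdt parses the serialized RDF and writes a compressed, indexed
    # HDT file; "-f turtle" selects the input serialization.
    subprocess.run(["rdf2hdt", "-f", "turtle", DUMP_TT L, OUTPUT_HDT],
                   check=True)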
Thank you very, very much, Wouter!! This is great!
Out of curiosity, could you please share some info about the machine you used
to generate these files? In particular, hardware details such as the model
names of the mobo/CPU/RAM/disks. How long it took to generate these files
would also be interesting to know.
PS: If this resource turns out to be useful to the community, we can offer an
updated HDT file at an interval to be determined.
This would be fantastic! Wikidata publishes dumps about once a week, so I
think even a new HDT file every 1-2 months would be awesome.
Related to this, however: why not use the Laundromat for this? There are
several very large datasets, and rdf2hdt is really expensive to run. Maybe
you could schedule regular jobs for several graphs (wikidata, dbpedia,
wordnet, linkedgeodata, government data, ...) and make them available at the
Laundromat?
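To make that suggestion concrete, here is a rough sketch of such a recurring
job, reusing the same conversion step as above. The dataset list, URLs, and
exact commands are assumptions for illustration; in practice this would run
under cron or a similar scheduler.

    # Rough sketch of a recurring HDT-generation job for several graphs.
    # URLs and commands are illustrative assumptions, not a real pipeline.
    import subprocess

    DATASETS = {
        "wikidata": "https://dumps.wikimedia.org/wikidatawiki/entities/latest-all.ttl.gz",
        # "dbpedia": "...", "wordnet": "...", "linkedgeodata": "...", etc.
    }

    def refresh(name, url):
        ttl_gz, ttl, hdt = name + ".ttl.gz", name + ".ttl", name + ".hdt"
        subprocess.run(["wget", "-O", ttl_gz, url], check=True)  # fetch dump
        subprocess.run(["gunzip", "-f", ttl_gz], check=True)     # decompress
        subprocess.run(["rdf2hdt", "-f", "turtle", ttl, hdt], check=True)

    for name, url in DATASETS.items():
        refresh(name, url)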