A new dump of Wikidata in HDT (with index) is available [http://www.rdfhdt.org/datasets/].
Thank you very much! Keep it up! Out of curiosity, what computer did you use for this? IIRC generating it required >512 GB of RAM.
You will see how huge Wikidata has become compared to other datasets: it contains roughly twice the 4B-triple limit discussed above.
There is a 64-bit build of HDT that doesn't have this 4B-triple limitation.
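The ~4B figure matches the ceiling of a 32-bit unsigned integer, which is presumably what the standard HDT build uses for its internal triple IDs (an assumption; the thread only says "4B limit"). A quick sanity check of the arithmetic:

```python
# Ceiling of a 32-bit unsigned integer: just over 4.29 billion.
LIMIT_32BIT = 2 ** 32

# Rough size of the Wikidata dump per the post above ("about twice the 4B limit");
# the exact number is an assumption for illustration.
wikidata_triples = 8_000_000_000

print(LIMIT_32BIT)                     # 4294967296
print(wikidata_triples > LIMIT_32BIT)  # True: this dump needs the 64-bit build
```

So any dataset past ~4.29B triples overflows 32-bit IDs, which is why the 64-bit version is required for current Wikidata.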
In this regard, what is the most user-friendly way to use this format in 2018?
Speaking for myself at least: Fuseki with an HDT store. But I know there are also some CLI tools from the HDT folks.
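For the Fuseki route, hdt-java ships a Jena assembler that can expose an HDT file as a read-only graph behind a SPARQL endpoint. A minimal sketch of such an assembler config (the `hdt:` vocabulary, service name, and file path are assumptions from memory; check the hdt-java documentation for the exact terms):

```
@prefix fuseki: <http://jena.apache.org/fuseki#> .
@prefix rdf:    <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix ja:     <http://jena.hpl.hp.com/2005/11/Assembler#> .
@prefix hdt:    <http://www.rdfhdt.org/fuseki#> .

# SPARQL read-only endpoint at /wikidata, backed by the HDT dump
<#service> rdf:type fuseki:Service ;
    fuseki:name         "wikidata" ;
    fuseki:serviceQuery "sparql" ;
    fuseki:dataset      <#dataset> .

<#dataset> rdf:type ja:RDFDataset ;
    ja:defaultGraph <#graph> .

# hdt:HDTGraph wraps the .hdt file (plus its .index) as a Jena graph
<#graph> rdf:type hdt:HDTGraph ;
    hdt:fileName "wikidata.hdt" .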