Hello emarx, Many thanks for sharing KBox. Very interesting project! One question: how do you deal with different versions of a KB, as in the case of the Wikidata dump here? Do you fetch their repository at some regular interval? Also, to save your users from having to re-create the models, you could pre-load "models" from the LOV catalog.
Cheers, Ghislain
2017-10-27 21:56 GMT+02:00 Edgard Marx digamarx@gmail.com:
Hey guys,
I don't know if you were already aware of it, but you can use KBox with Wikidata, DBpedia, Freebase, Lodstats...
And yes, you can also use it to merge your graph with one of those....
https://github.com/AKSW/KBox#how-can-i-query-multi-bases
cheers,
<emarx>
On Oct 27, 2017 21:02, "Jasper Koehorst" jasperkoehorst@gmail.com wrote:
I will look into the size of the jnl file, but shouldn't that be located wherever Blazegraph is serving the SPARQL endpoint from, or is this a special flavour? I was also thinking of looking into a GitLab runner that could occasionally generate an HDT file from the TTL dump, if our server can handle it. For that, though, an MD5 sum file would be preferable; or should a timestamp be sufficient?
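For what it's worth, the MD5-gated regeneration idea could look roughly like the sketch below. This is only an illustration: the file names are placeholders, a toy dump is written so the script runs standalone, and the commented-out rdf2hdt call assumes the hdt-cpp conversion tool mentioned nowhere in this thread.

```shell
#!/bin/sh
# Sketch: regenerate the HDT only when the TTL dump's MD5 sum has changed.
# File names are placeholders, not anything from the thread.
DUMP=dump.ttl
SUMFILE=dump.ttl.md5

# Toy dump so the sketch is self-contained.
printf '<urn:s> <urn:p> <urn:o> .\n' > "$DUMP"

if [ -f "$SUMFILE" ] && md5sum -c --status "$SUMFILE"; then
    echo "dump unchanged; keeping existing HDT"
else
    echo "dump changed; regenerating HDT"
    # rdf2hdt "$DUMP" dump.hdt   # hypothetical call to the hdt-cpp converter
    md5sum "$DUMP" > "$SUMFILE"
fi
```

Run twice in a row, the first pass reports a changed dump and records the sum, and the second finds a matching sum file and skips regeneration; that is the whole advantage over a timestamp, which would also trip on re-downloads of an identical dump.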
Jasper
On 27 Oct 2017, at 18:58, Jérémie Roquet jroquet@arkanosis.net wrote:
2017-10-27 18:56 GMT+02:00 Jérémie Roquet jroquet@arkanosis.net:
2017-10-27 18:51 GMT+02:00 Luigi Assom itsawesome.yes@gmail.com:
I found and share this resource: http://www.rdfhdt.org/datasets/
there is also Wikidata dump in HDT
The link to the Wikidata dump seems dead, unfortunately :'(
… but there's a file on the server: http://gaia.infor.uva.es/hdt/wikidata-20170313-all-BETA.hdt.gz (i.e. the link was just missing the ".gz")
-- Jérémie
Wikidata mailing list Wikidata@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata