On 28.10.2015 10:11, Dimitris Kontokostas wrote: ...
Definitely. However, there is some infrastructural gap between loading a dump once in a while and providing a *live* query service. Unfortunately, there are no standard technologies that would routinely enable live updates of RDF stores, and Wikidata is rather low-tech when it comes to making its edits available to external tools. One could set up the code that is used to update query.wikidata.org (I am sure it's available somewhere), but it's still some extra work.
DBpedia Live has been doing that for some years now. The only thing that is non-standard in DBpedia Live is the changeset format, but this is now covered by LD Patch: http://www.w3.org/TR/ldpatch/
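For concreteness, a change expressed in the LD Patch format is just sent to the store with an HTTP PATCH request. The Python sketch below is only illustrative: the resource URL and the label correction are made up, but the text/ldpatch media type and the Add/Delete syntax are the ones defined in the spec.

import requests

# Hypothetical target resource; no such service exists for Wikidata today.
RESOURCE = "https://example.org/ldp/Q42"

# A minimal LD Patch document (syntax as in http://www.w3.org/TR/ldpatch/):
# remove one (invented) mistyped label and add the corrected one.
PATCH = """\
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

Delete { <> rdfs:label "Duglas Adams"@en } .
Add    { <> rdfs:label "Douglas Adams"@en } .
"""

# The spec registers the text/ldpatch media type and applies patches via HTTP PATCH.
response = requests.patch(
    RESOURCE,
    data=PATCH.encode("utf-8"),
    headers={"Content-Type": "text/ldpatch"},
)
response.raise_for_status()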
At the moment DBpedia Live only produces the changesets that other servers can consume. The actual SPARQL endpoint is hosted on an OpenLink server, and we already use the same model to feed and update an LDF endpoint (still in beta).
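The consuming side is little more than downloading the paired added/removed files and replaying them against any SPARQL 1.1 Update endpoint. A rough sketch, with placeholder URLs and none of the batching, retries, or blank-node handling a production consumer would need:

import gzip
import requests

# Placeholders: the real changesets are paired "added"/"removed" N-Triples files
# published by the DBpedia Live server; the endpoint below is whatever local
# SPARQL 1.1 Update service you keep in sync.
ADDED_URL = "https://example.org/changesets/000001.added.nt.gz"
REMOVED_URL = "https://example.org/changesets/000001.removed.nt.gz"
UPDATE_ENDPOINT = "http://localhost:8890/sparql"

def fetch_ntriples(url):
    # Download and decompress one gzipped N-Triples changeset file.
    return gzip.decompress(requests.get(url).content).decode("utf-8")

def sparql_update(update):
    # Plain SPARQL 1.1 Update over HTTP POST; works against Virtuoso, Fuseki, etc.
    r = requests.post(
        UPDATE_ENDPOINT,
        data=update.encode("utf-8"),
        headers={"Content-Type": "application/sparql-update"},
    )
    r.raise_for_status()

# Apply removals first, then additions, so a changed triple ends up in its new state.
# (Caveat: DELETE DATA does not allow blank nodes, so a real consumer has to
# treat those separately.)
sparql_update("DELETE DATA {\n" + fetch_ntriples(REMOVED_URL) + "}")
sparql_update("INSERT DATA {\n" + fetch_ntriples(ADDED_URL) + "}")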
If there *were* an LD Patch service for Wikidata, then you *could* do this for Wikidata as well, using standard tools (on the W3C "WG Note" level) from that point on. However, there is no such service now, and I am not aware of any activity aimed at building one. It's not rocket science to set this up, but it requires non-standard techniques and custom tools (starting with parsing edit histories of Wikidata).
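To illustrate the kind of custom plumbing involved, here is a rough sketch of the first step only: poll the MediaWiki recent-changes API for recently edited items and fetch their current RDF via Special:EntityData. Everything beyond that (diffing against the store, emitting changesets or LD Patches) is left out.

import requests

API = "https://www.wikidata.org/w/api.php"

def recent_entity_ids(limit=50):
    # Recently changed pages in namespace 0, i.e. the Q-items.
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcnamespace": 0,
        "rclimit": limit,
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    return {change["title"] for change in data["query"]["recentchanges"]}

def fetch_entity_turtle(qid):
    # Current RDF serialization of one entity via Special:EntityData.
    url = "https://www.wikidata.org/wiki/Special:EntityData/" + qid + ".ttl"
    return requests.get(url).text

for qid in recent_entity_ids():
    turtle = fetch_entity_turtle(qid)
    # A real updater would diff this against what is already in the store and
    # emit a changeset (or an LD Patch) instead of reloading the whole entity.
    print(qid, len(turtle), "bytes of Turtle")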
Markus