On 10/28/15 6:25 AM, Markus Krötzsch wrote:
On 28.10.2015 10:11, Dimitris Kontokostas wrote: ...
Definitely. However, there is an infrastructural gap between loading a dump once in a while and providing a *live* query service. Unfortunately, there are no standard technologies that would routinely enable live updates of RDF stores, and Wikidata is rather low-tech when it comes to making its edits available to external tools. One could set up the code that is used to update query.wikidata.org (I am sure it's available somewhere), but it's still some extra work.
DBpedia Live has been doing that for some years now. The only non-standard thing in DBpedia Live is the changeset format, but that is now covered by LD Patch: http://www.w3.org/TR/ldpatch/
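For illustration only (the entity and label values here are made up, not taken from an actual changeset), a single label correction expressed as an LD Patch document would look roughly like this:

  @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
  @prefix wd: <http://www.wikidata.org/entity/> .

  Delete { wd:Q42 rdfs:label "Duglas Adams"@en } .
  Add    { wd:Q42 rdfs:label "Douglas Adams"@en } .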
At the moment DBpedia Live only produces the changesets that other servers can consume. The actual SPARQL endpoint is hosted on an OpenLink Virtuoso server, and we already use the same model to feed and update an LDF endpoint (still in beta).
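As a rough sketch of the consumer side (the changeset URLs and the local update endpoint below are placeholders, not the actual DBpedia Live publication layout), applying one added/removed pair of gzipped N-Triples files over SPARQL 1.1 Update could be as simple as:

  # Sketch: apply one DBpedia-Live-style changeset (added/removed N-Triples,
  # gzipped) to a local SPARQL endpoint via SPARQL 1.1 Update.
  import gzip
  import requests

  UPDATE_ENDPOINT = "http://localhost:8890/sparql"  # placeholder local endpoint

  def fetch_ntriples(url):
      # Download a gzipped N-Triples file and return its contents as text.
      resp = requests.get(url)
      resp.raise_for_status()
      return gzip.decompress(resp.content).decode("utf-8")

  def apply_changeset(added_url, removed_url):
      removed = fetch_ntriples(removed_url)
      added = fetch_ntriples(added_url)
      # N-Triples is valid inside DELETE DATA / INSERT DATA blocks.
      # Delete first, then insert, so modified triples end up in the new state.
      update = "DELETE DATA {\n%s};\nINSERT DATA {\n%s}" % (removed, added)
      resp = requests.post(UPDATE_ENDPOINT, data={"update": update})
      resp.raise_for_status()

  apply_changeset("http://example.org/changesets/000001.added.nt.gz",
                  "http://example.org/changesets/000001.removed.nt.gz")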
If there *were* an LD Patch service for Wikidata, then you *could* do this for Wikidata as well, using standard tools (at the W3C "WG Note" level) from that point on. However, there is no such service now, and I am not aware of any activity aimed at building one. It's not rocket science to set this up, but it requires non-standard techniques and custom tools (starting with parsing the edit histories of Wikidata).
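Just to make that concrete: a minimal custom updater would have to do something like the following (a sketch only, in Python; it polls the MediaWiki recent-changes API and re-fetches the RDF export of each touched item, and it glosses over deleted items, redirects and high edit rates):

  # Sketch of a non-standard Wikidata updater: poll recent changes, then
  # re-fetch each changed entity's RDF export and refresh it in a local store.
  import requests

  API = "https://www.wikidata.org/w/api.php"

  def recent_item_ids(limit=50):
      # Item IDs (Q...) touched by the most recent edits in the main namespace.
      params = {
          "action": "query",
          "list": "recentchanges",
          "rcnamespace": 0,      # namespace 0 = items on Wikidata
          "rclimit": limit,
          "rcprop": "title",
          "format": "json",
      }
      changes = requests.get(API, params=params).json()["query"]["recentchanges"]
      return {c["title"] for c in changes}

  def fetch_entity_turtle(qid):
      # Current RDF (Turtle) export of one entity.
      url = "https://www.wikidata.org/wiki/Special:EntityData/%s.ttl" % qid
      return requests.get(url).text

  for qid in recent_item_ids():
      turtle = fetch_entity_turtle(qid)
      # ...replace the item's old triples in the local store with `turtle`...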
Markus
Markus,
You can use existing standards to achieve these goals, as has already been demonstrated. Fundamentally, you can use RDF to describe anything for machine processing, and of course that includes feeds and the nature of those feeds (formats, deltas, refresh schedules, etc.).
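For instance (just a sketch using DCAT and Dublin Core terms, with made-up URIs), a changeset feed can describe itself and its delivery characteristics in RDF:

  @prefix dcat: <http://www.w3.org/ns/dcat#> .
  @prefix dct:  <http://purl.org/dc/terms/> .

  <http://example.org/feeds/wikidata-changes>
      a dcat:Dataset ;
      dct:title "Wikidata changeset feed (example)" ;
      dct:accrualPeriodicity <http://purl.org/cld/freq/continuous> ;
      dcat:distribution [
          a dcat:Distribution ;
          dcat:downloadURL <http://example.org/feeds/wikidata-changes/latest.ldpatch> ;
          dcat:mediaType "text/ldpatch" ;
          dct:description "Deltas since the previous changeset, as LD Patch."
      ] .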
Standards and technology aren't the problem here, so, for everyone's clarity, let's not frame matters that way.