On 26 April 2013 17:15, Daniel Kinzler daniel.kinzler@wikimedia.de wrote:
On 26.04.2013 16:56, Denny Vrandečić wrote:
The third-party propagation is not very high on our priority list - not because it is unimportant, but because there are things that are even more important, like getting it to work for Wikipedia :) And this seems to be stabilizing.
What we have, for now:
- We have the broadcast of all edits through IRC.
This interface is quite unreliable: the output can't be parsed unambiguously and may get truncated. I did implement notifications via XMPP several years ago, but it never went beyond a proof of concept. Have a look at the XMLRC extension if you are interested.
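For illustration, consuming that feed looks roughly like the sketch below. The channel name and the exact message layout are assumptions on my part; the point is that after stripping the mIRC formatting codes you are still left with free-form text rather than a structured record.

    # Minimal sketch of reading the raw recent-changes feed from irc.wikimedia.org.
    # The channel name and the message layout are assumptions; after stripping the
    # mIRC formatting codes there is no unambiguous, machine-readable structure left.
    import re
    import socket

    SERVER = "irc.wikimedia.org"
    CHANNEL = "#wikidata.wikipedia"  # assumed channel name
    NICK = "rc-listener-example"

    # mIRC colour/formatting codes embedded in every feed message
    MIRC_FORMATTING = re.compile(r"\x03\d{0,2}(?:,\d{1,2})?|[\x02\x0f\x16\x1d\x1f]")

    def listen():
        sock = socket.create_connection((SERVER, 6667))
        sock.sendall(f"NICK {NICK}\r\nUSER {NICK} 0 * :{NICK}\r\n".encode())
        sock.sendall(f"JOIN {CHANNEL}\r\n".encode())
        buffer = b""
        while True:
            buffer += sock.recv(4096)
            *lines, buffer = buffer.split(b"\r\n")
            for raw in lines:
                line = raw.decode("utf-8", errors="replace")
                if line.startswith("PING"):
                    # keep the connection alive
                    sock.sendall(("PONG" + line[4:] + "\r\n").encode())
                elif " PRIVMSG " in line:
                    # everything after the first " :" is the actual feed message
                    message = line.split(" :", 1)[1]
                    print(MIRC_FORMATTING.sub("", message))

    if __name__ == "__main__":
        listen()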
- One could poll recent changes, but with 200-450 edits per minute, this might get problematic.
Well, polling isn't really the problem; fetching all the content is. And you'd need to do that no matter how you get the information about what has changed.
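A rough sketch of that poll-then-fetch pattern, using list=recentchanges and wbgetentities: the polling request itself is a single cheap call, but every changed entity then has to be fetched in full. Batch sizes, the poll window and the missing continuation handling are simplifications, not recommendations.

    # Poll recentchanges, then fetch the full content of every changed entity.
    import time
    import requests

    API = "https://www.wikidata.org/w/api.php"

    def recent_entity_ids(since):
        """Return the distinct entity IDs edited since the given timestamp."""
        params = {
            "action": "query",
            "list": "recentchanges",
            "rcprop": "title|timestamp",
            "rcend": since,          # list changes going back to this timestamp
            "rclimit": 500,
            "format": "json",
        }
        changes = requests.get(API, params=params).json()["query"]["recentchanges"]
        # Item pages live in the main namespace, so the title is the Q-id itself.
        return {rc["title"] for rc in changes if rc["title"].startswith("Q")}

    def fetch_entities(entity_ids):
        """Fetch full entity content; this is where the real cost is."""
        ids = sorted(entity_ids)
        for i in range(0, len(ids), 50):            # wbgetentities caps at 50 ids
            params = {
                "action": "wbgetentities",
                "ids": "|".join(ids[i:i + 50]),
                "format": "json",
            }
            yield requests.get(API, params=params).json()["entities"]

    if __name__ == "__main__":
        since = time.strftime("%Y%m%d%H%M%S", time.gmtime(time.time() - 60))
        for batch in fetch_entities(recent_entity_ids(since)):
            print(len(batch), "entities fetched")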
- We do have the OAIRepository extension installed on Wikidata. Did anyone try that?
In principle that is a decent update interface, but I'd recommend not using OAI before we have implemented feature 47714 ("Support RDF and API serializations of entity data via OAI-PMH"). Right now, what you'd get from there would be our *internal* JSON representation, which is different from what the API returns and may change at any time without notice.
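For reference, an OAI-PMH harvest loop looks roughly like the sketch below. The endpoint URL and the metadataPrefix are assumptions; the verb and parameter names (ListRecords, from, resumptionToken) come from the OAI-PMH standard itself.

    # Sketch of harvesting incremental updates over OAI-PMH, the protocol the
    # OAIRepository extension speaks. Endpoint URL and metadataPrefix are assumed.
    import xml.etree.ElementTree as ET
    import requests

    ENDPOINT = "https://www.wikidata.org/wiki/Special:OAIRepository"  # assumed
    NS = {"oai": "http://www.openarchives.org/OAI/2.0/"}

    def harvest(since):
        params = {"verb": "ListRecords", "metadataPrefix": "mediawiki", "from": since}
        while True:
            root = ET.fromstring(requests.get(ENDPOINT, params=params).content)
            for record in root.iterfind(".//oai:record", NS):
                header = record.find("oai:header", NS)
                yield (header.findtext("oai:identifier", namespaces=NS),
                       header.findtext("oai:datestamp", namespaces=NS))
            token = root.findtext(".//oai:resumptionToken", namespaces=NS)
            if not token:
                break
            # Resumption requests carry only the verb and the token.
            params = {"verb": "ListRecords", "resumptionToken": token}

    if __name__ == "__main__":
        for identifier, datestamp in harvest("2013-04-25T00:00:00Z"):
            print(datestamp, identifier)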
Somewhat off-topic: I didn't know you had different JSON representations. I'm curious, and I'd be happy about a few quick answers...
- How many are there? Just the two, internal and external?
- Which JSON representations do the API and the XML dump provide? Will they continue to do so in the future?
- Are the API and XML dump representations stable? Or should we expect changes?
JC
-- daniel
-- Daniel Kinzler, Software Architect
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Wikidata-l mailing list Wikidata-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-l