Well, PubSubHubbub is a nice idea. However, it clearly depends on two factors:
1. whether Wikidata sets up such an infrastructure (I need to check whether we have capacities, I am not sure atm)
2. whether performance is good enough to handle high-volume publishers
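
For illustration, the subscription side would boil down to a single HTTP request to a hub. A minimal Python sketch, assuming a hypothetical hub URL and callback endpoint (no such infrastructure exists on the Wikimedia side yet):

    # Minimal PubSubHubbub subscription request (sketch).
    # HUB_URL and CALLBACK are hypothetical; Wikimedia runs no such hub yet.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    HUB_URL = "http://hub.example.org/"                  # hypothetical hub
    TOPIC = ("https://www.wikidata.org/w/index.php"
             "?title=Special:RecentChanges&feed=atom")
    CALLBACK = "http://live.dbpedia.org/push-callback"   # hypothetical endpoint

    data = urlencode({
        "hub.mode": "subscribe",
        "hub.topic": TOPIC,
        "hub.callback": CALLBACK,
        "hub.verify": "async",
    }).encode("ascii")

    urlopen(HUB_URL, data)  # the hub verifies CALLBACK, then pushes updates to it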

Basically, polling the recent changes feed [1] and then doing an HTTP request for each individual page should be fine for a start. So I guess this is what we will implement, unless there are better suggestions.
The whole issue is problematic, and the DBpedia project would be happy if it were discussed and decided soon, so that we can plan development.
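
To make the polling variant concrete, a first cut could look like the sketch below (Python; using Special:EntityData as the per-entity endpoint is my assumption, any page-level HTTP request would do):

    # Sketch of the polling approach: read the recent-changes Atom feed [1],
    # then fetch each touched entity individually.
    import json
    import xml.etree.ElementTree as ET
    from urllib.request import urlopen

    FEED = ("https://www.wikidata.org/w/index.php"
            "?title=Special:RecentChanges&feed=atom")
    ATOM = "{http://www.w3.org/2005/Atom}"

    # 1. poll the feed and collect the titles of recently changed pages
    feed = ET.parse(urlopen(FEED))
    titles = {e.findtext(ATOM + "title") for e in feed.iter(ATOM + "entry")}

    # 2. fetch each changed entity individually
    for title in titles:
        if not title or not title.startswith("Q"):  # crude item-page check
            continue
        url = "https://www.wikidata.org/wiki/Special:EntityData/%s.json" % title
        entity = json.load(urlopen(url))
        # ... hand the entity over to the local update machinery ...
        print("fetched:", title)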

What is the current best practice for getting updates from Wikipedia?
We are still using OAI-PMH...
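
For reference, one OAI-PMH poll amounts to roughly the following (a sketch from memory; the endpoint path, the metadataPrefix, and whether credentials are required are all assumptions on my side):

    # Sketch of an incremental OAI-PMH harvest against Wikipedia.
    import xml.etree.ElementTree as ET
    from urllib.parse import urlencode
    from urllib.request import urlopen

    ENDPOINT = "https://en.wikipedia.org/wiki/Special:OAIRepository"  # assumed
    params = urlencode({
        "verb": "ListRecords",
        "metadataPrefix": "mediawiki",    # assumed prefix for full page records
        "from": "2013-04-25T00:00:00Z",   # everything since the last poll
    })

    OAI = "{http://www.openarchives.org/OAI/2.0/}"
    tree = ET.parse(urlopen(ENDPOINT + "?" + params))
    for record in tree.iter(OAI + "record"):
        print("changed:", record.findtext(OAI + "header/" + OAI + "identifier"))
    # if a resumptionToken element is present, repeat the request with it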

In DBpedia, we use a simple protocol of our own:
http://wiki.dbpedia.org/DBpediaLive#h156-4
Publication of changesets: upon modification, old triples are replaced with updated ones. The added and/or deleted triples are also written out as N-Triples files and then compressed. Any client application or DBpedia-Live mirror can download those files and integrate them, thereby updating its local copy of DBpedia. This enables the application to always stay in synchronization with our DBpedia-Live.
This could also work for Wikidata facts, right?
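
On the consumer side, applying one changeset pair is essentially two SPARQL updates. A sketch, assuming a local SPARQL endpoint on the mirror and a hypothetical changeset path (the real layout is described on the wiki page above):

    # Apply one changeset pair: triples in the .removed file are deleted,
    # triples in the .added file are inserted into the local store.
    import gzip
    from urllib.request import urlopen, Request

    SPARQL_UPDATE = "http://localhost:8890/sparql"  # the mirror's own endpoint

    def fetch_ntriples(url):
        # changeset files are gzip-compressed N-Triples, per the scheme above
        return gzip.GzipFile(fileobj=urlopen(url)).read().decode("utf-8")

    def update(query):
        urlopen(Request(SPARQL_UPDATE, data=query.encode("utf-8"),
                        headers={"Content-Type": "application/sparql-update"}))

    # hypothetical changeset path, for illustration only
    base = "http://live.dbpedia.org/changesets/2013/04/26/000001"
    update("DELETE DATA { %s }" % fetch_ntriples(base + ".removed.nt.gz"))
    update("INSERT DATA { %s }" % fetch_ntriples(base + ".added.nt.gz"))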


Other useful links:
- http://www.openarchives.org/rs/0.5/resourcesync
- http://www.sdshare.org/
- http://www.w3.org/community/sdshare/
- http://www.rabbitmq.com/


All the best,
Sebastian

[1] https://www.wikidata.org/w/index.php?title=Special:RecentChanges&feed=atom

On 26.04.2013 03:15, Hady elsahar wrote:
Hello Dimitris 

What do you think of that? 
Shall I write this part as an abstract section in the proposal and wait for more details, 
or could we have a plan similar to the one already implemented in DBpedia: http://wiki.dbpedia.org/DBpediaLive#h156-3

thanks 
regards 


On Fri, Apr 26, 2013 at 12:50 AM, Jeremy Baron <jeremy@tuxmachine.com> wrote:
On Thu, Apr 25, 2013 at 10:42 PM, Hady elsahar <hadyelsahar@gmail.com> wrote:
> 2- Is there any design pattern or brief outline for the change propagation design? How would it work? I ask so that I can make a rough plan and estimate of how it could be consumed on the DBpedia side.

I don't know anything about the plan for this but it seems at first
glance like a good place to use [[w:PubSubHubbub]].

-Jeremy




--
-------------------------------------------------
Hady El-Sahar
Research Assistant 
Center of Informatics Sciences | Nile University

email : hadyelsahar@gmail.com
Phone : +2-01220887311 
http://hadyelsahar.me/






--
Dipl. Inf. Sebastian Hellmann
Department of Computer Science, University of Leipzig
Projects: http://nlp2rdf.org , http://linguistics.okfn.org , http://dbpedia.org/Wiktionary , http://dbpedia.org
Homepage: http://bis.informatik.uni-leipzig.de/SebastianHellmann
Research Group: http://aksw.org