Hi Julie,
We've thought a lot about this, but haven't done anything formal yet. An example of this happening, where community changes were fed back to improve the Disease Ontology, is presented in this paper [1].
Mechanically, parties interested in a particular swath of data linked to their resource could set up repeated SPARQL queries to watch for changes. Beyond that, the core MediaWiki API could be used to create alerts when new discussions are started on articles or items of interest.
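As a rough sketch (not something we actually run), a resource maintainer could poll the public SPARQL endpoint for recently modified items carrying their external identifier, and hit the standard recentchanges API for newly created item talk pages. I'm assuming the Disease Ontology ID property (P699) and an arbitrary cutoff date here; any other external-ID property would work the same way:

import requests

SPARQL_ENDPOINT = "https://query.wikidata.org/sparql"
MEDIAWIKI_API = "https://www.wikidata.org/w/api.php"

def recently_modified_do_items(since="2016-06-01T00:00:00Z"):
    """Items carrying a Disease Ontology ID that changed after `since`."""
    query = """
    SELECT ?item ?doid ?modified WHERE {
      ?item wdt:P699 ?doid ;
            schema:dateModified ?modified .
      FILTER(?modified > "%s"^^xsd:dateTime)
    }
    """ % since
    r = requests.get(SPARQL_ENDPOINT,
                     params={"query": query, "format": "json"})
    r.raise_for_status()
    return [(b["item"]["value"], b["doid"]["value"], b["modified"]["value"])
            for b in r.json()["results"]["bindings"]]

def new_talk_pages(limit=50):
    """Newly created item talk pages (namespace 1), i.e. new discussions."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcnamespace": 1,   # Talk: pages for items
        "rctype": "new",
        "rclimit": limit,
        "format": "json",
    }
    r = requests.get(MEDIAWIKI_API, params=params)
    r.raise_for_status()
    return [rc["title"] for rc in r.json()["query"]["recentchanges"]]

if __name__ == "__main__":
    for item, doid, modified in recently_modified_do_items():
        print(item, doid, modified)
    for title in new_talk_pages():
        print(title)

From there it would just be a matter of diffing the changed statements against the upstream database and deciding what is worth reporting back.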
At some point we hope to produce a reporting site that would aggregate this kind of information in our domain (feedback and changes by the community, as well as changes by our bots) and provide reports back to the primary sources and to anyone else interested. (Maybe we will see a start on that this summer.) It hasn't become a priority yet because we haven't built up enough community activity to make it a really valuable source of input to the original databases.
[1] http://biorxiv.org/content/biorxiv/early/2015/11/16/031971.full.pdf
On Fri, Jun 10, 2016 at 11:31 AM, Julie McMurry mcmurry.julie@gmail.com wrote:
It is great that Wikidata provides a way for data to be curated in a crowd-sourced way. It would be even better if changes (especially corrections) could be communicated back to the original source so that all could benefit.
Has this been discussed previously? Considered?
Julie