On Sat, Dec 7, 2013 at 12:47 PM, Thomas Douillard <thomas.douillard@gmail.com> wrote:

That's why I think we must do a lot more with such data than just importing it from openlibrary, as it is really important to MediaWiki in general, and the community as a whole is a powerful driving force for bibliographical data. I'm not against cooperating with openlibrary, but we should seek deep cooperation and integration with them so both projects can benefit from each other's community.

+1 on this

openlibrary.org has a limited set of fields.

Moreover, simply importing some random records at some random time will benefit neither openlibrary nor Wikimedia.

You would first need to check whether Wikidata already has the needed information, search for it again in openlibrary, create the content in openlibrary if it is missing, import that content into Wikidata, make the desired local changes, and send any relevant local changes back to openlibrary.
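
To give an idea of that round trip for a single record, here is a rough sketch. It assumes the Wikidata Query Service, the ISBN-13 property (P212) and openlibrary's Books API; the ISBN and the endpoints are just illustrative choices, not a fixed design:

    import requests

    ISBN = "9780140328721"  # example ISBN; note Wikidata often stores ISBN-13 hyphenated

    def wikidata_has_isbn(isbn):
        """True if some Wikidata item already carries this ISBN-13 (P212)."""
        query = 'ASK { ?item wdt:P212 "%s" }' % isbn
        r = requests.get("https://query.wikidata.org/sparql",
                         params={"query": query, "format": "json"})
        r.raise_for_status()
        return r.json()["boolean"]

    def openlibrary_record(isbn):
        """The openlibrary record for this ISBN, or None if it has none."""
        r = requests.get("https://openlibrary.org/api/books",
                         params={"bibkeys": "ISBN:" + isbn,
                                 "format": "json", "jscmd": "data"})
        r.raise_for_status()
        return r.json().get("ISBN:" + isbn)

    if not wikidata_has_isbn(ISBN):
        record = openlibrary_record(ISBN)
        if record is None:
            pass  # create the record in openlibrary first, then import it
        else:
            pass  # import the openlibrary record into Wikidata, edit locally,
                  # and send relevant changes back to openlibrary

Doing that dance by hand for every record is exactly the overhead I'd like to avoid.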

But I had an idea: a MediaWiki User Interface to openlibrary data

openlibrary.org offers access to records in 3 ways:

* read/write of individual records through the API;
* read of individual records as RDF and JSON (see the sketch below);
* bulk download of the entire dataset.
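
The per-record reads are just plain HTTP. A minimal sketch (OL7353617M is only an example edition key; the write API additionally requires authentication):

    import requests

    key = "OL7353617M"  # example openlibrary edition key

    as_json = requests.get("https://openlibrary.org/books/%s.json" % key).json()
    as_rdf = requests.get("https://openlibrary.org/books/%s.rdf" % key).text

    print(as_json.get("title"))   # e.g. the edition title
    print(as_rdf[:200])           # start of the RDF serialization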

So it's possible to:

1) Import the bulk data;
2) Catch all changes from openlibrary.org in real time (a polling sketch follows this list);
3) Allow the synced data to be browsed and edited at any time on MediaWiki/Wikidata instances;
4) Send the changes back to openlibrary, storing the data from custom fields locally in the MediaWiki instance (allowing a later import into openlibrary if they create the corresponding fields in their DB);
5) Send back to openlibrary all new book records created on MediaWiki instances.
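
To make 2)-5) a bit more concrete, here is a very rough polling sketch. I'm assuming openlibrary's recent-changes feed (/recentchanges.json) and its field names from memory, and apply_to_local_wiki(), collect_local_edits() and push_to_openlibrary() are hypothetical placeholders for the MediaWiki/Wikidata side; a real sync would of course need proper change tracking and conflict handling:

    import time
    import requests

    def fetch_record(key):
        """Current JSON of an openlibrary record, e.g. key = '/books/OL...M'."""
        return requests.get("https://openlibrary.org%s.json" % key).json()

    def apply_to_local_wiki(key, record):
        raise NotImplementedError  # hypothetical: update the local MediaWiki/Wikidata copy

    def collect_local_edits():
        raise NotImplementedError  # hypothetical: local edits and new records not yet sent upstream

    def push_to_openlibrary(edit):
        raise NotImplementedError  # hypothetical: write back through the openlibrary API

    seen = set()
    while True:
        # 2) catch changes from openlibrary.org (near real time, via polling)
        for changeset in requests.get("https://openlibrary.org/recentchanges.json").json():
            for change in changeset.get("changes", []):
                marker = (change["key"], change["revision"])
                if marker in seen:
                    continue
                seen.add(marker)
                # 3) keep the local, browsable and editable copy up to date
                apply_to_local_wiki(change["key"], fetch_record(change["key"]))
        # 4) and 5) send local edits and new book records back upstream
        for edit in collect_local_edits():
            push_to_openlibrary(edit)
        time.sleep(60)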