Hi Martynas,

That is a good observation! First, rest assured -- there are a number of people involved in Wikidata who have very intimate knowledge of RDF, OWL, and SPARQL -- some of us have actually worked on these standards :) We do fully understand the value of these standards.

We will export our data in RDF. But this does not mean that our internal data model has to be RDF. Think about Drupal or Semantic MediaWiki: both export a lot of their data in RDF, but their internal data models are very different. And still, they are great citizens of the Web of Data, I'd reckon. Or even think about Wikipedia: obviously, articles of Wikipedia are "exported" as HTML, so that browsers can display them. But the internal markup language used to create, edit, and maintain the articles is not HTML but MediaWiki syntax.
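To make the analogy concrete, here is a minimal sketch of the pattern described above: an internal statement model that is not RDF itself (it carries qualifiers directly), plus an export step that flattens it into plain RDF triples via an intermediate statement node, a common RDF modeling idiom. All identifiers and the namespace are hypothetical, not actual Wikidata URIs.

```python
# Hypothetical namespace -- for illustration only, not a real Wikidata URI scheme.
NS = "http://example.org/entity/"

def export_statement(subject, prop, value, qualifiers=None):
    """Flatten one internal qualified statement into plain RDF triples.

    The internal model allows qualifiers on a property value; on export,
    an intermediate statement node makes this expressible in standard RDF.
    """
    stmt = f"{NS}statement/{subject}-{prop}"  # hypothetical statement node IRI
    triples = [
        (f"{NS}{subject}", f"{NS}{prop}", stmt),   # entity -> statement node
        (stmt, f"{NS}value", f"{NS}{value}"),      # statement node -> main value
    ]
    for qprop, qval in (qualifiers or {}).items():
        triples.append((stmt, f"{NS}{qprop}", f"{NS}{qval}"))  # qualifiers
    return triples

# One internal statement with a qualifier becomes three plain triples.
for s, p, o in export_statement("Q1", "P1", "Q2", qualifiers={"P3": "Q4"}):
    print(f"<{s}> <{p}> <{o}> .")
```

The internal representation stays whatever is most convenient for editing and storage; only the export step needs to speak RDF.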

I hope this helps with your concerns :)

Cheers,
Denny




2012/3/28 Martynas Jusevicius <martynas@graphity.org>
Hey all,

I've been reading some of the technical notes on Wikidata, for example
http://meta.wikimedia.org/wiki/Wikidata/Notes/Data_model
http://meta.wikimedia.org/wiki/User:Nikola_Smolenski/Wikidata#Query_language

Statements like "[data model] similar to RDF, but allows qualified
property values" and "should there be a query language that will
enable querying of the data?" concern me a great deal regarding the
future of the whole Wikidata project.

It seems to me that whoever is making these technical decisions does
not fully realize the price of reinventing the wheel -- or in this
case, reinventing data models/formats/standards. Having designed
and implemented production-grade applications on RDBMSs, XML, and
RDF, I strongly suggest that you base Wikidata on standard RDF.

I know some or most of you come from a wiki background, which
might be hard to move beyond, but if Wikidata is to become a free
and open knowledge base on the (Semantic) Web, then RDF is the free
and open industry standard for that. Whatever small advantage you
would gain from developing a custom non-standard data model, think how
many person-years of standardization and tool development you would
lose. Isn't knowledge about standing on the shoulders of giants? RDF
has all the specifications, a variety of tools, and in DBpedia a very
solid proof of concept (which I also think should be better integrated
with this project) -- everything necessary to build Wikidata.
With SPARQL Update, a full read/write RDF round-trip is possible (and
works in practice). It also makes a custom API rather unnecessary,
since SPARQL Update (and related mechanisms) becomes the only generic
API method one has to deal with.
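The point about one generic write operation can be sketched as follows: per the SPARQL 1.1 Protocol, an update is an HTTP POST with the update text in a form-encoded "update" parameter, so every kind of write goes through the same operation. The endpoint URL and the data below are hypothetical, not a real Wikidata service.

```python
from urllib.parse import urlencode
from urllib.request import Request

def sparql_update_request(endpoint, triples):
    """Build an HTTP request that writes triples via SPARQL 1.1 Update.

    Whatever the write is -- new statement, label, sitelink -- it is
    the same generic operation: POST an INSERT DATA update string.
    """
    body = "INSERT DATA {\n"
    body += "".join(f"  {s} {p} {o} .\n" for s, p, o in triples)
    body += "}"
    return Request(
        endpoint,
        data=urlencode({"update": body}).encode("utf-8"),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

# Hypothetical endpoint and triple, for illustration only.
req = sparql_update_request(
    "http://example.org/sparql",
    [("<http://example.org/Berlin>",
      "<http://example.org/population>",
      '"3500000"')],
)
print(req.get_method(), req.full_url)
```

A read is the symmetric case: the same endpoint answers SPARQL queries, which is what makes the round-trip generic.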

To sum up -- I think a failure to realize the potential of RDF for
Wikidata would be a huge waste of resources for this project, for
Wikipedia, and for the general public.

Martynas
graphity.org

_______________________________________________
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l



--
Project director Wikidata
Wikimedia Deutschland e.V. | Eisenacher Straße 2 | 10777 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V. (Society for the Promotion of Free Knowledge). Registered in the register of associations of the Amtsgericht Berlin-Charlottenburg under number 23855 B. Recognized as charitable by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.