Martynas,

we have had this discussion on this list before, and once again I am irked by your claim that we could just use standard RDF tools out of the box for Wikidata.

I will shut up and concede that you are right if, before this year is over, you manage to set up a standard open-source RDF tool on an open-source stack that holds the full Wikidata knowledge base, keeps up with our rate of changes, and answers queries from the public without choking and dying for 24 hours. Announce the experiment on this list a few days in advance.

Technology has had three years to advance since we made the decision not to use standard RDF tools, so I am sure it should be much easier today. But the last time I talked to people writing such tools, they were rather cautious about our requirements.

Even then we would not have proven that it can handle the QPS Wikidata is expected to see, but heck, I would be surprised, and I would admit that my decision was wrong, if you can pull that off.

Seriously, we did not snub RDF and SPARQL because we don't like it or don't know it. We decided against it *because* we know it so well and we realized it does not fulfill our requirements.

Cheers,
Denny

On Mon Oct 27 2014 at 6:47:05 PM Martynas Jusevičius <martynas@graphity.org> wrote:
Hey all,

so I see there is some work being done on mapping the Wikidata data
model to RDF [1].

Just a thought: what if you had actually used RDF, with Wikidata's
concepts modeled in it, right from the start? And used standard RDF
tools, APIs, and the standard query language (SPARQL) instead of
building the whole thing from scratch?
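For concreteness, here is a minimal sketch of the kind of query a standard SPARQL endpoint over such a mapping could answer. The wd:/wdt: prefixes and the use of P31/Q5 are assumptions about how the mapping in [1] might expose direct claims, not a description of any existing service:

```sparql
# Hypothetical query: items that are instances of "human" (Q5),
# assuming wdt:P31 is how the mapping exposes "instance of" claims.
PREFIX wd:  <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>

SELECT ?item WHERE {
  ?item wdt:P31 wd:Q5 .
} LIMIT 10
```

Any off-the-shelf triple store that loads the RDF export should answer this kind of query without custom code.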

Is it just me or was this decision really a colossal waste of resources?


[1] http://korrekt.org/papers/Wikidata-RDF-export-2014.pdf

Martynas
http://graphityhq.com

_______________________________________________
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l