OK, thanks for your reply. We will watch for new developments and
incorporate them into our work as they are ready.
Keep up the good work on this important project!
On Fri, Sep 13, 2013 at 1:20 PM, Daniel Kinzler <daniel.kinzler(a)wikimedia.de> wrote:
On 13.09.2013 18:24, Benjamin Good wrote:
Even 500 seems like a very low limit for this system unless I'm
misunderstanding something. Unless there is another way to execute
queries that return more rows than that, this would rule out a
huge number of applications - all of ours in particular. If we want to,
say, request something like "all human genes" (about 20,000 items), how
would we do that?
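For context on what is possible today: `wbgetentities` is a real Wikibase API module that accepts up to 50 entity IDs per request, so a list of ~20,000 known IDs can at least be fetched in batches. The sketch below only builds the batched request URLs; where the ID list comes from (a dump, an external mapping) is left open, and the endpoint constant is just the usual Wikidata API URL.

```python
# Sketch: fetching a known list of entity IDs in wbgetentities-sized batches.
# Assumes you already have the IDs from elsewhere (e.g. a dump); this is not
# a substitute for real query support.
import urllib.parse

API = "https://www.wikidata.org/w/api.php"
BATCH = 50  # wbgetentities accepts at most 50 ids per request


def chunked(ids, size=BATCH):
    """Split a list of entity IDs into API-sized batches."""
    return [ids[i:i + size] for i in range(0, len(ids), size)]


def batch_url(batch):
    """Build a wbgetentities request URL for one batch of IDs."""
    params = {
        "action": "wbgetentities",
        "ids": "|".join(batch),
        "format": "json",
    }
    return API + "?" + urllib.parse.urlencode(params)
```

For 20,000 items this still means ~400 sequential requests, which is workable for a one-off sync but not for interactive queries.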
You are looking for actual *query* support, not just a "search by name".
That is on the road map, and I hope we will be able to deploy it by the end
of the year. But it's not possible yet.
Supporting queries like "all people born in Hamburg" or "all cities in ..."
is an obvious goal for Wikidata. And we are working on it, but it's not easy
to make this scale to the number of entries, queries, and different use cases
we are dealing with.
Within Wikipedia, we do this via the MediaWiki API, based on
contains-template or category queries, without any issue. Surely
Wikidata will be more useful for queries than raw MediaWiki?
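The category approach mentioned above relies on the standard MediaWiki API continuation protocol: `list=categorymembers` returns at most one page of results plus a `continue` block that is merged back into the next request's parameters. A minimal sketch, with the HTTP layer abstracted behind a `fetch` callable (a placeholder, not a library function) so the paging logic stands alone:

```python
# Sketch: paging through a category with list=categorymembers.
# `fetch` is any callable that takes a params dict and returns the
# decoded JSON response; the continuation handling is the real API protocol.
def category_members(fetch, category):
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmlimit": "500",
        "format": "json",
    }
    while True:
        data = fetch(params)
        for member in data["query"]["categorymembers"]:
            yield member["title"]
        if "continue" not in data:
            break
        # Merge the continuation token into the next request.
        params.update(data["continue"])
```

The same continuation pattern is what any bulk client against the MediaWiki API ends up implementing, whatever the list module.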
I'm certain I am missing something; please correct me.
This is currently standing in the way of our GSoC student completing his
summer project - due next week. A little disappointing for him.
Sorry, but we have never hidden the fact that our query interface is not
ready yet. wbsearchentities is a label lookup designed for find-as-you-type
suggestions. It's not a query interface, and was never supposed to be.
I understand the disappointment, but there is little we can do about this.
All I can suggest is working from a dump right now (and sadly, we only have
mediawiki's raw json-in-xml dumps at the moment. I'm working on native
RDF dumps, but they are not ready).
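Working from the raw json-in-xml dump means streaming the MediaWiki XML export and parsing the entity JSON embedded in each page's `<text>` element. A minimal sketch using only the standard library; tags are matched by local name because the export schema version (and thus the XML namespace URI) varies between dumps:

```python
# Sketch: pulling entity JSON out of a MediaWiki json-in-xml dump.
# Pages whose body is not valid JSON (e.g. non-entity pages) are skipped.
import json
import xml.etree.ElementTree as ET


def iter_entities(source):
    """Yield one parsed entity dict per <text> element in the dump."""
    for _event, elem in ET.iterparse(source):
        # Namespaced tags look like "{...export-0.8/}text"; compare the
        # local name so any schema version works.
        if elem.tag.rsplit("}", 1)[-1] == "text" and elem.text:
            try:
                yield json.loads(elem.text)
            except ValueError:
                pass  # page body was wikitext, not entity JSON
            elem.clear()  # keep memory bounded on multi-GB dumps
```

Streaming with `iterparse` and clearing elements as you go is what keeps this feasible on a full dump, which is far too large to load as a single tree.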
Wikidata-l mailing list