Hello all,
I'm thinking of setting up an installation of Wikibase here at my work to allow
folks to store data about things that they work on.
Pretty regularly I am tasked with creating databases and interfaces for
them, and I suspect that doing this will save me time, make things
easier for all of my users, and give us revision control for free.
I know how to use the Wikidata API to query for information on an individual
Wikidata item, but if I want to perform the equivalent of a SELECT statement
on a number of Wikidata items (for example to get a list of all items with
a particular Instance Of value), how would I go about that?
So far it's looking like I will need to install the Wikibase Query
Engine and the Ask library?
https://github.com/wmde/Ask
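(For context, the single-item lookups I mentioned look roughly like this. A minimal Python sketch that only builds the request URL for the standard `wbgetentities` API module rather than issuing it; the item ID is just an example.)

```python
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def entity_url(qid, props="labels|claims", languages="en"):
    """Build a wbgetentities request URL for a single item."""
    params = {
        "action": "wbgetentities",
        "ids": qid,
        "props": props,
        "languages": languages,
        "format": "json",
    }
    return API + "?" + urlencode(params)

# e.g. labels and claims for Douglas Adams (Q42)
print(entity_url("Q42"))
```

What I am after is the other direction: given a property value, get all matching items, which is exactly what a plain entity lookup like this cannot do.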
Any help would be appreciated, it looks like Wikibase documentation for reusers
is not terribly great yet. After I set things up I'm happy to write a post
on my website and create a page on Mediawiki.org to serve as a walkthrough for
future users.
Thank you,
Derric Atzrott
Computer Specialist
Alizee Pathology
Dear all,
I wanted to join in and give my birthday present to Wikidata (I am a
little bit late, though!)
(also, honestly, I didn't recall it was Wikidata's birthday, but it is
a nice occasion :P)
Here it is:
http://wikidataldf.com
What is LDF?
LDF stands for Linked Data Fragments, a new approach to querying RDF
datasets that sits midway between offering a SPARQL endpoint and
downloading the whole dump.
More formally LDF is «a publishing method [for RDF datasets] that
allows efficient offloading of query execution from servers to clients
through a lightweight partitioning strategy. It enables servers to
maintain availability rates as high as any regular HTTP server,
allowing querying to scale reliably to much larger numbers of
clients»[1].
This system was devised by Ruben Verborgh, Miel Vander Sande and Pieter
Colpaert at Multimedia Lab (Ghent University) in Ghent, Belgium.
You can read more about it: http://linkeddatafragments.org/
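The idea can be sketched in a few lines (a hypothetical Python sketch: a Triple Pattern Fragments server exposes fragments selected by `subject`/`predicate`/`object` query parameters, and the client decomposes a SPARQL query into such single-pattern requests; the base and predicate URLs below are only illustrative):

```python
from urllib.parse import urlencode

def fragment_url(base, subject=None, predicate=None, obj=None):
    """Build the URL of the fragment matching one triple pattern.

    A TPF server answers with the matching triples plus metadata
    (e.g. an estimated count) and hypermedia controls for paging;
    smart clients combine many such cheap requests to evaluate a
    full SPARQL query.
    """
    pattern = {"subject": subject, "predicate": predicate, "object": obj}
    params = {k: v for k, v in pattern.items() if v is not None}
    return base + "?" + urlencode(params)

# e.g. all triples with a given predicate (illustrative URLs)
print(fragment_url("http://wikidataldf.com/fragments",
                   predicate="http://schema.org/about"))
```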
What is Wikidata LDF?
Using the software by Verborgh et al. I have set up the website
http://wikidataldf.com which contains:
* an interface to browse the RDF data and query it using the Triple
Pattern Fragments client
* a web client where you can compose and execute SPARQL queries
This is not, strictly speaking, a SPARQL endpoint: not all of the SPARQL
standard is implemented, and it is slower, but it should be more
reliable (if you are interested in the details, please read more at the
link above).
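For example, here is a query you could paste into the web client (the vocabulary, `schema:about` for sitelinks, follows the Wikidata RDF exports; treat the exact prefixes as my assumption):

```python
# SPARQL query over the sitelinks data: Wikipedia articles that are
# about Douglas Adams (Q42). Prefixes follow the Wikidata RDF exports.
QUERY = """
PREFIX schema: <http://schema.org/>
PREFIX entity: <http://www.wikidata.org/entity/>

SELECT ?article WHERE {
  ?article schema:about entity:Q42 .
}
LIMIT 10
"""
print(QUERY)
```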
The data are, for the moment, limited to the sitelinks dump, but I am
working towards adding the other dumps. I have taken the Wikidata RDF
dumps as of October 13th, 2014[2].
To use them I had to convert them to HDT format[3a][3b] using the
hdt-cpp library[3c] (devel), which is taking quite a lot of resources
and computing time for the whole dumps; that's why I haven't published
the rest yet ^_^.
DBpedia also has this[4]:
http://fragments.dbpedia.org/
All the software used is available under the MIT license in the LDF
repos on GitHub[5a], and the (two-page) website is also available
here[5b].
I would like to thank Ruben for his feedback and his presentation
about LDF at SpazioDati in Trento, Italy (here are the slides[6]).
All this said, happy birthday Wikidata.
Cristian
[1] http://linkeddatafragments.org/publications/ldow2014.pdf
[2] https://tools.wmflabs.org/wikidata-exports/rdf/exports/
[3a] http://www.rdfhdt.org/
[3b] http://www.w3.org/Submission/HDT-Implementation/
[3c] https://github.com/rdfhdt/hdt-cpp
[4] http://sourceforge.net/p/dbpedia/mailman/message/32982329/
[5a] see the Browser.js, Server.js and Client.js repos in
https://github.com/LinkedDataFragments
[5b] https://github.com/CristianCantoro/wikidataldf
[6] http://www.slideshare.net/RubenVerborgh/querying-datasets-on-the-web-with-h…
Hi Scott,
Unfortunately the tutorial was not streamed.
However, here is the link to the etherpad containing the links to both
the slides and the demos:
https://pad.okfn.org/p/DBpediaULI
Hope this helps!
Cheers,
Marco
On 11/4/14, 4:00 AM, wikidata-l-request(a)lists.wikimedia.org wrote:
> Hi Marco and friends,
> Any chance you're streaming this to the web, Marco? If so, what's the
> URL please?
> Thanks,
> Scott
>
> On Mon, Nov 3, 2014 at 9:45 AM, Marco Fossati <hell.j.fox(a)gmail.com> wrote:
>> Hi folks,
>>
>> This is just to let you know that I will run live demos for the tutorial
>> at the Unicode conference today between 1 p.m. and 2.30 p.m. Pacific
>> Standard Time.
>> I kindly ask you not to overload the DBpedia endpoint (especially
>> http://dbpedia.org/sparql).
>> Also, I kindly ask the OpenLink guys to keep an eye on it if something is
>> not responding well.
>> Thanks for your collaboration!
>>
>> Cheers,
>> --
>> Marco Fossati
>> http://about.me/marco.fossati
>> Twitter: @hjfocs
>> Skype: hell_j
--
Marco Fossati
http://about.me/marco.fossati
Twitter: @hjfocs
Skype: hell_j
Hello,
Claims with surname properties sometimes seem impossible to add. For
instance, I tried to add the surname of René Grivart
<https://www.wikidata.org/wiki/Q18330867>, but as I typed "Grivart" as
the property value, the "save" link stayed unclickable. Should we
systematically create an item for the surname?
Thanks,
--
Jean-Baptiste Pressac
Traitement et analyse de bases de données
Production et diffusion de corpus numériques
Centre de Recherche Bretonne et Celtique
Unité mixte de service (UMS) 3554
20 rue Duquesne
CS 93837
29238 Brest cedex 3
tel : +33 (0)2 98 01 68 95
fax : +33 (0)2 98 01 63 93
Hi folks,
This is just to let you know that I will run live demos for the tutorial
at the Unicode conference today between 1 p.m. and 2.30 p.m. Pacific
Standard Time.
I kindly ask you not to overload the DBpedia endpoint (especially
http://dbpedia.org/sparql).
Also, I kindly ask the OpenLink guys to keep an eye on it if something
is not responding well.
Thanks for your collaboration!
Cheers,
--
Marco Fossati
http://about.me/marco.fossati
Twitter: @hjfocs
Skype: hell_j
Hello,
Is there a way to retrieve the aliases of items and properties with the
Wikidata Query API?
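So far I can at least build the request I would expect to use, assuming the standard `wbgetentities` module accepts `props=aliases` (a Python sketch with no network call; the hard-coded JSON excerpt stands in for a real response and only illustrates the shape):

```python
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def aliases_url(ids, languages="en|fr"):
    """Request URL fetching only the aliases of the given entities."""
    return API + "?" + urlencode({
        "action": "wbgetentities",
        "ids": "|".join(ids),
        "props": "aliases",
        "languages": languages,
        "format": "json",
    })

def extract_aliases(response, entity_id, language):
    """Pull the alias strings for one entity and language out of a
    wbgetentities-style JSON response."""
    entity = response["entities"][entity_id]
    return [a["value"] for a in entity.get("aliases", {}).get(language, [])]

# Illustrative response excerpt (stand-in for a real API reply):
sample = {"entities": {"Q42": {"aliases": {
    "en": [{"language": "en", "value": "Douglas Noel Adams"}]}}}}
print(extract_aliases(sample, "Q42", "en"))  # ['Douglas Noel Adams']
```

But whether the Wikidata Query tool itself can return aliases is exactly what I am unsure about.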
Thanks,
--
Jean-Baptiste Pressac
Traitement et analyse de bases de données
Production et diffusion de corpus numériques
Centre de Recherche Bretonne et Celtique
Unité mixte de service (UMS) 3554
20 rue Duquesne
CS 93837
29238 Brest cedex 3
tel : +33 (0)2 98 01 68 95
fax : +33 (0)2 98 01 63 93
Hi Wikidata crew,
Google Code-in (GCI) will soon take place again - a contest in which
students aged 13-17 contribute to free software projects.
Wikimedia wants to take part again.
Last year's GCI results were surprisingly good - see
https://www.mediawiki.org/wiki/Google_Code-in_2013
We need your help:
1) Go to
https://www.mediawiki.org/wiki/Google_Code-in_2014#Mentors.27_corner and
read the information there. If something is unclear, ask!
2) Add yourself to the table of mentors on
https://www.mediawiki.org/wiki/Google_Code-in_2014#Contacting_Wikimedia_men…
- the more mentors are listed, the better our chances that Google
will accept us.
3) Please take ten minutes and go through open recent tickets in
https://bugzilla.wikimedia.org in your area of interest. If you see
self-contained, non-controversial issues with a clear approach which you
can recommend to new developers and would mentor: Add the task to
https://www.mediawiki.org/wiki/Google_Code-in_2014#Proposed_tasks
Until Sunday November 12th, we need at least five tasks from each of
these categories (plus some less technical beginner tasks as well):
* Code: Tasks related to writing or refactoring code
* Documentation/Training: Tasks related to creating/editing documents
and helping others learn more - no translation tasks
* Outreach/research: Tasks related to community management,
outreach/marketing, or studying problems and recommending solutions
* Quality Assurance: Tasks related to testing and ensuring code is of
high quality
* User Interface: Tasks related to user experience research or user
interface design and interaction
Google wants every organization to have 100+ tasks available on December
1st. Last year, we had 273 tasks in the end.
Note that you could also create rather generic tasks, for example fixing
two interface messages from the list of dependencies of
https://bugzilla.wikimedia.org/show_bug.cgi?id=38638
Helpful Bugzilla links:
* Reports that were proposed for GCI last year and are still open:
https://bugzilla.wikimedia.org/buglist.cgi?quicksearch=ALL%20whiteboard%3Ag…
* Open Wikidata tickets created in the last six months (if I got your
products and components right):
https://bugzilla.wikimedia.org/buglist.cgi?bug_status=UNCONFIRMED&bug_statu…
* 8 existing Wikidata "easy" tickets (are they still valid? Are they
really self-contained, non-controversial issues with a clear approach?
Could some of them be GCI tasks that you would mentor? If so, please tag
them as described above!):
https://bugzilla.wikimedia.org/buglist.cgi?bug_status=UNCONFIRMED&bug_statu…
Could you imagine mentoring some of these tasks?
Thank you for your help in reaching out to new contributors and making
GCI a success again! Please ask if you have questions.
Cheers,
andre
PS: And in a future Phabricator world, Bugzilla tickets with the 'easy'
keyword will become Phabricator tasks with the 'easy' project.
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/