Dear all,
Last year, we applied for a Wikimedia grant to feed qualified data from Wikipedia
infoboxes (i.e. missing statements, with references) via the DBpedia software into
Wikidata. The evaluation was already quite good, but some parts were still missing, and we
would like to ask for your help and feedback for the next round. The new application is
here:
https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/GlobalFactSync
The main purpose of the grant is:
- Wikipedia infoboxes are quite rich: they are manually curated and carry references.
DBpedia already extracts this data quite well (i.e. there is no other software that does
it better). However, extracting the references has not been a priority on our agenda:
they would be very useful to Wikidata, but there are no requests for them from DBpedia
users.
- DBpedia also holds the information from all infoboxes of all Wikipedia editions (>10k
pages), so we know quite well where Wikidata is already used, and where information is
available in Wikidata or in one language version but missing in another (see the sketch
after this list).
- side-goal: bring the Wikidata, Wikipedia and DBpedia communities closer together
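To illustrate what we mean by comparing coverage, here is a rough sketch (in Python,
using SPARQLWrapper) that counts how many resources carry a given property on two public
DBpedia SPARQL endpoints. The endpoints and the example property dbo:populationTotal are
chosen for illustration only; this is not part of the proposal code:

    # Rough coverage check across two DBpedia language editions; assumes the
    # public SPARQL endpoints are reachable. dbo:populationTotal is an example.
    from SPARQLWrapper import SPARQLWrapper, JSON

    def count_subjects(endpoint: str, prop: str) -> int:
        """Count distinct subjects carrying the given property at an endpoint."""
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery(f"SELECT (COUNT(DISTINCT ?s) AS ?n) WHERE {{ ?s <{prop}> ?o }}")
        sparql.setReturnFormat(JSON)
        result = sparql.query().convert()
        return int(result["results"]["bindings"][0]["n"]["value"])

    prop = "http://dbpedia.org/ontology/populationTotal"
    for endpoint in ("http://dbpedia.org/sparql", "http://de.dbpedia.org/sparql"):
        print(endpoint, count_subjects(endpoint, prop))

A real comparison would of course work on aligned entities rather than aggregate counts;
this only shows the relative coverage of each edition.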
Here is a diff between the old and the new proposal:
- extraction of infobox references will still be a goal of the reworked proposal
- we have been working on the fusion and data comparison engine (the part of the budget
that came from our side) for a while now, and there are first results (triple counts per
output file):
  6823 birthDate_gain_wiki.nt
  3549 deathDate_gain_wiki.nt
362541 populationTotal_gain_wiki.nt
372913 total
For now, we took only three properties and computed the gain, i.e. statements for which
no Wikidata statement was available yet. Coverage of birthDate/deathDate is already
quite good. Details here:
https://drive.google.com/file/d/1j5GojhzFJxLYTXerLJYz3Ih-K6UtpnG_/view?usp=…
Our plan here is to map all Wikidata properties to the DBpedia Ontology, which will give
us the information needed to compare the coverage of Wikidata with that of all infoboxes
across languages (a minimal sketch of the gain computation follows at the end of this
list).
- we will remove the text extraction part from the old proposal (which is here for your
reference: https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/CrossWikiFact). This
will still be a focus of our work in 2018, together with Diffbot and the new DBpedia
NLP department, but we think that it distracted from the core of the proposal. Results
from the Wikipedia article text extraction can be added later, once they are available,
and discussed separately.
- we propose to build a separate website that helps to synchronize all Wikipedias and
Wikidata, with DBpedia as its backend. While an external website is not an ideal
solution, we lack alternatives. The Primary Sources Tool is mainly for importing data
into Wikidata, not so much for synchronization, and the MediaWiki instances of the
Wikipedias do not seem to offer any good interfaces for providing suggestions and
pinpointing missing information. Especially for this part, we would like to ask for your
help and suggestions, either by mail to the list or on the talk page:
https://meta.wikimedia.org/wiki/Grants_talk:Project/DBpedia/GlobalFactSync
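As referenced in the list above, here is a minimal sketch of the gain computation,
assuming plain N-Triples dumps as input. The file names, the P569 -> dbo:birthDate
mapping, and the pre-alignment of subjects between DBpedia and Wikidata are all
simplifying assumptions; the actual fusion engine is more involved:

    # Minimal gain computation: keep infobox facts that Wikidata does not
    # have yet. File names and the property mapping are illustrative.
    from rdflib import Graph, Namespace

    DBO = Namespace("http://dbpedia.org/ontology/")
    WDT = Namespace("http://www.wikidata.org/prop/direct/")

    # Hypothetical mapping from Wikidata properties to the DBpedia Ontology.
    PROPERTY_MAP = {WDT.P569: DBO.birthDate}  # P569 = date of birth

    dbpedia = Graph().parse("dbpedia_birthDate.nt", format="nt")    # infobox data
    wikidata = Graph().parse("wikidata_birthDate.nt", format="nt")  # existing statements

    gain = Graph()
    for wd_prop, dbo_prop in PROPERTY_MAP.items():
        for subject, _, value in dbpedia.triples((None, dbo_prop, None)):
            # Keep the fact only if Wikidata has no statement for it yet;
            # subjects are assumed to be already aligned across both graphs.
            if (subject, wd_prop, None) not in wikidata:
                gain.add((subject, dbo_prop, value))

    gain.serialize(destination="birthDate_gain_wiki.nt", format="nt")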
We look forward to a fruitful collaboration with you, and we thank you for your
feedback!
All the best
Magnus
--
Magnus Knuth
Universität Leipzig
Institut für Informatik
Abt. Betriebliche Informationssysteme, AKSW/KILT
Augustusplatz 10
04109 Leipzig DE
mail: knuth@informatik.uni-leipzig.de
tel: +49 177 3277537
webID:
http://magnus.13mm.de/