Would the Wikidata community be interested in linking Wikidata pages with Freebase entities? I've proposed a new property to link the two. Freebase already has interwiki links for many entities, so it wouldn't be too hard to automatically determine the corresponding Wikidata pages. This would allow people to mash up both datasets and cross-reference facts more easily.
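For illustration, here is a minimal sketch of how such a mapping could be bootstrapped, assuming the Freebase side is first reduced to English Wikipedia titles; it resolves titles to item IDs in batches via the wbgetentities API (the sample titles are placeholders, not real Freebase data):

    import json
    import urllib.parse
    import urllib.request

    API = "https://www.wikidata.org/w/api.php"

    def wikidata_ids_for_enwiki_titles(titles):
        """Resolve English Wikipedia titles to Wikidata item IDs (Q-numbers)."""
        params = urllib.parse.urlencode({
            "action": "wbgetentities",
            "sites": "enwiki",
            "titles": "|".join(titles),  # the API accepts up to 50 titles per call
            "props": "sitelinks",
            "format": "json",
        })
        with urllib.request.urlopen(API + "?" + params) as resp:
            data = json.load(resp)
        mapping = {}
        for qid, entity in data.get("entities", {}).items():
            if qid.startswith("Q"):  # missing titles come back under key "-1"
                mapping[entity["sitelinks"]["enwiki"]["title"]] = qid
        return mapping

    # Placeholder titles standing in for the Wikipedia keys of Freebase topics.
    print(wikidata_ids_for_enwiki_titles(["Berlin", "Douglas Adams"]))

Once the proposed property exists, a bot could use the resulting title-to-Q-ID map to fill it in.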
Knowledge Developer Relations
In the past weeks the community of the Dutch Wikipedia has been working hard to solve interwiki conflicts. A few months ago we had more than 14,000 interwiki conflicts; today there are fewer than 10,800.
While fixing these interwiki conflicts, we often also fix them on other wikis. But are other Wikipedias also actively working through interwiki conflicts at this scale?
Heya folks :)
Here's your weekly dose of Wikidata updates:
Have a great weekend!
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Technical Projects
Wikimedia Deutschland e.V.
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht Berlin-Charlottenburg under number 23855 Nz. Recognized as a non-profit by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
I think Poland may do better than average because Polish people, out of national pride, have made a special effort to be well documented in the English Wikipedia and to represent a Polish point of view on topics like the city of
One fascinating thing about Wikidata is that it provides access to all of the wonderful concepts shared in the Wikiverse, so now sites like Ookaboo can collect pictures of many beautiful places that don't exist in the English Wikipedia.
On the other hand, I'm also interested in the other end of the curve: those elite concepts which are represented widely across the Wikipedias. Surely this is connected with subjective importance, with some flavor of "global" appeal, whatever that turns out to mean. Any chance you could run a report on those?
From: Mathieu Stumpf
Sent: Thursday, June 13, 2013 4:51 AM
Subject: Re: [Wikidata-l] Visualisations of The Most Unique Wikipedias According to Wikidata
On 2013-06-12 22:22, Klein, Max wrote:
> Hello Wikidatians,
> I made a few visualizations of the distributions of language links
> in Wikidata Items. You can also use these stats to see which Items
> represent Wikipedia articles that are unique to a language and
> compare the uniqueness of all languages. I also investigate all the
> items with just two language links, to look at Wikipedia "pairs".
> See the full analysis:
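For readers who want to reproduce the basic counts, here is a minimal sketch, assuming the sitelinks have already been extracted into a file with one JSON array of site codes per item; the file name and format are assumptions of this sketch, not part of the original analysis:

    import json
    from collections import Counter

    unique_per_language = Counter()  # items whose only sitelink is in this language
    pair_counts = Counter()          # language pairs on items with exactly two sitelinks

    # Hypothetical input: one JSON array of site codes per item, e.g. ["enwiki", "plwiki"]
    with open("item_sitelink_sites.jsonl", encoding="utf-8") as f:
        for line in f:
            sites = sorted(json.loads(line))
            if len(sites) == 1:
                unique_per_language[sites[0]] += 1
            elif len(sites) == 2:
                pair_counts[tuple(sites)] += 1

    print(unique_per_language.most_common(10))  # the most "unique" Wikipedias
    print(pair_counts.most_common(10))          # the most common Wikipedia "pairs"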
Interesting! Could you also create that kind of visualisation by topic: how much of the uniqueness comes from biographies of local football players, compared with historical events or abstract concepts?
Also, on a completely unrelated topic, perhaps you could explain to me in private what you mean by "Create a communal house to live in", which is on your public todo list; it sounds interesting. :P
I started working on the DBpedia release and just wanted to check what the current status of the Wikidata dumps is. I saw that RDF data and RDF URIs like http://www.wikidata.org/entity/Q1 are already available. Cool! Do you think there will be RDF dumps soon, i.e. in the next few weeks?
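For a single entity, the RDF is already dereferenceable; a minimal sketch, assuming the linked-data interface at Special:EntityData serves Turtle under a .ttl suffix (that suffix is an assumption of this sketch):

    import urllib.request

    # Dereference one entity's RDF. The entity URI http://www.wikidata.org/entity/Q1
    # redirects to Special:EntityData; the .ttl suffix requesting Turtle is assumed here.
    url = "https://www.wikidata.org/wiki/Special:EntityData/Q1.ttl"
    with urllib.request.urlopen(url) as resp:
        print(resp.read().decode("utf-8")[:500])  # show the first 500 characters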
If not, could you guys prepare a dump of the sitelinks table, as you suggested below? If it's not too much effort, it would be great if you could generate CSV or a similarly simple format. We won't load the data into a database; we just extract it, and otherwise we would have to write a parser for SQL INSERT statements. CSV would be much simpler.
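To make the trade-off concrete, here is a rough sketch of what consuming a SQL dump of the sitelinks table would involve, assuming rows of (row_id, item_id, site_id, site_page) as in wb_items_per_site; the dump file name follows the usual dumps.wikimedia.org naming but is an assumption of this sketch:

    import csv
    import gzip
    import re
    import sys

    # Each INSERT line carries tuples like (1,2,'enwiki','Berlin').
    ROW = re.compile(r"\((\d+),(\d+),'((?:[^'\\]|\\.)*)','((?:[^'\\]|\\.)*)'\)")

    writer = csv.writer(sys.stdout)
    writer.writerow(["item_id", "site_id", "site_page"])

    with gzip.open("wikidatawiki-latest-wb_items_per_site.sql.gz", "rt",
                   encoding="utf-8", errors="replace") as f:
        for line in f:
            if not line.startswith("INSERT INTO"):
                continue
            for _row_id, item_id, site_id, page in ROW.findall(line):
                writer.writerow(["Q" + item_id,
                                 site_id.replace("\\'", "'"),
                                 page.replace("\\'", "'")])

Even this simplified version has to cope with MySQL quoting and escaping; a plain CSV dump would avoid that entirely.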
Thanks a lot for your help!
On 4 May 2013 23:36, Daniel Kinzler <daniel.kinzler(a)wikimedia.de> wrote:
> On 04.05.2013 19:13, Jona Christopher Sahnwaldt wrote:
>> We will produce a DBpedia release pretty soon; I don't think we can
>> wait for the "real" dumps. The inter-language links are an important
>> part of DBpedia, so we have to extract data from almost all Wikidata
>> items. I don't think it's sensible to make ~10 million calls to the
>> API to download the external JSON format, so we will have to use the
>> XML dumps and thus the internal format.
> Oh, if it's just the language links, this isn't an issue: there's an additional
> table for them in the database, and we'll soon be providing a separate dump of
> that table at http://dumps.wikimedia.org/wikidatawiki/
> If it's not there when you need it, just ask us for a dump of the sitelinks
> table (technically, wb_items_per_site), and we'll get you one.
>> But I think it's not a big
>> deal that it's not that stable: we parse the JSON into an AST anyway.
>> It just means that we will have to use a more abstract AST, which I
>> was planning to do anyway. As long as the semantics of the internal
>> format remain more or less the same - it will contain the labels,
>> the language links, the properties, etc. - it's no big deal if the
>> syntax changes, even if it's not JSON anymore.
> Yes, if you want the labels and properties in addition to the links, you'll have
> to do that for now. But I'm working on the "real" data dumps.
> -- daniel