Hey folks,
we plan to drop the wb_entity_per_page table sometime soon [0], because
it is simply not required (we will most likely always have a programmatic
mapping from entity id to page title), and as it stands it does not
support non-numeric entity ids. Because of this, its removal is a blocker
for the Commons metadata work.
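To illustrate why the table is redundant: the mapping from entity id to page title is essentially this simple (a sketch only; the namespace prefixes mirror how item and property pages are named, and anything beyond items and properties is left out):

```python
# Sketch: deriving a page title directly from an entity id,
# so no lookup table is needed. Prefixes are illustrative.
TITLE_PREFIX = {"Q": "", "P": "Property:"}

def entity_id_to_title(entity_id):
    """Map an id like 'Q42' or 'P31' to its wiki page title."""
    kind = entity_id[:1].upper()
    if kind not in TITLE_PREFIX:
        raise ValueError("unsupported entity id: %r" % entity_id)
    return TITLE_PREFIX[kind] + entity_id.upper()

print(entity_id_to_title("Q42"))   # Q42
print(entity_id_to_title("P31"))   # Property:P31
```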
Is anybody using that table for their tools (on Tool Labs)? If so, please
tell us, so that we can give you instructions and a longer grace period
to update your scripts.
Cheers,
Marius
[0]: https://phabricator.wikimedia.org/T95685
Hoi,
Jura1 created a wonderful list of people who died in Brazil in 2015 [1].
It is a page that updates regularly from Wikidata thanks to ListeriaBot.
Obviously, there may be a few more entries to come, because I am falling
ever further behind in my quest to register deaths in 2015.
I have copied his work and created a page for people who died in the
Netherlands in 2015 [2]. It is trivially easy to do, and the result looks
great. The same approach can be used for any country on any Wikipedia.
The Dutch Wikipedia indicated that they nowadays maintain important
metadata at Wikidata. I am really happy that we can showcase their work.
It is important work because, as someone reminded me at some stage, it is
part of what amounts to the policy on biographies of living people...
Thanks,
GerardM
[1] https://www.wikidata.org/wiki/User:Jura1/Recent_deaths_in_Brazil
[2] https://www.wikidata.org/wiki/User:Jura1/Recent_deaths_in_the_Netherlands
Hey all,
Lydia (or somebody operating the @Wikidata handle :) posted this question
on Twitter and a few great ideas started trickling in
<https://twitter.com/wikidata/status/708384895375163392>.
I went ahead and created an AllOurIdeas poll <https://t.co/IbsBmY6Kpg>,
seeded with the first ideas posted on Twitter, to crowdsource the
generation of new ideas and produce a robust ranking.
If you're unfamiliar with AllOurIdeas, it's an open consultation engine
allowing people to choose which idea they like best via pairwise
comparisons (I am cc'ing Matt Salganik, the project lead). It's very
simple on the surface, but it uses algorithms such as the Condorcet method
<https://en.wikipedia.org/wiki/Condorcet_method> to test how strongly each
idea performs against the others. It also reduces the weighting of the
oldest ideas, creating a level playing field for newly added ideas and
preventing gaming or self-promotion of one's own ideas.
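As a rough illustration of ranking from pairwise comparisons (not AllOurIdeas' actual algorithm, and with made-up idea names), a Condorcet-style ranking can be sketched with the Copeland method:

```python
from collections import defaultdict
from itertools import combinations

def copeland_ranking(pairwise_votes):
    """Rank ideas by Copeland score: +1 for each rival an idea beats
    in head-to-head majorities, -1 for each rival it loses to.
    pairwise_votes: iterable of (winner, loser) tuples."""
    wins = defaultdict(int)   # wins[(a, b)] = times a beat b directly
    ideas = set()
    for winner, loser in pairwise_votes:
        wins[(winner, loser)] += 1
        ideas.update((winner, loser))
    score = {idea: 0 for idea in ideas}
    for a, b in combinations(sorted(ideas), 2):
        if wins[(a, b)] > wins[(b, a)]:
            score[a] += 1; score[b] -= 1
        elif wins[(b, a)] > wins[(a, b)]:
            score[b] += 1; score[a] -= 1
    return sorted(ideas, key=lambda idea: -score[idea])

# Hypothetical votes from pairwise "which do you prefer?" questions.
votes = [("maps", "games"), ("maps", "quizzes"),
         ("quizzes", "games"), ("maps", "games")]
print(copeland_ranking(votes))  # ['maps', 'quizzes', 'games']
```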
Try it out or post new ideas: the more votes it gets, the higher the
confidence of the ranking. Real-time results and statistics are here
<http://www.allourideas.org/wikidata/results>.
Dario
Hi all
I am trying to query Wikidata for entities whose labels match a regex. I
am new to the SPARQL world, so could you please help me with it? Here is
what I have so far:
https://gist.github.com/anonymous/2810eb5747e51a9ae746183a43f20771
But I don't think it is the right way. Any help would be much appreciated.
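Roughly, I imagine the query should have a shape like this (untested; I also read that an unrestricted regex over all labels tends to time out, so the items probably need to be restricted first, here with an example P31 filter):

```sparql
SELECT ?item ?label WHERE {
  ?item wdt:P31 wd:Q5 .               # example restriction: humans only
  ?item rdfs:label ?label .
  FILTER(LANG(?label) = "en")         # one language keeps it manageable
  FILTER(REGEX(?label, "^Ada", "i"))  # case-insensitive regex on the label
}
LIMIT 50
```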
Thanks
Hey everyone :)
We'll be doing another office hour on IRC for all things Wikidata on April
12th at 4PM UTC.
https://www.timeanddate.com/worldclock/fixedtime.html?hour=16&min=00&sec=0&…
shows the time in your timezone.
I'll give an overview of what's been happening over the past 3 months and
give an update on what's coming up. We'll have time for questions as well.
If you have any topics you'd like to bring up please let me know. As always
there will be logs for people who can't attend.
Hope to see many of you there.
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
Hello all,
The Photographers' Identities Catalog (PIC) is an ongoing project of
visualizing photo history through the lives of photographers and photo
studios. I have information on 115,000 photographers and studios as of
tonight. It is still under construction, but as I've almost completed an
initial indexing of the ~12,000 photographers in Wikidata, I thought I'd
share it with you. We (the New York Public Library) hope to launch it
officially in mid-to-late January. This represents about 12 years' worth
of my research in NYPL's photography collection, censuses and business
directories, and scraping or indexing trusted websites, databases, and
published biographical dictionaries pertaining to photo history.
Again, please bear in mind that our programmer is still hard at work (and
I continue to refine and add to the data*), but we welcome your feedback,
questions, critiques, etc. To see the Wikidata photographers, select
Wikidata from the Source dropdown. Have fun!
*PIC*
<http://mgiraldo.github.io/pic/?address.AddressTypeID=*&address.CountryID=*&…>
Thanks,
David
*Tomorrow, for instance, I'll start mining Wikidata for birth & death
locations.
Citations and references are the building blocks of Wikimedia projects.
However, as of today, they are still treated as second-class citizens.
Structured databases such as Wikidata offer a unique opportunity
<https://www.wikidata.org/wiki/Wikidata:WikiProject_Source_MetaData> to
turn into reality over a decade of endeavors to build the sum of all
citations and bibliographic metadata into a centralized repository. To
coordinate upcoming work in this space, we're organizing a technical event
in late May and opening up applications for prospective participants.
*WikiCite 2016 <https://meta.wikimedia.org/wiki/WikiCite_2016>* is a
hands-on event focused on designing data models and technology to *improve
the coverage, quality, standards-compliance and machine-readability of
citations and source metadata in Wikipedia, Wikidata and other Wikimedia
projects*. Our goal, in particular, is to define a technical roadmap for
building a repository of all Wikimedia references in Wikidata.
We are bringing together Wikidatans, Wikipedians, software engineers, data
modelers, and information and library science experts from organizations
including *Crossref*, *Zotero*, *CSL*, *ContentMine*, *Google*, *Datacite*,
*NISO*, *OCLC* and the *NIH*. We are also inviting academic researchers
with experience working with Wikipedia's citations and bibliographic data.
WikiCite will be hosted in *Berlin* on *May 25-26, 2016*. Participation
in the event is capped at about 50 participants, and we expect to have a
number of open slots for applicants:
- if you were pre-invited and have already filled in a form, you will
receive a separate note from the organizers
- if you have not been invited but would like to participate, please
fill in this application form <http://goo.gl/forms/Yv6rve2wCt> to give
us some information about yourself and your interest in, and expected
contribution to, the event.
Please help us pass this on to anyone who has done important technical work
on Wikimedia references and citations.
*Important dates*
- *March 29, 2016*: applications open
- *April 11, 2016*: applications close
- *April 15, 2016*: notifications of acceptance are issued (if you
applied for a travel grant, we'll be able to confirm by this date if we can
cover the costs of your trip)
For any questions, you can contact the organizing committee:
wikicite(a)wikimedia.org
The organizers,
Dario Taraborelli
Jonathan Dugan
Lydia Pintscher
Daniel Mietchen
Cameron Neylon
*Dario Taraborelli *Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter
<http://twitter.com/readermeter>
Hey folks,
we are currently working on the final steps towards making it possible for
Wikimedia Commons to make use of all data in Wikidata in the user's
language.
As of last week, we have enabled access to all data on beta Wikidata for
beta Wikimedia Commons (http://commons.wikimedia.beta.wmflabs.org/). This
makes it possible to test the new functionality and evaluate it before
enabling it on the actual Commons. If you are working on templates or Lua
modules on Commons, or are generally interested in playing around with
the new functionality, please do so on beta Commons.
The data access Commons will get is the "arbitrary access" that is
already deployed to various sister projects, including all Wikipedias.
Commons is a special case because, like Wikidata, it is multilingual.
Unlike Wikipedia or any of the other wikis that currently have arbitrary
access enabled, Commons will get data access in the user's language
rather than in a defined content language. That means all functionality
we provide for making use of the data on Wikidata will use the user's
language for localization, so that each user can see the content in
their preferred language.
In detail, that means the {{#property:…|from=Q1234}} parser function will
output labels, dates, … formatted in the user's language. The same
applies to the Lua functions we provide for advanced use cases. A
detailed reference of the Lua functionality we provide can be found at
https://www.mediawiki.org/wiki/Extension:Wikibase_Client/Lua.
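As an illustration, a minimal Lua module on (beta) Commons might look like this (a sketch only; getLabel is documented on the Extension:Wikibase_Client/Lua page linked above, and the item id is a placeholder):

```lua
-- Minimal Scribunto module sketch for (beta) Commons.
local p = {}

function p.label(frame)
    -- The item id (e.g. "Q1234") is passed as the first argument.
    local id = frame.args[1]
    -- On a multilingual wiki such as Commons, getLabel returns the
    -- label in the user's language.
    return mw.wikibase.getLabel(id) or id
end

return p
```

It would then be invoked like any other module, e.g. {{#invoke:MyModule|label|Q1234}} (module name hypothetical).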
It'd be great if those of you familiar with both Commons and Wikidata could
test this on beta Commons and let me know if you encounter any major
issues. If there are none, we'll go ahead and find a date to switch it on
when I am back from the Wikimedia hackathon on the 11th of April.
Cheers,
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de