Hey folks,
we plan to drop the wb_entity_per_page table sometime soon[0], because
it is simply not required (we will likely always have a programmatic
mapping from entity id to page title) and, as it stands, it does not
support non-numeric entity ids. Because of this, removing it is a
blocker for the Commons metadata work.
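To illustrate what such a programmatic mapping looks like, here is a minimal sketch, assuming the standard Wikidata namespace conventions (items in the main namespace, properties under "Property:"); this is a hypothetical helper, not the actual Wikibase code:

```python
# Hypothetical sketch of an entity-id -> page-title mapping (not the
# real Wikibase implementation). On Wikidata, item pages are titled by
# their bare id (e.g. "Q42"), while properties live under "Property:".
ENTITY_NAMESPACES = {
    "Q": "",           # items: page title is just the id, e.g. "Q42"
    "P": "Property:",  # properties: e.g. "Property:P31"
}

def entity_id_to_page_title(entity_id: str) -> str:
    """Map an entity id like 'Q42' or 'P31' to its wiki page title."""
    entity_id = entity_id.upper()
    prefix = ENTITY_NAMESPACES.get(entity_id[:1])
    if prefix is None:
        raise ValueError("Unknown entity id prefix: " + entity_id)
    return prefix + entity_id
```

Because the title is derivable from the id alone, a lookup table adds nothing, which is the point of the removal.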
Is anybody using that table in their tools (on Tool Labs)? If so,
please tell us so that we can give you instructions and a longer grace
period to update your scripts.
Cheers,
Marius
[0]: https://phabricator.wikimedia.org/T95685
Hi everyone!
I am trying to configure Wikibase on a wiki that I have installed. I
want to use both the repository and the client. I have followed the
installation instructions
<https://www.mediawiki.org/wiki/Wikibase/Installation>, but I think I
did something wrong, because now I get several errors on my wiki.
If I access any page of the site I see "Internal error", the error
message and the backtrace. For example, on the main page:
[e837b2d7a6fcc7e1edaead2a] /Portada Error from line 240 of
/[PATH]/extensions/Wikibase/client/includes/Store/Sql/DirectSqlStore.php:
Class 'Wikimedia\Rdbms\SessionConsistentConnectionManager' not found
It also happens when I try to edit any page in the main namespace.
Another error appears when I visit the page Special:CreateItem:
[1d904eb6d5d155e0d0c710ae] /Especial:CreateItem TypeError from line
32 of
/[PATH]/extensions/Wikibase/repo/includes/Store/Sql/SqlIdGenerator.php:
Argument 1 passed to Wikibase\SqlIdGenerator::__construct() must be an
instance of Wikimedia\Rdbms\LoadBalancer, instance of LoadBalancer
given, called in
/[PATH]/extensions/Wikibase/repo/includes/Store/Sql/SqlStore.php on
line 298
You can check my settings for Wikibase in this gist
<https://gist.github.com/distriker/6a387d967a7b48e36f41055f00c3c366>.
What I have done is configure both extensions in the same file, and the
namespaces as explained in the advanced configuration
<https://www.mediawiki.org/wiki/Extension:Wikibase_Repository#Advanced_Confi…>.
I have PHP 7 configured.
Any idea? Thanks in advance!
Regards, Iván
--
Iván Hernández Cazorla.
Estudiante del Grado de Historia en la *Universidad de Las Palmas de
Gran Canaria*.
Socio de *Wikimedia España*.
Sitio web personal <http://distriker.com>.
Hoi,
Much of the content of DBpedia and Wikidata has the same origin:
harvesting data from a Wikipedia. There is a lot of discussion going on
about quality, and one point that I make is that comparing "sources"
and concentrating on the differences, particularly where statements
differ, is where it is easiest to make a quality difference.
So, given that DBpedia harvests both Wikipedia and Wikidata, can it
provide us with a view of where a Wikipedia statement and a Wikidata
statement differ?
To make this useful, it is important to subset the data. I will not
start with 500,000 differences; I will begin with a subset that I care
about.
When I care about entries for alumni of a university, I will consider
curating the information in question, particularly when I know the
language of the Wikipedia.
Once we can do this, another thing that will promote the use of a tool
like this is storing the numbers regularly (say once a month) and
publishing the trends.
How difficult is it to come up with something like this? I know this
tool would be based on DBpedia, but there are several reasons why that
is good. First, it gives added relevance to DBpedia (without detracting
from Wikidata), and secondly, as DBpedia updates on RSS changes for
several Wikipedias, the effect of those changes is quickly noticed when
a new set of data is requested.
Please let us know what the issues are and what it takes to move forward
with this. Does this make sense?
Thanks,
GerardM
http://ultimategerardm.blogspot.nl/2017/03/quality-dbpedia-and-kappa-alpha-…
Hello everyone,
I am looking for a text corpus that is annotated with Wikidata entities.
I need this for the evaluation of an entity linking tool based on
Wikidata, which is part of my bachelor thesis.
Does such a corpus exist?
Ideally the corpus would be annotated in the NIF format [1], since I
want to use GERBIL [2] for the evaluation, but that is not strictly
necessary.
Thanks for any hints!
Samuel
[1] https://site.nlp2rdf.org/
[2] http://aksw.org/Projects/GERBIL.html
Hello,
Our next Wikidata IRC office hour will take place on April 5th, 16:00 UTC
<https://www.timeanddate.com/worldclock/meetingdetails.html?year=2017&month=…>
(18:00 in Berlin), on the channel #wikimedia-office.
For one hour, you'll be able to chat with the development team about
past, current and future projects, and ask any questions you want.
See you there,
<http://webchat.freenode.net/?channels=#wikimedia-office>
--
Léa Lacroix
Project Manager Community Communication for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
Hello all,
The international Wikimedia conference will take place in Montreal
(Q340) <https://www.wikidata.org/wiki/Q340> on August 11-13. As the
call for submissions is about to start, let's talk about what you would
like to see happening around Wikidata.
Are you planning to submit a talk, a workshop, or a meetup? Which topics
would you like to talk about with the development team? Let's talk about
our ideas on Wikidata:Wikimania 2017
<https://www.wikidata.org/wiki/Wikidata:Wikimania_2017>!
Thanks,
--
Léa Lacroix
Project Manager Community Communication for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
Hello all,
We’ve been working on a new data type that allows you to link to the
*geographical
shapes* that are now stored on Commons. This data type will be deployed on
Wikidata on *April 17th*.
This data type refers to the geographical shapes that have been enabled
on Wikimedia Commons since the beginning of this year. You can find
more information about them here <https://www.mediawiki.org/wiki/Maps>.
Property creators will be able to create properties with the geoshape
data type by selecting "Geographical shape" in the data type list.
Once such a property is created, you can use it in statements; when
filling in the value, start typing a string and you can choose the name
of a geoshape from the list of what exists on Commons.
Screenshot: test geoshape in Wikidata
<https://commons.wikimedia.org/wiki/File:Screenshot_test_geoshape_in_Wikidat…>
One thing to note: We currently do not export statements that use this
datatype to RDF. They can therefore not be queried in the Wikidata Query
Service. The reason is that we are still waiting for geoshapes to get
stable URIs. This is handled in this ticket
<https://phabricator.wikimedia.org/T161527>.
Before the deployment, you can test it on http://test.wikidata.org (see for
example the property “geotest” on Q22 <https://test.wikidata.org/wiki/Q22>).
If you have any questions, feel free to ask!
Cheers,
--
Léa Lacroix
Project Manager Community Communication for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
Hi folks!
My name is Glorian Yapinus, but you can simply call me Glorian ;) . For
the next 6 months, I will assist Lydia in supporting you all.
Regarding my educational background, I hold a bachelor's degree in
Information Technology, and I am currently working on my Master's in
Software Engineering and Management.
I am a warm and friendly person, so please do not hesitate to reach out
to me with any queries :-)
Last but not least, I am looking forward to working with you.
Cheers,
Glorian
--
Glorian Yapinus
Product Management Intern for Wikidata
Imagine a world, in which every single human being can freely share in the
sum of all knowledge. That‘s our commitment.
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
Hello,
We've been working on integrating some external databases into the
Query Service endpoint. You can now query Wikidata and other free
knowledge bases at the same time. For now, only databases released
under the most compatible licenses, such as CC0, are available.
Currently available:
* Europeana[1], the huge collection of cultural data, which is entirely
in the public domain.
* Nomisma.org[2], a collaborative project to provide stable digital
representations of numismatic concepts.
* Biblioteca Virtual Miguel de Cervantes [3]
* Biblioteca Nacional de España [4]
Soon, we plan to also enable querying other databases with licenses
such as CC-BY. We plan to continue adding endpoints based on user
requests.
The currently available endpoints are listed here:
https://query.wikidata.org/copyright.html
The documentation about the feature (which will also be updated when we
add more endpoints) is in the Query Service User Manual [5].
Here are some examples to understand how to build such queries:
https://www.wikidata.org/wiki/User:Smalyshev_(WMF)/Federation
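As a rough illustration of what federation looks like, here is a minimal sketch of a federated query against one of the endpoints above. The SPARQL text and the Nomisma service URL are illustrative assumptions (check the copyright.html list for the exact endpoint URLs); the request parameters follow the public Wikidata Query Service convention of passing `query` and `format=json` to its /sparql endpoint:

```python
# A hypothetical federated query: the SERVICE clause asks the remote
# (assumed) Nomisma endpoint for data from inside a WDQS query.
# P2950 is Wikidata's "Nomisma ID" property.
FEDERATED_QUERY = """
SELECT ?item ?concept WHERE {
  ?item wdt:P2950 ?nomismaId .        # items that have a Nomisma ID
  SERVICE <http://nomisma.org/query> {
    # evaluated on the external endpoint, not on WDQS
    ?concept a <http://nomisma.org/ontology#Mint> .
  }
}
LIMIT 10
"""

def wdqs_request_params(query: str) -> dict:
    """Build the GET parameters for a Wikidata Query Service request."""
    return {"query": query, "format": "json"}
```

Sending `wdqs_request_params(FEDERATED_QUERY)` to https://query.wikidata.org/sparql with any HTTP client would then return the combined results as JSON.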
Thanks a lot for your input! If you have any questions or problems,
feel free to contact me.
[1] http://labs.europeana.eu/api/linked-open-data-sparql-endpoint
[2] http://nomisma.org/
[3] http://data.cervantesvirtual.com/about
[4] http://datos.bne.es/sparql?help=intro
[5]
https://www.mediawiki.org/wiki/Wikidata_query_service/User_Manual#Federation
--
Stas Malyshev
smalyshev(a)wikimedia.org
Hi all,
I'm happy to introduce yet another Wikidata data consumer :). It's
called Monumental, and it's designed to browse, add and display all the
information about cultural heritage monuments that we have in Wikidata.
You can think of it as Reasonator, the Wiki Loves Monuments map and the
Wikidata games combined :).
The app is installed on Labs; feel free to take a look and fiddle with
it: https://tools.wmflabs.org/monumental/#/
The application is currently at an early stage; many more features will
be introduced soon. On this page you can see the plans for this year:
https://www.wikidata.org/wiki/Wikidata:Tools/Monumental
Cheers,
Paweł and Stephen, authors
--
*Paweł Marynowski*
Stowarzyszenie Wikimedia Polska
http://pl.wikimedia.org/