Hello all,
In the past few months, the development team has mentored a student, Olga,
to help us develop a user script
<https://www.wikidata.org/wiki/User:Jonas_Kress_%28WMDE%29/check_constraints…>
that displays constraint violations on item pages.
To use the script, add the following line to your user/common.js:
mw.loader.load(
'//www.wikidata.org/w/index.php?title=User:Jonas_Kress_(WMDE)/check_constraints.js&action=raw&ctype=text/javascript'
);
You will see an icon [image: Icon used on the user script for constraint
reports on Wikidata]
<https://commons.wikimedia.org/wiki/File:Constraint_icon.png> next to
violations. When you click it, you will see the full report.
[image: Screenshot of the user script checking constraints on Wikidata]
<https://commons.wikimedia.org/wiki/File:Screenshot_constraint_check_Wikidat…>
This script is based on a new API module for constraint checks
<https://www.wikidata.org/wiki/Special:ApiSandbox#action.3Dwbcheckconstraint…>
that one can use to check constraints on items and statements. At the
moment, the constraint definitions are derived only from the constraint
templates on the property discussion pages, not directly from statements.
They are stored in a database table, which we update with a script
periodically or on request. Also note that some constraint checks (for
example the format check) are currently disabled. In the future we will
support adding and updating constraints as statements on properties, and
we will implement support for some constraints that are still missing.
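As an aside, the module can also be called directly, outside the user script. A minimal sketch of building such a request by hand (the parameter names follow the module's self-documentation linked below; treat them as an assumption if your version differs):

```javascript
// Sketch: building a wbcheckconstraints request URL by hand.
// The 'id' parameter takes one or more pipe-separated entity IDs; check
// api.php?action=help&modules=wbcheckconstraints for the current parameters.
const params = new URLSearchParams({
  action: 'wbcheckconstraints',
  format: 'json',
  id: 'Q42',
});
const url = 'https://www.wikidata.org/w/api.php?' + params.toString();
console.log(url);
// Fetching this URL returns JSON with per-statement constraint check
// results, which a user script can render as icons next to violations.
```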
If you try it, feel free to give us feedback! You can also add comments or
subtasks on Phabricator (see the ticket for the API module
<https://phabricator.wikimedia.org/T102757> and the user script
<https://phabricator.wikimedia.org/T97018>).
If there is no major disagreement, we would like to turn this script into a
gadget in the coming days.
Thanks go to Olga and all the developers who helped her provide this new
feature :)
See also:
- API Documentation
<https://www.mediawiki.org/wiki/Wikibase/API#wbcheckconstraints>
- Self-documentation for the module
<https://www.wikidata.org/w/api.php?action=help&modules=wbcheckconstraints>
Cheers,
--
Léa Lacroix
Project Manager Community Communication for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable by
the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.
Hi,
I'm referring to a query like this, based on the Wikidata cats example:
SELECT ?item
WHERE
{
?item wdt:P31 wd:Q146 .
?crash rdfs:label ?_crashLabel
}
On my local Blazegraph, I can see that it uses up all the CPU cores, keeps
growing in RAM, and eventually kills the query with:
Caused by: com.bigdata.rwstore.sector.MemoryManagerOutOfMemory
at
com.bigdata.rwstore.sector.MemoryManager.getSectorFromFreeList(MemoryManager.java:646)
Could Blazegraph prevent this query from even starting?
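For what it's worth, the blow-up is structural: ?crash shares no variable with ?item, so the engine has to materialize the cross product of every cat with every labeled entity in the store. A connected rewrite (a sketch; note it asks for the cats' own labels) stays bounded:

```sparql
SELECT ?item ?label
WHERE
{
  ?item wdt:P31 wd:Q146 .    # ?item is a house cat
  ?item rdfs:label ?label .  # label of the same ?item, not a fresh variable
}
```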
Thanks!
Cheers
Miguel
While logged in and searching Wikidata, I am not able to get results for
topics that have diacritics in their label.
Searching for "Marc Marquez" this morning did not show any results. I
began to create an entry for this motorcycle road racer and later
discovered that he already had an entry. I noticed that the English "Also
known as" only had a label with diacritics, "Marc Márquez i Alentà". I
then added another "Also known as", "Marc Marquez", and this resolved the
problem of finding the correct original topic. I merged my erroneous new
topic into the original https://www.wikidata.org/wiki/Q40615
Is there already an effort in progress to make search diacritic-insensitive,
or to investigate why it might not be working for logged-in English users?
I only found this older Phabricator issue for Autocomplete during data
entry, and not main search: https://phabricator.wikimedia.org/T49114
-Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>
Hi!
I'm looking to speed up my Wikidata-Toolkit code that updates a lot of
statements.
Does a single WikibaseDataEditor support multiple updateStatements calls
running in different threads?
If not, could I instead create a pool of WikibaseDataEditor instances and
use each one from a different thread?
Thanks!
Miguel
Hello!
It looks like the Wikibase extension is not compatible with a PostgreSQL
backend. There is a lot of MySQL-specific code in the SQL scripts (e.g.
auto_increment, varbinary).
How about adding this information to the Wikibase docs?
Hello all,
We’ve been working on a new data type that allows you to link to the
*geographical shapes* that are now stored on Commons. This data type will
be deployed on Wikidata on *April 17th*.
It refers to the geographical shapes that have been enabled on Wikimedia
Commons since the beginning of this year. You can find more information
about them here <https://www.mediawiki.org/wiki/Maps>.
Property creators will be able to create properties with this geoshape
data type by selecting “Geographical shape” in the data type list.
Once such a property is created, you can use it in statements; when
filling in the value, start typing and you can choose the name of a
geoshape from the list of those that exist on Commons.
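Under the hood, a statement with this data type stores just a string naming the map page on Commons. A sketch of what the main snak looks like in the entity JSON (the property ID P1234 is hypothetical, used only for illustration):

```javascript
// Sketch of the main snak of a geoshape statement in entity JSON.
// P1234 is a hypothetical property ID; the value names a Data: page
// stored on Wikimedia Commons.
const snak = {
  snaktype: 'value',
  property: 'P1234',
  datatype: 'geo-shape',
  datavalue: {
    type: 'string',           // geoshape values are plain strings...
    value: 'Data:Berlin.map', // ...naming a map page on Commons
  },
};
console.log(JSON.stringify(snak));
```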
[image: Screenshot test geoshape in Wikidata.png]
<https://commons.wikimedia.org/wiki/File:Screenshot_test_geoshape_in_Wikidat…>
One thing to note: we currently do not export statements that use this
datatype to RDF. They therefore cannot be queried in the Wikidata Query
Service. The reason is that we are still waiting for geoshapes to get
stable URIs. This is handled in this ticket
<https://phabricator.wikimedia.org/T161527>.
Before the deployment, you can test it on http://test.wikidata.org (see for
example the property “geotest” on Q22 <https://test.wikidata.org/wiki/Q22>).
If you have any questions, feel free to ask!
Cheers,
--
Léa Lacroix
Project Manager Community Communication for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Hi!
I'm interested in using SPARQL for my own Wikibase installation, but I was
getting errors in the munge step.
To reproduce the problem, I set up a new Wikibase instance and tried to
load it into wikidata-query-rdf.
It has only one item, one property, and one statement.
First I used the included dumpRdf.php to create the RDF file, attached.
Then I did the munge step. It produced the errors, attached.
I don't understand why there are "Unrecognized subjects" errors when the
subjects listed do match the patterns ("Expected only sitelinks and
subjects starting with http://www.wikidata.org/wiki/Special:EntityData/
and http://www.wikidata.org/entity/").
I also don't understand which revision ID it is looking for.
What is the proper procedure for indexing?
If necessary, I can code in Java and fix the tools. I just need an
overview of how this system works.
Thanks!
Regards,
Miguel