Hi,
I just noticed a bug in the RDF live exports of Wikidata: they still use
the base URI <http://wikiba.se/ontology-beta#> for all Wikidata
vocabulary terms. The correct base URI would be
<http://wikiba.se/ontology#>. I guess this was forgotten and has gone
unnoticed so far (I am not sure whether there are consumers of the live exports).
The SPARQL query service uses the correct URIs in its example queries
and data, and the URIs in the ontology documents at wikiba.se [1] are also
correct, so this only seems to affect the PHP code.
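For anyone who wants to check an export for this problem, a trivial check could look like the following Python sketch (the function name is ours; the namespace strings are the ones discussed above):

```python
# Hypothetical helper to spot the outdated base URI in an RDF export.
# The two namespace strings come from the bug report above.

BETA_NS = "http://wikiba.se/ontology-beta#"
CORRECT_NS = "http://wikiba.se/ontology#"

def uses_beta_namespace(turtle_text: str) -> bool:
    """Return True if the document still uses the outdated beta namespace."""
    return BETA_NS in turtle_text

sample = '@prefix wikibase: <http://wikiba.se/ontology-beta#> .'
print(uses_beta_namespace(sample))  # True for exports affected by the bug
```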
Cheers,
Markus
[1] wikiba.se/ontology
--
Markus Kroetzsch
Faculty of Computer Science
Technische Universität Dresden
+49 351 463 38486
http://korrekt.org/
Hi folks,
It's fantastic to see that we have such interesting tools for
contributing to Wikidata as Magnus' games.
With Wikidata Game and The Distributed Game as a base, I think we could
go further and build a tool that serves not only as a game but as a real
competition. In particular, with the following additions and a few
suggestions, I believe we could hold great /in situ/ Wikidata
competitions around the world:
* A timer with a scheduled start and end, during which contributions are
registered for the contest.
* Some quorum (e.g., three) so that edits in the contest are only
applied to Wikidata if that quorum of people agrees on an answer.
* A scoring system that only awards points (or many more points) to
those who reach a quorum. This prevents people from answering randomly,
damaging Wikidata while earning more and more points.
* A way to show the same questions to the quorum number of participants
during the competition.
* A real-time ranking in the competition scope.
* A way to manage the list of participants and to register an
administrator, or multiple ones, for every contest.
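As a sketch of how the quorum and scoring ideas above might fit together (all names and the quorum size are illustrative assumptions, not part of any existing tool):

```python
from collections import Counter

# Illustrative quorum size, as suggested above.
QUORUM = 3

def resolve(answers):
    """Apply an edit only if QUORUM participants agree on the same answer.

    answers: list of (player, answer) pairs for one question.
    Returns the agreed answer, or None if no answer reached the quorum.
    """
    if not answers:
        return None
    counts = Counter(answer for _, answer in answers)
    answer, votes = counts.most_common(1)[0]
    return answer if votes >= QUORUM else None

def score(answers, agreed):
    """Give a point only to players whose answer matched the quorum."""
    return {player: 1 if answer == agreed else 0 for player, answer in answers}

votes = [("ana", "Q42"), ("ben", "Q42"), ("cam", "Q42"), ("dee", "Q5")]
agreed = resolve(votes)      # "Q42": three players agreed, meeting the quorum
print(score(votes, agreed))  # ana/ben/cam get a point, dee does not
```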
Would this be a good idea? Would anyone like to develop some of these
features?
Regards,
--
David Abián - davidabian.com
Communications Officer
Wikimedia España
Vega Sicilia, 2
47008 - Valladolid
https://wikimedia.es
Wikimedia España is a Spanish non-profit association with
tax ID (CIF) G-10413698, registered in the National Registry of Associations,
Group 1, Section 1, National No. 597390.
«Imagine a world in which every person
has free access to all knowledge.»
One of the demonstrations that I plan to use in ConceptMap.io is a sort
of six-degrees game with musicians. As I've been looking at the
Wikidata entries for artists (e.g. Eric Clapton), I see that some
relationships to musical groups are missing. What is the plan for keeping
these relationships in sync with, for example, MusicBrainz?
Thanks,
James Weaver
As Lydia announced, we are going to deploy support for two new data types soon
(think of "data types" as "property types", as opposed to "value types"):
* The "math" type for formulas. This will use TeX syntax and is provided by the
same extension that implements <math> for wikitext. We plan to roll this out on
Feb 9th.
* The "external-id" type for references to external resources. We plan to roll
this out on Feb 16th. NOTE: Many of the existing properties for external
identifiers will be converted from the plain "string" data type to the new
"external-id" data type, see
<https://www.wikidata.org/wiki/User:Addshore/Identifiers>.
Both these new types will use the "string" value type. Below are two examples of
Snaks that use the new data type, in JSON:
{
  "snaktype": "value",
  "property": "P717",
  "datavalue": {
    "value": "\\sin x^2 + \\cos_b x ^ 2 = e^{2 \\tfrac\\pi{i}}",
    "type": "string"
  },
  "datatype": "math"
}
{
  "snaktype": "value",
  "property": "P708",
  "datavalue": {
    "value": "BADWOLF",
    "type": "string"
  },
  "datatype": "external-id"
}
As you can see, the only thing that is new is the value of the "datatype" field.
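To illustrate how a consumer might branch on the new "datatype" field, here is a minimal Python sketch; the function name is ours, and only the JSON shape shown above is assumed:

```python
import json

# Hypothetical consumer-side helper: dispatch on the Snak's "datatype" field.
def describe_snak(snak):
    """Return a short description of a value snak based on its datatype."""
    value = snak["datavalue"]["value"]
    datatype = snak["datatype"]
    if datatype == "math":
        return f"TeX formula: {value}"
    if datatype == "external-id":
        return f"external identifier: {value}"
    return f"{datatype} value: {value}"

snak = json.loads("""
{
  "snaktype": "value",
  "property": "P708",
  "datavalue": { "value": "BADWOLF", "type": "string" },
  "datatype": "external-id"
}
""")
print(describe_snak(snak))  # external identifier: BADWOLF
```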
Similarly, in RDF, both new data types use plain string literals for now, as you
can see from the turtle snippet below:
wd:Q2209 a wikibase:Item ;
  wdt:P717 "\\sin x^2 + \\cos_b x ^ 2 = e^{2 \\tfrac\\pi{i}}" ;
  wdt:P708 "BADWOLF" .
The datatypes themselves are declared as follows:
wd:P708 a wikibase:Property ;
  wikibase:propertyType wikibase:ExternalId .

wd:P717 a wikibase:Property ;
  wikibase:propertyType wikibase:Math .
Accordingly, the URIs of the datatypes (not the types of the literals!) are:
<http://wikiba.se/ontology-beta#ExternalId>
<http://wikiba.se/ontology-beta#Math>
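For readers who want to find out which properties use a given datatype, a query along these lines could be run against the Wikidata Query Service. This is only a sketch: the helper name is ours, and the namespace is copied from the declarations above (it may change once the exports move off the beta namespace).

```python
# Sketch: build a SPARQL query listing all properties of a given datatype.
# The function name is ours; the wikibase: namespace is taken from the
# declarations above and may differ on a given deployment.

def properties_by_type_query(type_name: str) -> str:
    """Build a SPARQL query for all properties with the given propertyType."""
    return (
        "PREFIX wikibase: <http://wikiba.se/ontology-beta#>\n"
        "SELECT ?property WHERE {\n"
        f"  ?property wikibase:propertyType wikibase:{type_name} .\n"
        "}"
    )

print(properties_by_type_query("ExternalId"))
```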
These are, for now, the only changes to the representation of Snaks. We do
however consider some additional changes for the future. To avoid confusion,
I'll put them below a big separator:
ANNOUNCEMENT ABOVE!
--------------------------------------------------------------------------------
ROUGH PLANS BELOW!
Here are some changes concerning the math and external-id data types that we are
considering or planning for the future.
* For the Math datatype, we may want to provide a type URI for the RDF string
literal that indicates that the format is indeed TeX.
Perhaps we could use <http://purl.org/xtypes/Fragment-LaTeX>.
* For the ExternalId data type, we would like to use resource URIs for external
IDs (in "direct claims"), if possible. This would only work if we know the base
URI for the property (provided by a statement on the property definition). For
properties with no base URI set, we would still use plain string literals.
In our example above, the base URI for P708 might be
<https://tardis.net/allonzy/>. The Turtle snippet would read:
wd:Q2209 a wikibase:Item ;
  wdt:P717 "\\sin x^2 + \\cos_b x ^ 2 = e^{2 \\tfrac\\pi{i}}"^^purl:Fragment-LaTeX ;
  wdt:P708 <https://tardis.net/allonzy/BADWOLF> .
However, the full representation of the statement would still use the original
string literal:
wds:Q2209-24942a17-4791-a49d-6469-54e581eade55 a wikibase:Statement,
    wikibase:BestRank ;
  wikibase:rank wikibase:NormalRank ;
  ps:P708 "BADWOLF" .
We would also like to provide the full URI of the external resource in JSON,
making us a good citizen of the web of linked data. We plan to do this using a
mechanism we call "derived values", which we also plan to use for other kinds of
normalization in the JSON output. The idea is to include additional data values
in the JSON representation of a Snak:
{
  "snaktype": "value",
  "property": "P708",
  "datavalue": {
    "value": "BADWOLF",
    "type": "string"
  },
  "datavalue-uri": {
    "value": "https://tardis.net/allonzy/BADWOLF",
    "type": "string"
  },
  "datatype": "external-id"
}
In some cases, such as ISBNs, we would want a URL as well as a URI:
{
  "snaktype": "value",
  "property": "P708",
  "datavalue": {
    "value": "3827370191",
    "type": "string"
  },
  "datavalue-uri": {
    "value": "urn:isbn:3827370191",
    "type": "string"
  },
  "datavalue-url": {
    "value": "https://www.wikidata.org/wiki/Special:BookSources/3827370191",
    "type": "string"
  },
  "datatype": "external-id"
}
The base URL would be given as a statement on the property, just like the base URI.
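As a sketch of how a consumer might use these derived values, the following Python snippet prefers the derived URI when present and otherwise builds one from a per-property base URI. The lookup table and function name are illustrative assumptions; the base URI for P708 is the fictional one from the example above.

```python
# Hypothetical consumer-side fallback: use the derived "datavalue-uri" if the
# producer supplied it, otherwise build a URI from a per-property base URI.
# The base-URI table below is a stand-in for the statement on the property.

BASE_URIS = {"P708": "https://tardis.net/allonzy/"}  # fictional, from the example

def external_id_uri(snak):
    """Return a full URI for an external-id snak, or None if unknown."""
    derived = snak.get("datavalue-uri")
    if derived:
        return derived["value"]
    base = BASE_URIS.get(snak["property"])
    if base:
        return base + snak["datavalue"]["value"]
    return None

snak = {
    "snaktype": "value",
    "property": "P708",
    "datavalue": {"value": "BADWOLF", "type": "string"},
    "datatype": "external-id",
}
print(external_id_uri(snak))  # https://tardis.net/allonzy/BADWOLF
```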
We plan to use the same mechanism for giving Quantities in a standard unit,
providing thumbnail URLs for CommonsMedia values, etc.
--
Daniel Kinzler
Senior Software Developer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.
Hey folks :)
I just sat down with Katie to plan the next important feature deployments
that are coming up this month. Here is the plan:
* new datatype for mathematical expressions: We'll get it live on
test.wikidata.org tomorrow and then bring it to wikidata.org on the 9th
* Article Placeholder: We'll get it to test.wikipedia.org on the 9th
* new datatype for identifiers: we'll bring it to wikidata.org on the 16th.
We'll convert existing properties according to the list at
https://www.wikidata.org/wiki/User:Addshore/Identifiers in two rounds on
the 17th and 18th.
* In Other Projects Sidebar: We'll enable it by default on the 16th for all
projects that have not opted out at https://phabricator.wikimedia.org/T103102.
* interwiki links via Wikidata for Wikiversity: We'll enable phase 1 on
Wikiversity on the 23rd.
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of Amtsgericht Berlin-Charlottenburg
under number 23855 Nz. Recognized as a non-profit organization by the Finanzamt
für Körperschaften I Berlin, tax number 27/029/42207.
Hello,
I was wondering what the possibilities are for setting up a local instance of Wikidata,
such that the local instance stays relatively up to date with the official one and is accessible via SPARQL.
Are such mirror options already available, or would this require a fair amount of customisation?
The reason is that a local instance would allow us to use it for teaching and for practising SPARQL.
Kind regards,
Jasper
Hi!
As it was noted on the list, we recently tried to update Blazegraph -
software running Wikidata Query Service - to version 2.0, which has
numerous bugfixes and performance improvements, and some infrastructure
for future work on Geospatial search, etc.
Unfortunately, as sometimes happens with new major releases, there are
certain bugs in it, and, more unfortunately still, one of them seems to be
a race condition, which is very hard to trigger in a test environment and
which, when triggered, seriously impacts the stability of the service. All
this led to the WDQS service being somewhat unstable over the last couple
of days.
Due to this, I have rolled the production deployment back to pre-2.0
state. This means the service should be stable again and not experience
glitches anymore. I'll be watching it just in case; if you notice
anything that looks broken (like queries producing weird exceptions -
timeouts do not count - or the service being down), please ping me.
In the meantime, we will look for the cause of instability, and once it
is identified and fixed, we'll try the Blazegraph 2.0 roll-out again,
with the fixes applied. I'll send a note to the list when it happens.
Thanks,
--
Stas Malyshev
smalyshev(a)wikimedia.org
Hi,
We are accessing Wikidata using embedded SPARQL queries.
We need the abstract of an article, that is, to retrieve it from the
corresponding Wikipedia article.
I can get the sitelink of an article, but I cannot get a specific
paragraph, in particular the abstract.
This works for me with DBpedia, but I want to do it using Wikidata.
Could you please let me know how?
Thanks a lot,
Miriam
=================
Miriam Allalouf, PhD
Software Engineering Department, JCE
Academic Head of Mahar Project
Mobile tel: +972-52-3664129
email: miriamal(a)jce.ac.il
Azrieli College of Engineering Jerusalem - JCE