I am pleased to announce public beta testing of EventZoom.net - a site
for displaying sets of historical events on maps that are zoomable in
time and space, using data from Wikidata.
EventZoom.net is a data visualisation site that plots events in time and
space on a map - currently Google Maps, with plans to add others in the
future. You can zoom in on events in the dimension of time as well as on
the map itself. To make this possible, EventZoom tightly integrates time
spans (with a beginning and an end) into the solution, enabling a useful
way of filtering out and hiding events that fall outside your zoom
focus. You can select individual years, months or days to have the map
display only the events that occurred during that time span, down to
single days. Stepping through, for example, World War II or other major
events year by year, or month by month, can make for an interesting view
of the progression of history.
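To illustrate the idea of the time filter (a minimal sketch in Python,
not EventZoom's actual code - the event names and dates are just sample
data): an event is shown exactly when its own time span overlaps the
selected zoom span.

from datetime import date

# Sample event records: (name, start, end) - illustrative only.
events = [
    ("Battle of Stalingrad", date(1942, 8, 23), date(1943, 2, 2)),
    ("D-Day landings",       date(1944, 6, 6),  date(1944, 6, 6)),
    ("Moon landing",         date(1969, 7, 20), date(1969, 7, 20)),
]

def visible(events, zoom_start, zoom_end):
    """Keep only events whose time span overlaps the selected zoom span."""
    return [e for e in events if e[1] <= zoom_end and e[2] >= zoom_start]

# Zooming in on the year 1944 hides everything outside it.
print(visible(events, date(1944, 1, 1), date(1944, 12, 31)))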
As for the future, besides adding more maps (and types of maps), the import
and
export options of the EventZoom platform will be extended and improved.
Plans
also include adding more sites. At the moment, the Wikidata import
specifically
depends on wdq.wmflabs.org, but that will change as the Wikidata API
evolves.
Feedback from the Wikidata and Wikimedia community is most welcome,
especially regarding the UI and functionality: How useful do you think
it is in terms of giving an overview of major events - and, for example,
for identifying events missing from Wikidata? And how best to present
other types of data? (Anything with a combination of geolocation and
temporal data may be considered.)
Link: http://eventzoom.net/
https://en.wikipedia.org/wiki/User:Johansen.fred
Hey folks :)
As announced earlier, we've just deployed the next step towards the new
header design. We're not there yet, but this step already lets you
collapse the "In other languages" box, for example, and adds a hint
about how to configure the displayed languages.
In addition to the new header, the next deployment will bring a lot of
under-the-hood changes and bug fixes. The most relevant changes for
you are:
* we made the diff for time values more meaningful
* we fixed a lot of bugs in the time datatype
* edit links are no longer cached incorrectly based on the user's
permissions (This led to users sometimes seeing edit buttons on pages
that they could not edit, and no edit buttons on pages that they could
edit.)
* we fixed some issues with propagating page moves and deletions on
the clients (Wikipedia, etc.) to Wikidata
* we corrected an issue where you would see new data in the old part
of a diff (This mainly affected qualifiers.)
* the sitetointerwiki gadget now also works on diff pages
* the precision is now detected correctly when entering a quantity in
scientific notation
* we added mailto as an accepted protocol for the URL datatype
Please let me know if you encounter any issues.
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations at the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognised as charitable
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Hi,
I just wanted to quickly let you know that on Friday Lucie discovered
that it was possible to use the wbmergeitems API without passing an
edit token to it; it was also possible to use it via GET requests.
Not requiring a token made that module vulnerable to CSRF attacks.
We opened a security bug, fixed the problem, and deployed a patch on
Friday, so the issue has been fixed on Wikidata.org.
Anyone running their own Wikibase installations is advised to update to
master or to cherry-pick https://gerrit.wikimedia.org/r/198736.
Users of the wbmergeitems API should check that they use POST for their
requests and send a valid edit token.
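For reference, here is a minimal sketch of a compliant call in Python
(assuming an already logged-in requests session; the item IDs below are
hypothetical, and this is not an official client):

import requests

API = "https://www.wikidata.org/w/api.php"
session = requests.Session()  # assumed to be authenticated already

# First fetch a CSRF (edit) token ...
r = session.get(API, params={"action": "query", "meta": "tokens",
                             "type": "csrf", "format": "json"})
token = r.json()["query"]["tokens"]["csrftoken"]

# ... then merge via POST, never GET, always passing the token.
r = session.post(API, data={"action": "wbmergeitems",
                            "fromid": "Q111111",  # hypothetical source item
                            "toid": "Q222222",    # hypothetical target item
                            "token": token,
                            "format": "json"})
print(r.json())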
Cheers,
Marius
For further details, please see:
https://phabricator.wikimedia.org/T93365
Greetings,
After a delay in updates to the Structured data on Commons[1] project, I
wanted to catch you up with what has been going on over the past three
months. In short: The project is on hold, but that doesn't mean nothing is
happening.
The meeting in Berlin[2] in October provided the engineering teams with a
lot to start on. Unfortunately the Structured Data on Commons project was
put on hold not too long after this meeting. Development of the actual
Structured data system for Commons will not begin until more resources can
be allocated to it.
The Wikimedia Foundation and Wikimedia Germany have been working to
improve the Wikidata query process on the back-end. This is designed to
be a production-grade replacement of WikidataQuery, integrated with
search. The full project is described at Mediawiki.org[3]. This will
benefit the structured data project greatly, since a high-level search
for Commons is a desired goal of this project.
The Wikidata development team is working on the arbitrary access feature.
Currently it's only possible to access items that are connected to the
current page. So for example on Vincent van Gogh you can access the
statements on Q5582, but you can't access these statements on Commons at
Category:Vincent van Gogh or Creator:Vincent van Gogh. With arbitrary
access enabled on Commons, we will no longer have this limitation. This
opens up the possibility of using Wikidata data in the Creator,
Institution, Authority control and other templates instead of
duplicating the data (as we do now). This will greatly enhance the
usefulness of Wikidata for Commons.
To use the full potential of arbitrary access, the Commons community
needs to reimplement several templates in Lua. In Lua it is possible to
use the local fields and fall back to Wikidata when a field is not
available locally. Help with this conversion is greatly appreciated. The
different tasks are tracked in Phabricator[4].
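The fallback logic itself is simple; here it is sketched in Python for
illustration only (the real implementation would be a Lua module using
the Wikibase client; all names and values below are made up):

def render_field(template_args, wikidata_claims, field):
    """Prefer the locally supplied template field; fall back to Wikidata."""
    local_value = template_args.get(field)
    if local_value:          # a non-empty local value always wins
        return local_value
    return wikidata_claims.get(field)  # otherwise use the item's data

# Example: the template call supplies no birth date, so that value comes
# from Wikidata, while the locally supplied name wins over Wikidata's.
template_args = {"name": "van Gogh"}
wikidata_claims = {"name": "Vincent van Gogh", "birthdate": "30 March 1853"}
print(render_field(template_args, wikidata_claims, "name"))       # van Gogh
print(render_field(template_args, wikidata_claims, "birthdate"))  # 30 March 1853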
Volunteers are continuing to add data about artworks to Wikidata.
Sometimes an institution's website is used, and sometimes data is
transferred from Commons to Wikidata. Wikidata now has almost 35,000
items about paintings. This is done as part of the Wikidata WikiProject
"Sum of All Paintings"[5]. It helps us learn how to refine the metadata
structure for artworks - experience that will of course be very useful
for Commons too.
Additionally, the metadata cleanup drive continues to produce
results[6]. The drive, which aims to identify files missing
{{information}} or similar structured data fields and to add such fields
where absent, has reduced the number of files missing information on
Commons by almost 100,000. You can help by looking for files[7] with
similarly-formatted description pages and listing them at
Commons:Bots/Work requests[8] so that a bot can add the {{information}}
template to them.
At the Amsterdam Hackathon in November 2014, a couple of different
models were developed for how artworks can be viewed on the web using
structured data from Wikidata. You can browse two examples[9][10]. These
examples can give you an idea of the kind of data that file pages have
the potential to display on-wiki in the future.
The Structured Data project is a long-term one, and the volunteers and
staff will continue working together to provide the structure and support
in the back-end toward front-end development. There are still many things
to do to help advance the project, and I hope to have more news for you in
the near future. Contact me any time with questions, comments, concerns.
1. https://commons.wikimedia.org/wiki/Commons:Structured_data
2. https://commons.wikimedia.org/wiki/Commons:Structured_data/Berlin_bootcamp
3. https://www.mediawiki.org/wiki/Wikibase/Indexing
4. https://phabricator.wikimedia.org/T89594
5. https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings
6. https://meta.wikimedia.org/wiki/File_metadata_cleanup_drive
7. https://tools.wmflabs.org/mrmetadata/commons/commons/index.html
8. https://commons.wikimedia.org/wiki/Commons:Bots/Work_requests
9. http://www.zone47.com/crotos/?p=1&p276=190804&y1=1600&y2=2014
10. http://sum.bykr.org/432253
--
Keegan Peterzell
Community Liaison, Product
Wikimedia Foundation
Dear all,
Thanks to the people at the Center of Semantic Web Research in Chile
[1], we now have a first public SPARQL endpoint for Wikidata running.
It is very preliminary, so do not rely on it in applications and expect
things to fail, but you may still enjoy experimenting with it.
http://milenio.dcc.uchile.cl/sparql
The endpoint has all the data from our current RDF exports in one big
database [2]. Below are some example queries to get you started (this is
a bit of a learning-by-doing crash course in SPARQL too, but you may
want to consult a tutorial if you don't know the language ;-).
There are some known bugs in the RDF that we will hopefully fix soon
[3]. Also, the service uses a dump that is already a few weeks old; we
are more interested in testing functionality right now than in going
into production. Finally, this is a raw API interface, not a proposal
for a nice UI.
Feedback (and other interesting queries) is welcome :-)
Cheers,
Markus
[1] http://ciws.cl/ -- a joint team from University of Chile and
Pontificia Universidad Catolica de Chile
[2] http://tools.wmflabs.org/wikidata-exports/rdf/
[3]
https://github.com/Wikidata/Wikidata-Toolkit/issues?q=is%3Aopen+is%3Aissue+…
==Lighthouses (Q39715) with their English label (LIMIT 100 for demo)==
PREFIX : <http://www.wikidata.org/entity/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT *
WHERE {
  ?lighthouse a :Q39715 .
  ?lighthouse rdfs:label ?label FILTER(LANG(?label) = "en")
} LIMIT 100
(Just paste the query into the box at http://milenio.dcc.uchile.cl/sparql)
The actual query condition is in the WHERE {...} part. Things starting
with ? are variables. Basic conditions take the form of triples:
"subject property value". For example, "?lighthouse a :Q39715" looks for
things that are a lighthouse ("a" is short for "rdf:type" which we use
to encode P31 statements without qualifiers). The dot "." is used as a
separator between triples.
Note that the label output is a bit cumbersome because you want to
filter by language (without the FILTER you get all labels in all
languages). A future UI would do better to fetch the labels after the
query, similar to WDQ, to keep queries smaller and faster.
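If you would rather script against the endpoint than use the web form,
the standard SPARQL protocol should work too. A sketch in Python,
assuming the endpoint supports JSON results (untested):

import requests

ENDPOINT = "http://milenio.dcc.uchile.cl/sparql"
QUERY = """
PREFIX : <http://www.wikidata.org/entity/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT * WHERE {
  ?lighthouse a :Q39715 .
  ?lighthouse rdfs:label ?label FILTER(LANG(?label) = "en")
} LIMIT 10
"""

# Standard SPARQL protocol: send the query as a GET parameter and
# request JSON results via the Accept header.
r = requests.get(ENDPOINT, params={"query": QUERY},
                 headers={"Accept": "application/sparql-results+json"})
for row in r.json()["results"]["bindings"]:
    print(row["lighthouse"]["value"], row["label"]["value"])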
==People born in the same place that they died in==
PREFIX : <http://www.wikidata.org/entity/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?person ?personname ?placename
WHERE {
  ?person a :Q5 .
  ?person :P19c ?place .
  ?person :P20c ?place .
  ?person rdfs:label ?personname FILTER(LANG(?personname) = "en") .
  ?place rdfs:label ?placename FILTER(LANG(?placename) = "en")
} LIMIT 100
Here we use a few actual Wikidata properties. Properties in their simple
form (Entity -> Value) use IDs with a "c" at the end, like :P19c here.
Only qualifier-free statements are available in this form right now.
Note that we use the variable ?place as the value in two places. This is
how we require the place of birth and the place of death to be the same.
==People who have Wikipedia (Q52) accounts==
PREFIX : <http://www.wikidata.org/entity/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?person ?personname ?username
WHERE {
  ?person :P553s ?statement .
  ?statement :P553v :Q52 .
  ?statement :P554q ?username .
  ?person rdfs:label ?personname FILTER(LANG(?personname) = "en") .
} LIMIT 100
This query needs to access qualifiers of a statement for "website
account on" (P553). To do this in RDF (and SPARQL), we access the
statement object instead of using simple property :P553c (which would
only give us the value). The statement is found through an "...s"
property; its value is found through a "...v" property; its qualifiers
are found through "...q" properties. Check out the graph in our paper to
get the picture
(http://korrekt.org/page/Introducing_Wikidata_to_the_Linked_Data_Web).
There you can also find how references are accessed.
==Currently existing countries==
PREFIX : <http://www.wikidata.org/entity/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?country ?countryName
WHERE {
  ?country :P31s ?statement .
  ?statement :P31v :Q3624078 .
  FILTER NOT EXISTS { ?statement :P582q ?endDate }
  ?country rdfs:label ?countryName FILTER(LANG(?countryName) = "en")
}
The pattern is similar to the Wikipedia accounts query, but now we check
that a certain qualifier (end time, P582) does not exist. You could also
find currently married people in this way, etc.
==Descendants of Queen Victoria (Q9439) ==
PREFIX : <http://www.wikidata.org/entity/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT DISTINCT *
WHERE {
  :Q9439 ((^:P25c|^:P22c)+) ?person .
  ?person rdfs:label ?label
  FILTER(LANG(?label) = "en")
} LIMIT 1000
Here, ((^:P25c|^:P22c)+) is a property path (a regular expression over
properties); ^ inverts the direction of a property (has mother -> mother
of ...); | means "or"; + means one or more repetitions.
==Currently existing countries, ordered by the number of their current neighbours==
PREFIX : <http://www.wikidata.org/entity/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?countryName (COUNT(DISTINCT ?neighbour) AS ?neighbours)
WHERE {
  ?country :P31s ?statement .
  ?statement :P31v :Q3624078 .
  FILTER NOT EXISTS { ?statement :P582q ?endDate }
  ?country rdfs:label ?countryName FILTER(LANG(?countryName) = "en")
  OPTIONAL {
    ?country (:P47s/:P47v) ?neighbour .
    ?neighbour :P31s ?statement2 .
    ?statement2 :P31v :Q3624078 .
    FILTER NOT EXISTS { ?statement2 :P582q ?endDate2 }
  }
} GROUP BY ?countryName ORDER BY DESC(?neighbours)
Just to give an example of a slightly more complex query ;-) Note how we
use the expression (:P47s/:P47v) rather than :P47c to access the value
of potentially qualified statements here (since qualified statements are
currently not converted to direct :P47c statements).
--
Markus Kroetzsch
Faculty of Computer Science
Technische Universität Dresden
+49 351 463 38486
http://korrekt.org/
Hey folks :)
At the end of the month Freebase is going read-only and I expect we'll
get an influx of new people. I started an FAQ for them at
https://www.wikidata.org/wiki/Help:FAQ/Freebase
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations at the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognised as charitable
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
How can this tool work? I am not a Wikidata patroller (not even
autopatrolled), but the tool seems to accept my patrols and my edits are
autopatrolled:
https://www.wikidata.org/w/index.php?title=Special%3ALog&type=patrol&user=F…
https://www.wikidata.org/w/index.php?title=Q13915&oldid=205178699&diff=prev
It seems that I do not understand the issue of patrolling on Wikidata.
- Finn Årup Nielsen
On 19-03-2015 at 15:52, Magnus Manske wrote:
> Cool! And nice use of my API :-)
>
> On Thu, Mar 19, 2015 at 2:45 PM Lydia Pintscher
> <lydia.pintscher@wikimedia.de> wrote:
>
> Hey folks :)
>
> Pasleim has created a nice new patrolling tool to help with
> vandalism and spam fighting. Here's their announcement text:
>
> Hey. To fight against vandalism, I wrote an OAuth application which
> is similar to Special:RecentChanges
> <https://www.wikidata.org/wiki/Special:RecentChanges> but has some
> more features:
>
> * 1-click patrol button
> * select edits by type (edited terms, sitelinks, merges...)
> * mass patrolling
> * clickable edit comments, i.e. identifiers of external databases
> and URLs are linked
> * hints showing various additional information, e.g. if the
> format of an identifier is invalid
> * 1-click translation button for labels and descriptions
> * automated description <http://tools.wmflabs.org/autodesc> when
> hovering over the item's label
>
> You can find the tool at https://tools.wmflabs.org/pltools/rech. I
> hope that with it some more users get motivated to patrol recent changes.
> --Pasleim <https://www.wikidata.org/wiki/User:Pasleim> (talk
> <https://www.wikidata.org/wiki/User_talk:Pasleim>) 21:46, 18 March
> 2015 (UTC)
>
>
> Cheers
> Lydia
>
> --
> Lydia Pintscher - http://about.me/lydia.pintscher
> Product Manager for Wikidata
>
> Wikimedia Deutschland e.V.
> Tempelhofer Ufer 23-24
> 10963 Berlin
> www.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>
> Registered in the register of associations at the Amtsgericht
> Berlin-Charlottenburg under number 23855 Nz. Recognised as charitable
> by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.