I am trying to synchronise FactGrid data with Wikidata.
A strange question: If I run a SPARQL search such as this one:
https://query.wikidata.org/#SELECT%20%3FGemeinde_in_Deutschland%20%3FGemeind...
I get the coordinates with what looks like swapped values:
"Point(10.7183 50.9489)" on the QueryService table output "50°56'56"N, 10°43'6"E" on P625 at the respective Wikidata Item Q6986
any idea why this is so?
Best, Olaf
Dr. Olaf Simons
Forschungszentrum Gotha der Universität Erfurt
Schloss Friedenstein, Pagenhaus, 99867 Gotha
Office: +49-361-737-1722, Mobile: +49-179-5196880
Private: Hauptmarkt 17b, 99867 Gotha
Hello Olaf,
this is a simple one. In natural language it is common to present world coordinates in [latitude, longitude] order. The WKT standard (https://en.wikipedia.org/wiki/Well-known_text_representation_of_geometry), which is used to represent coordinates in the SPARQL endpoint, on the other hand expects [x, y] order, which in this case means [longitude, latitude]. For more info, see https://en.wikipedia.org/wiki/OGC_GeoSPARQL and https://stackoverflow.com/questions/18636564/lat-long-or-long-lat
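Your example shows exactly that: P625 displays latitude first (50°56'56"N, 10°43'6"E), while the WKT literal puts longitude first, Point(10.7183 50.9489). Both encode the same point.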
So it is desired behaviour, not a bug ;-)
Best regards, Jan
Thank you, Jan, for the swift answer (I wish I had known before this morning's first data input...)
Allow me two follow-up questions:
Is there an elegant way to get data out of Wikidata in a format that you can then feed back into another Wikibase without the pain of such conversions (like splitting coordinates, changing columns, changing the prefixes...)?
Question 2 is related: when you extract dates with the Query Service, it turns year-only values like 1971 (stored as 1971-00-00) into 1971-01-01 dates. I was unable to tell whether such a date was just a year or actually a January 1 entry. Is there a way to get the exact date, as it was put into Wikibase, back from the Query Service?
All my thanks, Olaf
On Mon, Sep 2, 2019, 11:39 PM Olaf Simons, olaf.simons@pierre-marteau.com wrote:
Is there an elegant way to get data out of Wikidata in a format that you can then feed back into another Wikibase without the pain of such conversions (like splitting coordinates, changing columns, changing the prefixes...)?
That depends on how elegant you want it to be, but it won't be trivial. If you get the data from WDQS, you can use any of the available SPARQL text/regex manipulation functions to convert the WKT format into a different one.
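For example, something like this (an untested sketch, assuming ?coord holds the P625 WKT literal) should split the pair on WDQS:

  # split WKT "Point(long lat)" into separate ?lat / ?long strings
  SELECT ?item ?lat ?long WHERE {
    ?item wdt:P625 ?coord .                                          # coordinate location (WKT literal)
    BIND(STRBEFORE(STRAFTER(STR(?coord), "Point("), ")") AS ?pair)   # e.g. "10.7183 50.9489"
    BIND(STRBEFORE(?pair, " ") AS ?long)                             # first number = longitude
    BIND(STRAFTER(?pair, " ") AS ?lat)                               # second number = latitude
  }
  LIMIT 10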
Question 2 is related: when you extract dates with the Query Service, it turns year-only values like 1971 (stored as 1971-00-00) into 1971-01-01 dates. I was unable to tell whether such a date was just a year or actually a January 1 entry. Is there a way to get the exact date, as it was put into Wikibase, back from the Query Service?
Again, this is not trivial. You also need to query the datePrecision field of the date value, which means querying for the actual date statement and not just the simple value that the usual SPARQL queries provide. Then, based on the datePrecision value (I think 9 is for year precision vs. 11 for day precision), you can truncate the date to just the year.
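A rough sketch of such a query (using date of birth, P569, as an example; any time property works the same way):

  # fetch the time value together with its stored precision (9 = year, 11 = day)
  SELECT ?item ?date ?precision WHERE {
    ?item p:P569 ?statement .                    # full statement, not the simple wdt: value
    ?statement psv:P569 ?valueNode .             # the value node of the statement
    ?valueNode wikibase:timeValue ?date ;
               wikibase:timePrecision ?precision .
  }
  LIMIT 10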
Hi Olaf
https://databus.dbpedia.org/dbpedia/wikidata/geo-coordinates/2019.08.01 has monthly (around the 7th) extractions of Wikidata's geo-coordinates.
The website still has a bug: the download links are currently no longer displayed at the bottom. But you can query for the latest version:
https://databus.dbpedia.org/yasgui/
PREFIX dataid: <http://dataid.dbpedia.org/ns/core#>
PREFIX dct:    <http://purl.org/dc/terms/>
PREFIX dcat:   <http://www.w3.org/ns/dcat#>

SELECT ?downloadURL ?sha256sum WHERE {
  ?dataset dataid:artifact <https://databus.dbpedia.org/dbpedia/wikidata/geo-coordinates> .
  ?dataset dcat:distribution/dcat:downloadURL ?downloadURL .
  ?dataset dcat:distribution/dataid:sha256sum ?sha256sum .
  ?dataset dct:hasVersion ?version .
}
ORDER BY DESC(?version) LIMIT 1
-- Sebastian
This might then be the central thing to have in the next Query Service version: in addition to the present download formats, an option to get a CSV or TSV output that can be put into QuickStatements for another Wikibase without tedious conversions.
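For illustration: as far as I understand the QuickStatements V1 syntax, coordinates go in as @latitude/longitude and dates carry an explicit precision suffix, so converted TSV lines would look roughly like this (the property and values here are made up for the example):

  Q6986   P625   @50.9489/10.7183            (coordinate: @lat/long)
  Q6986   P571   +1971-00-00T00:00:00Z/9     (/9 = year precision, /11 = day)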
We will have more and more people with Wikibase installations who will use Wikidata (or other Wikibases) as a data source for their platforms.
Maybe I can inspire Magnus... Thanks,
Olaf
On Mon, 2 Sep 2019 at 21:11, Olaf Simons olaf.simons@pierre-marteau.com wrote:
This might then be the central thing to have in the next Query Service version: in addition to the present download formats, an option to get a CSV or TSV output that can be put into QuickStatements for another Wikibase without tedious conversions.
Well, this is probably more on the QuickStatements side: allowing input in the form of WKT...
We will have more and more people with Wikibase installations who will use Wikidata (or other Wikibases) as a data source for their platforms.
Of course. And this is what is currently (if a bit confusingly) called Federation: https://www.wikidata.org/wiki/Wikidata:Federation_input
Best regards, Jan
Cool,
that is a very useful link for us to keep an eye on!
Thanks, Olaf
Is there an elegant way to get data out of Wikidata in a format that you can then feed back into another Wikibase without the pain of such conversions (like splitting coordinates, changing columns, changing the prefixes...)?
It would probably be easier for you to get longitude and latitude separately, though if I understand correctly the SPARQL query for that is not the most straightforward: https://w.wiki/7ny
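Something along these lines (a sketch; WDQS provides geof:latitude and geof:longitude as extension functions, and Q262166 "municipality of Germany" is assumed here for illustration):

  # extract latitude and longitude as plain numbers
  SELECT ?Gemeinde ?lat ?long WHERE {
    ?Gemeinde wdt:P31 wd:Q262166 ;    # instance of: municipality of Germany (assumed QID)
              wdt:P625 ?coord .
    BIND(geof:latitude(?coord) AS ?lat)
    BIND(geof:longitude(?coord) AS ?long)
  }
  LIMIT 10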
Cheers,
Ettore Rizza
Hey,
While this won't help you with data from the SPARQL endpoint, the Geo library can be of use when working with coordinates from Wikidata/Wikibase.
https://github.com/DataValues/Geo/
This library is used by Wikibase and can thus both parse the coordinates coming from the wiki software and format coordinates to a format the wiki software supports. This library is also used by Semantic MediaWiki [0] and the Maps extension [1], so aids interoperability there as well.
[0] https://www.semantic-mediawiki.org [1] https://github.com/JeroenDeDauw/Maps#maps
Cheers
--
Jeroen De Dauw | https://entropywins.wtf | https://professional.wiki
Software Crafter | Entrepreneur | Speaker | Strategist | Contributor to Wikimedia and Open Source
~=[,,_,,]:3
I am afraid those questions do not have answers as simple and straightforward as the first one...
On Mon, 2 Sep 2019 at 17:39, Olaf Simons olaf.simons@pierre-marteau.com wrote:
Is there an elegant way to get data out of Wikidata in a format that you can then feed back into another Wikibase without the pain of such conversions (like splitting coordinates, changing columns, changing the prefixes...)?
I quite agree with what Eugene has written; it depends on what counts as elegant for you. Sadly, I have no experience with other Wikibase instances, so I have no idea what the input for them might look like. But SPARQL has quite useful string functions that might come in handy in this case: https://en.wikibooks.org/wiki/SPARQL/Expressions_and_Functions#Functions_on_...
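For instance, a single regex REPLACE can reorder the WKT pair in one step (an untested sketch):

  # turn "Point(10.7183 50.9489)" into "50.9489,10.7183"
  BIND(REPLACE(STR(?coord), "^Point\\(([^ ]+) ([^)]+)\\)$", "$2,$1") AS ?latlong)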
Question 2 is related: when you extract dates with the Query Service, it turns year-only values like 1971 (stored as 1971-00-00) into 1971-01-01 dates. I was unable to tell whether such a date was just a year or actually a January 1 entry. Is there a way to get the exact date, as it was put into Wikibase, back from the Query Service?
Sorry, I am not familiar with the date representations in Wikibase enough to provide a meaningful answer.
Best regards, Jan
The use of tiny URLs has changed overnight, in a welcome move towards more independence.
The problem for us (FactGrid, an external Wikibase) is now that all the given links have stopped working.
Is there an elegant way to fix this?
Best wishes, Olaf
On Wed, 11 Sep 2019 at 14:38, Olaf Simons olaf.simons@pierre-marteau.com wrote:
The use of tiny URLs has changed overnight
Changed in what way?
The new Wikidata tinyurl (substitute) is
ours are still
...and these are no longer supported. They create messy error notes on our QueryService.
Hello,

I saw some information on Twitter (https://twitter.com/f_u_e_n_t_e/status/1171739406249734144) suggesting that TinyURL implemented some URL encoding that causes various issues, but I couldn't find any official announcement yet.

The new Wikimedia URL Shortener (https://meta.wikimedia.org/wiki/Wikimedia_URL_Shortener) has been in place since April 2019 but is not really suited to external Wikibase instances, as it can only generate links pointing to a few services hosted by the WMF.

I guess that if the issue with TinyURL persists, a solution could be to switch to another service or implement your own.

Cheers, Léa
Seems someone very nice (Lucas?) has fixed it.
thank you all, Olaf
I didn’t do anything – but yeah, while I could reproduce the issue earlier today, it seems to be working again right now. My guess would be that the TinyURL folks fixed it.
Cheers, Lucas