Hi everyone,
I'd just like to announce another experimental Wikidata SPARQL endpoint [1], kindly provided by the folks at SpazioDati [2].
It contains both the simplified and the complete dumps, as per [3]. Each dump file is stored under a different named graph. We are collecting the query logs, and will share the most frequent queries.
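For example, the available named graphs and their sizes can be listed with a query along these lines (just a sketch; the actual graph IRIs are not spelled out here):

SELECT ?g (COUNT(*) AS ?triples)
WHERE { GRAPH ?g { ?s ?p ?o } }   # one row per named graph, i.e. per loaded dump file
GROUP BY ?g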
Cheers!
[1] http://wikisparql.org/ [2] http://spaziodati.eu/en/ [3] http://tools.wmflabs.org/wikidata-exports/rdf/exports/20150223/
On 3/21/15 1:00 PM, wikidata-l-request@lists.wikimedia.org wrote:
On 3/20/15 2:08 PM, Markus Kroetzsch wrote:
Dear all,
Thanks to the people at the Center of Semantic Web Research in Chile [1], we have a very first public SPARQL endpoint for Wikidata running: http://milenio.dcc.uchile.cl/sparql. This is very preliminary, so do not rely on it in applications and expect things to fail, but you may still enjoy some things.
You have a SPARQL endpoint that provides access to Wikidata dumps loaded into an RDF-compliant RDBMS (in this case a Virtuoso instance). I emphasize "a" because "the first" isn't accurate.
There are other endpoints that provide access to Wikidata dumps:
[1] http://lod.openlinksw.com/sparql -- 61 billion+ RDF triples culled from across the LOD Cloud (if you look up Wikidata URIs that are objects of owl:sameAs relations, you'll end up in Wikidata's own Linked Data space)
[2] http://wikidata.metaphacts.com/sparql -- another endpoint I discovered yesterday.
Brilliant, we should set up a page with a list of SPARQL endpoints for Wikidata! For production usage, it is great to have a variety to choose from.
==WARNING==
The RDF format is currently in flux. The purpose of the Chilean endpoint http://milenio.dcc.uchile.cl/sparql is to gather feedback that helps us decide how to possibly change the RDF structure for greater utility/simplicity. There will be several updates to the format in the next few weeks. Please help us find out what is needed by also testing at http://milenio.dcc.uchile.cl/sparql. We gather query logs and are happy to share them. That said, please also tell us if other endpoints work better/differently/worse for you. Any feedback is welcome.
Cheers,
Markus
Hello, Thank you for this initiative.
However, there is a little problem with the properties on the result pages. For instance, on the result page of this query http://wikisparql.org/sparql?query=DESCRIBE+%3Chttp%3A//www.wikidata.org/ent... all Wikidata properties have an extra letter after the property ID; for instance, P463 https://www.wikidata.org/wiki/Property:P463 has an extra "s" (P463s), so the link to this property (on the arrow icon) leads to a bad request page.
A click on this link, http://www.wikidata.org/entity/P463s, brings you to that page rather than to http://www.wikidata.org/entity/P463.
There is also a problem with the link to the Wikidata ontology on this page http://wikisparql.org/sparql?query=DESCRIBE+%3Chttp%3A//www.wikidata.org/ent.... For instance, time precision has a link to http://www.wikidata.org/ontology#timePrecision, which does not exist.
Cheers,
Jean-Baptiste Pressac
Traitement et analyse de bases de données Production et diffusion de corpus numériques
Centre de Recherche Bretonne et Celtique Unité mixte de service (UMS) 3554 20 rue Duquesne CS 93837 29238 Brest cedex 3
tel : +33 (0)2 98 01 68 95 fax : +33 (0)2 98 01 63 93
2015-03-23 16:37 GMT+01:00 Markus Krötzsch markus@semantic-mediawiki.org:
Brilliant, we should set up a page with a list of SPARQL endpoints for Wikidata! For production usage, it is great to have a variety to choose from.
strong +1
Also, would you mind if the examples you shared on this list were reused in other projects? I specifically have in mind using them to provide example queries on Wikidata-LDF [1].
Thank you,
Cristian
See http://www.wikidata.org/wiki/Wikidata:Data_access regarding the current SPARQL endpoint list.
Hi Jean-Baptiste,
Your observation is correct. This is because a single Wikidata statement (with one Wikidata property) does not translate into a single triple (with one RDF property) in RDF. Rather, several RDF triples are used; they need more than one property, and these properties have different declarations (e.g., some are DatatypeProperties and some are ObjectProperties). Therefore, we create property variants, currently by appending letters like "s".
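Concretely, a query over that encoding has to traverse the "s" variant to a statement node and then a value variant to the actual value. A rough sketch (the item Q42 and the "v" suffix are illustrative assumptions; the "s" variant is the one visible in the endpoint results):

PREFIX : <http://www.wikidata.org/entity/>
SELECT ?statement ?value WHERE {
  :Q42 :P463s ?statement .     # "s" variant: item to statement node
  ?statement :P463v ?value .   # value variant (assumed suffix): statement node to value
}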
The current RDF encoding is documented in our ISWC 2014 paper: https://ddll.inf.tu-dresden.de/web/Inproceedings4005/en
You are still right that there is a problem here in that you cannot get from "P123s" to "P123" in the current RDF. This is being worked on: https://github.com/Wikidata/Wikidata-Toolkit/issues/84 The dumps will hopefully have this soon.
Regards,
Markus
Hi Markus,
would you recommend to add some sort of "patch" until the new dumps are out, either in the data (by adding some triples to a temporary graph) or just in the Web interface for the external links?
Cheers,
Nicola (wikisparql.org)
Thank you, I will have a look at your publication to get a better understanding of the mechanism of "RDFisation".
Are you also going to solve the problem with the links to the Wikidata ontology? For instance, on this page http://wikisparql.org/sparql?query=DESCRIBE+%3Chttp%3A//www.wikidata.org/ent... "time precision" has a link to http://www.wikidata.org/ontology#timePrecision, which does not exist.
Regards,
Jean-Baptiste Pressac
Yes, all URIs should eventually be resolvable. We could not make this work with our RDF dumps but the ongoing official RDF dump discussions will certainly result in resolvable URIs. There will also be some changes in the URI schemes as part of this process. Stay tuned. ;-)
Markus
On 08.04.2015 15:07, Nicola Vitucci wrote:
Hi Markus,
would you recommend to add some sort of "patch" until the new dumps are out, either in the data (by adding some triples to a temporary graph) or just in the Web interface for the external links?
If you need it ASAP, you could actually just implement it in our Java code and make a pull request. It should not be too much effort. You can use the issue to ask about details if you are not sure what to do.
Otherwise the ETA would be end of April/beginning of May (several other RDF extensions are currently being worked on and will happen first, e.g., ranks in RDF).
Cheers
Markus
Thanks,
Jean-Baptiste Pressac
On 08/04/2015 16:36, Markus Krötzsch wrote:
If you need it ASAP, you could actually just implement it in our Java code and make a pull request. It should not be too much effort. You can use the issue to ask about details if you are not sure what to do.
Otherwise the ETA would be end of April/beginning of May (several other RDF extensions are currently being worked on and will happen first, e.g., ranks in RDF).
I don't need it right now, so given the short ETA I'd wait. Anyway, in order to let people use external links more easily, I could just "manually" drop the last letter (or apply any other rule) only in the href links for now, while leaving the URIs intact. Do you see any harm in this?
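For instance, a similar rule could even be applied at query time rather than in the UI; a rough sketch, purely illustrative and not part of the planned fix (it just drops one trailing lowercase letter from property IRIs):

PREFIX : <http://www.wikidata.org/entity/>
SELECT DISTINCT ?p ?base WHERE {
  ?s ?p ?o .
  FILTER(STRSTARTS(STR(?p), "http://www.wikidata.org/entity/P"))
  BIND(IRI(REPLACE(STR(?p), "[a-z]$", "")) AS ?base)   # e.g. P463s becomes P463
}
LIMIT 100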
Nicola
Ah, you mean for displaying links in HTML? No, there is no harm in this at all. Most likely, the final live exports will also redirect to the property entity exports (which would make the most sense for LOD crawlers).
Markus
Indeed. I made this temporary change on WikiSPARQL, so that links like those in Jean-Baptiste's examples may work "properly". If you try this:
http://wikisparql.org/sparql?query=DESCRIBE+%3Chttp%3A//www.wikidata.org/ent...
and then click on the external link on any property, now you should be redirected to the "right" wiki page.
Nicola
This is a great development!
I managed to run some simple queries, but I am having trouble with profiling-type queries such as
select ?p (count(*) as ?cnt) { ?s ?p ?o} group by ?p order by desc(?cnt)
You can generally run those O.K. on the DBpedia SPARQL endpoint. It would be nice to see a few more horsepower put behind this.
On 09.04.2015 01:11, Nicola Vitucci wrote: ...
Indeed. I made this temporary change on WikiSPARQL, so that links like in Jean-Baptiste's examples may work "properly". If you try this:
http://wikisparql.org/sparql?query=DESCRIBE+%3Chttp%3A//www.wikidata.org/ent...
and then click on the external link on any property, now you should be redirected to the "right" wiki page.
Nice, here is a query to make good use of this:
"All properties used on Wikidata to connect too humans, ordered by the number of times that they are used in this way"
http://wikisparql.org/sparql?query=PREFIX+%3A+%3Chttp%3A%2F%2Fwww.wikidata.o...
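A query along these lines would do it; a sketch in which the P31s/P31v suffixes follow the statement encoding discussed above, with the "v" suffix being an assumption (P31 = instance of, Q5 = human):

PREFIX : <http://www.wikidata.org/entity/>
SELECT ?p (COUNT(*) AS ?cnt) WHERE {
  ?human :P31s ?st .       # humans: items with an "instance of" statement ...
  ?st :P31v :Q5 .          # ... whose value is Q5 (human)
  ?anything ?p ?human .    # any property whose value is such a human
}
GROUP BY ?p
ORDER BY DESC(?cnt)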
Most of the properties with less than, say, 5 uses are probably errors (one could do other queries to find out which humans have this particular connection).
As soon as we have the explicit links to the property entities, it will also be possible to create the (more useful) list of property labels.
By the way, if you can control the output UI completely, you may consider adding this: instead of a property or item URI, always display its label (in a selected language), and merely show the URI as a tooltip or similar. This could be done after querying with Javascript so that people don't need to do many join+filter parts in queries just to retrieve the labels.
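For comparison, the join+filter pattern this would save people from writing looks roughly like this (a sketch; it assumes labels are exported as rdfs:label, as in the Wikidata Toolkit dumps):

PREFIX : <http://www.wikidata.org/entity/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?item ?label WHERE {
  ?item ?p :Q5 .                                                        # some items of interest
  OPTIONAL { ?item rdfs:label ?label . FILTER(lang(?label) = "en") }    # fetch the English label
}
LIMIT 100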
Markus
P.S. Your interface is very nice, but as Paul remarked it seems that some of the queries are a little slow. Could you maybe rewire the SPARQL execution to our endpoint at http://milenio.dcc.uchile.cl/sparql? It seems to be faster. Of course, I understand if you want to use it for your own testing, but if your main interest is in the UI and not the backend, this might be a nice cooperation.
Cheers,
Markus
SPARQL URLs should work across SPARQL endpoints. Basically, you only need to change the host part of the URL to execute the same query across different endpoints.
Examples:
1. http://lod.openlinksw.com/sparql?default-graph-uri=&query=PREFIX+%3A+%3C... -- LOD Cloud SPARQL Query Results URL
2. http://milenio.dcc.uchile.cl/sparql?default-graph-uri=&query=PREFIX+%3A+... -- Center of Semantic Web Research in Chile SPARQL Query Results URL
3. http://wikisparql.org/sparql?default-graph-uri=&query=PREFIX+%3A+%3Chttp... -- WikiSPARQL Results.
In addition to the approach above, you can use SPARQL-FED to distribute query patterns (in the SPARQL query body) across SPARQL Endpoints.
This is fundamentally what SPARQL is all about. You can use a structured query language to exploit relations across data spaces, on an HTTP network.
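A minimal sketch of the SPARQL-FED idea, using the standard SERVICE keyword (which patterns are evaluated locally and which are shipped to the remote endpoint is purely illustrative here):

PREFIX : <http://www.wikidata.org/entity/>
SELECT ?item ?o WHERE {
  ?item :P31s ?st .                                # evaluated by the endpoint you query
  SERVICE <http://milenio.dcc.uchile.cl/sparql> {  # sub-pattern sent to a remote endpoint
    ?item ?p ?o .
  }
}
LIMIT 10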
Kingsley
Hi all,
sorry for the late reply. First of all, thanks for the encouraging comments! In order to track suggestions and bugs more easily, I've created a GitHub repo here:
https://github.com/nvitucci/wikisparql-project
I'll try and write up stuff as things progress, but feel free to start using the issue tracker.
@Paul:
You can generally run those O.K. on the DBpedia SPARQL endpoint. It would be nice to see a few more horsepower put behind this.
You're definitely right. Speaking of "pure horsepower", the machine hosting WikiSPARQL is not that big right now (32 GB of RAM, 8-thread CPU, no SSDs), but there are a couple of things I'd like to do on the backend side to try to improve this kind of query.
@Markus:
By the way, if you can control the output UI completely, you may consider adding this: instead of a property or item URI, always display its label (in a selected language), and merely show the URI as a tooltip or similar. This could be done after querying with Javascript so that people don't need to do many join+filter parts in queries just to retrieve the labels.
Showing a (localized) label in place of a URI is a neat idea. I planned to do that, but my main concern is for when a query returns a lot of URIs, because this might add some overhead - to be tried, though.
P.S. Your interface is very nice, but as Paul remarked it seems that some of the queries are a little slow. Could you maybe rewire the SPARQL execution to our endpoint at http://milenio.dcc.uchile.cl/sparql? It seems to be faster. Of course, I understand if you want to use it for your own testing, but if your main interest is in the UI and not the backend, this might be a nice cooperation.
I decided to spend some time on the UI because it makes the endpoint easier to use, but my main goal is to work out, together with the community, what can be done to make using an endpoint as efficient as possible: not only by tuning the right parameters (e.g. should there be a timeout on queries? how much RAM should I make available? etc.) but also by looking at different technologies. It's great, though, to have a chance to compare results with other endpoints, and it might be a good idea to let users decide which endpoint(s) to use, as you suggested, so I can see many chances for cooperation. By the way, can I reuse your queries as example queries?
@Kingsley:
SPARQL URLs should work across SPARQL endpoints. Basically, you should only change the host part of the URL to execute the same query across different endpoints.
Indeed, that would be pretty easy, provided that SPARQL "variants" (e.g. the use of parentheses around aggregate expressions, or magic properties) are handled correctly.
Cheers,
Nicola
Hello, Speaking about horsepower on Linked Data, this could be a project to follow: http://linkeddatafragments.org/ Cheers,
Jean-Baptiste Pressac