JFC Morfin wrote:
- Since we have a W3C expert: what is the best document/book to get comprehensive and clear (not too massive) documentation on the semantic web?
You surely don't want to know all about the semantic web - especially the ontology stuff with OWL dialects and entailment regimes is far too academic and won't be part of Wikidata because of computational complexity anyway. In short, you should be *very sceptical* and cautious every time you stumble upon anything that requires inference rules. Even trivial inference rules such as those based on owl:sameAs and rdf:type can be problematic in practice! The less inference you assume, the better.
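To make this concrete, here is a minimal sketch in plain Python (no RDF library; all URIs and statements are invented for illustration) of how a single sloppy owl:sameAs link, naively materialized, copies statements onto resources they were never meant for:

triples = {
    ("ex:Berlin",       "rdf:type",      "ex:City"),
    ("ex:Berlin_album", "rdf:type",      "ex:MusicAlbum"),
    ("ex:Berlin",       "ex:population", "3500000"),
    # One sloppy link, e.g. copied in from an external dataset:
    ("ex:Berlin",       "owl:sameAs",    "ex:Berlin_album"),
}

def same_as_groups(triples):
    # Build equivalence classes induced by owl:sameAs (naive union).
    groups = {}
    for s, p, o in triples:
        if p == "owl:sameAs":
            merged = groups.get(s, {s}) | groups.get(o, {o})
            for member in merged:
                groups[member] = merged
    return groups

def materialize(triples):
    # Copy every non-sameAs statement onto every member of the group.
    groups = same_as_groups(triples)
    inferred = set(triples)
    for s, p, o in triples:
        if p != "owl:sameAs":
            for s2 in groups.get(s, {s}):
                inferred.add((s2, p, o))
    return inferred

for t in sorted(materialize(triples) - triples):
    print("inferred:", t)
# One bad link, and the album is now a City with a population while
# the city is a MusicAlbum - which is exactly the problem in practice.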
I can recommend the "Linked Data Patterns" book by Dodds and Davis: http://patterns.dataincubator.org/book/
Jakob
At 11:17 31/03/2012, Jakob Voss wrote:
JFC Morfin wrote:
- Since we have a W3C expert: what is the best document/book to get comprehensive and clear (not too massive) documentation on the semantic web?
You surely don't want to know all about the semantic web - especially the ontology stuff with OWL dialects and entailment regimes is far too academic and won't be part of Wikidata because of computational complexity anyway.
Thx. What I meant by "comprehensive" is that it covers all the areas, in a state-of-the-art manner, at a level useful for understanding, making, or rejecting decisions.
The problem we face today with SDOs' documentation is that it comes as separate "bills" (standards, RFCs, etc.) and not as part of maintained, structured "codes" (lawyers do that better). A first target for a Wikidata project could be to ask the W3C, IETF, ISO, IEEE, JTC1, etc. to reduce their "bills" into "sections" that could be rebuilt as "codes" through framework interlinks.
* Each new "bill" would result in section updates, which in turn would update and partly reshape the code structure.
* Anyone could obtain a general, current view of their areas and dig into it as appropriate.
This is not feasible for general and multicultural concept areas, but this would only be English documentation. It would help everyone and provide experience and momentum for the Wikidata project.
In short, you should be *very sceptical* and cautious every time you stumble upon anything that requires inference rules. Even trivial inference rules such as those based on owl:sameAs and rdf:type can be problematic in practice! The less inference you assume, the better.
Oh! yes! We are not talking of wikilogica yet :-)
I can recommend the "Linked Data Patterns" book by Dodds and Davis: http://patterns.dataincubator.org/book/
Thank you. I have printed it. Will try to read it, at least in part, this weekend. jfc
On Mar 31, 2012, at 11:17 , Jakob Voss wrote:
JFC Morfin wrote:
- Since we have a W3C expert: what is the best document/book to get comprehensive and clear (not too massive) documentation on the semantic web?
You surely don't want to know all about the semantic web - especially the ontology stuff with OWL dialects and entailment regimes is far too academic and won't be part of Wikidata because of computational complexity anyway. In short, you should be *very sceptical* and cautious every time you stumble upon anything that requires inference rules. Even trivial inference rules such as those based on owl:sameAs and rdf:type can be problematic in practice! The less inference you assume, the better.
Let us avoid the all-too-simplistic view that says Semantic Web == OWL :-)
Indeed, bringing (OWL) inferencing into the core WD project would be a mistake. From the SW stack, RDF, RDFS, and, on a different note, SPARQL and maybe RDB2RDF should be the technologies having a role in the project, as well as Linked Data patterns in general.
That being said, it is probably good to have the vocabularies used in WD properly defined/described. If *somebody else* wants to do inferencing, for example, we should not stand in the way.
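As a purely illustrative sketch of that lower-tech stack (assuming Python with the rdflib library and an invented ex: vocabulary, not any actual Wikidata schema), consuming such data needs nothing more than plain RDF parsing and a SPARQL SELECT, with no inference involved:

from rdflib import Graph

# Hypothetical Turtle data in a made-up vocabulary; only the RDF/SPARQL
# machinery is the point here, not the property names.
DATA = """
@prefix ex:   <http://example.org/vocab/> .
@prefix item: <http://example.org/item/> .

item:Q64 ex:label "Berlin" ;
         ex:instanceOf ex:City ;
         ex:population 3500000 .

item:Q90 ex:label "Paris" ;
         ex:instanceOf ex:City ;
         ex:population 2200000 .
"""

g = Graph()
g.parse(data=DATA, format="turtle")

QUERY = """
PREFIX ex: <http://example.org/vocab/>
SELECT ?label ?pop WHERE {
    ?item ex:instanceOf ex:City ;
          ex:label ?label ;
          ex:population ?pop .
}
ORDER BY DESC(?pop)
"""

# Plain pattern matching over the asserted triples; nothing is inferred.
for label, pop in g.query(QUERY):
    print(label, pop)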
Ivan
I can recommend the "Linked Data Patterns" book by Dodds and Davis: http://patterns.dataincubator.org/book/
Indeed. That is a great one, too.
Ivan
Jakob
---- Ivan Herman, W3C Semantic Web Activity Lead Home: http://www.w3.org/People/Ivan/ mobile: +31-641044153 FOAF: http://www.ivan-herman.net/foaf.rdf
A very interesting discussion. Some general answers to this are:
* Wikidata does, of course, not intend to implement complex reasoning (or any other algorithm that qualifies as "complex").
* If useful for serving its requirements, Wikidata will not exclude modelling features just because they are also supported in OWL ;-) For example, it could be useful to say that a Wikidata item describes the same thing as an external resource, which can be expressed in OWL using owl:sameAs. Many communities could use this for integrating Wikidata information with other Web databases.
* The "reasoning" support in Wikidata will not in general limit the modelling support in Wikidata: it might be possible to say something that has a formal meaning in OWL, even if this formal meaning is not relevant for query answering in Wikidata (sameAs with external resources is a possible example, since Wikidata would surely not pull data from these sources for internal query answering).
* Wikidata will support various export formats, which have more or less native support for certain modelling features. We will use whatever expressivity is available in the given format to describe the Wikidata information as accurately as possible. This might again lead to some OWL constructs being used in RDF/OWL exports (a small sketch follows below). All Wikidata content will have a formal meaning, and we will draw from existing experience and standards for defining this so that it is as widely compatible as possible.
In summary, it is not about endorsing or rejecting a particular ontology language. We will be open and inclusive with what we support, and user requirements will be the main guideline for defining "what can be said" in the system.
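To illustrate the export point, here is a minimal sketch, assuming Python with rdflib and entirely invented item/property URIs (not the real Wikidata vocabulary), of an RDF export fragment in which an item carries an owl:sameAs link to an external resource:

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import OWL, RDFS

# Hypothetical namespaces standing in for whatever the real export uses.
ITEM = Namespace("http://example.org/wikidata/item/")
PROP = Namespace("http://example.org/wikidata/prop/")

g = Graph()
g.bind("owl", OWL)
g.bind("item", ITEM)
g.bind("prop", PROP)

berlin = ITEM["Q64"]
g.add((berlin, RDFS.label, Literal("Berlin", lang="en")))
g.add((berlin, PROP["population"], Literal(3500000)))

# The owl:sameAs link to an external database, exposed in the export only.
g.add((berlin, OWL.sameAs, URIRef("http://dbpedia.org/resource/Berlin")))

print(g.serialize(format="turtle"))

A consumer who wants to do integration or inferencing can follow that link; Wikidata itself would not pull data across it for internal query answering, as noted above.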
Best regards,
Markus
On Sun, 1 Apr 2012, Markus Krötzsch wrote:
In summary, it is not about endorsing or rejecting a particular ontology language. We will be open and inclusive with what we support, and user requirements will be the main guideline for defining "what can be said" in the system.
This sounds good to me. (Not least because this would allow data representations that are optimized for certain classes of knowledge, e.g., Topic Maps, or mathematical/physical relationships.)
But it triggers the obvious question: when and how will such discussions (and decision making) be done in the course of the coming year?
Best regards,
Herman Bruyninckx