Is there a better way to access more entries in the "In
Other Languages" section than changing the interface
language with ?setlang= ?
I had an item for which I wanted to add the Italian name
(it was mentioned in the English Wikipedia article), and
the only way I could find to reach the form field for the
Italian label was to switch the Wikidata interface to Italian.
I didn't see anything immediately upon Googling.
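For what it's worth, the ?setlang= trick can be captured as a tiny URL helper (a sketch only; `with_setlang` is not an existing tool, and the related `uselang=` parameter changes the language for a single page view):

```python
from urllib.parse import urlencode

def with_setlang(base_url, lang):
    """Append a ?setlang= parameter so the page renders with that
    interface language, exposing its label/description form fields."""
    sep = "&" if "?" in base_url else "?"
    return f"{base_url}{sep}{urlencode({'setlang': lang})}"

print(with_setlang("https://www.wikidata.org/wiki/Q42", "it"))
# https://www.wikidata.org/wiki/Q42?setlang=it
```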
Thank you,
Derric Atzrott
Wikimedia Finland is looking for a Wikidata developer/advocate to
present Wikidata at the Avoin Suomi 2014 fair http://avoinsuomi2014.fi/
on September 15–16. The event is organized by the Prime Minister's
Office in
collaboration with several public sector actors and open knowledge
organizations.
Wikimedia Finland has a booth, and will present projects in GLAM and
education, and the Wikimedia sister projects with focus on Wikidata.
To take advantage of the presence of a skilled Wikidatan, we plan to
arrange a hands-on Wikidata workshop before or after the event.
The closer you are to Finland the better, but Finnish language is not a
requirement.
Please contact susanna.anas(a)wikimedia.fi if you are interested. Feel
free to forward this to anyone you think might be interested.
Looking forward to talking with you!
Susanna
--
Susanna Ånäs – Käyttäjä:Susannaanas
Wikimedia Suomi <http://wikimedia.fi/> – Wikimaps
<http://wikimaps.wikimedia.fi/> – GLAM
<http://fi.wikipedia.org/wiki/Wikipedia:GLAM>
@ <https://twitter.com/WMFinland>WMFinland <https://twitter.com/WMFinland>
/ Facebook <https://www.facebook.com/WikimediaSuomi> / Liity jäseneksi!
(Join as a member!) <http://fi.wikimedia.org/wiki/Liity_j%C3%A4seneksi>
Hi all,
We have a lot of statements saying that something is an instance of a
"Wikipedia disambiguation page" (Q4167410). Unfortunately, this kind of
information describes a particular Wikipedia article in a particular
language, and is often not true for other languages.
Moreover, even if there is a language where the corresponding article
is marked as a disambiguation page, it is still common that the page
actually describes a real item.
Example (bad use of instance of:Wikipedia disambiguation page)
==============================================================
https://www.wikidata.org/wiki/Q247819 (VW Polo)
Enwiki (like many languages) has a normal article here that is not a
disambiguation page. It says "The Volkswagen Polo is a supermini car
produced by the German manufacturer Volkswagen". That's very different
from "The Volkswagen Polo is a disambiguation page."
Even Wikipedias where the VW-Polo article is marked as disambiguation
page do not claim that the thing they are talking about is the
disambiguation page. For instance, frwiki has the article in
Catégorie:Homonymie, yet it says:
"Volkswagen Polo est une automobile, de la gamme des polyvalentes, de
la marque allemande Volkswagen" ("The Volkswagen Polo is a car, in the
supermini segment, from the German brand Volkswagen").
Again, it is not said that VW Polo is a disambiguation page, even though
the page (not the car) is marked as one.
Proper use of instance of:Wikipedia disambiguation page
=======================================================
Now there are also many proper disambiguation pages. They share no
common concept other than the ambiguous title in a particular language.
Examples:
https://en.wikipedia.org/wiki/Jaguar_(disambiguation)
and, entertainingly:
https://en.wikipedia.org/wiki/Disambiguation_(disambiguation)
An item that is "instance of:Wikipedia disambiguation page":
* should not have sitelinks to pages that are not disambiguation pages
(an item can either be about a Wikipedia page or about a car, but these
should be kept separate),
* should always use the exact page title as the label (because this is
the real label of the page; the page "Jaguar (disambiguation)" is not
called "Jaguar" by anybody),
* should hardly have any statements at all, since there is almost
nothing that you can truthfully say about a group of pages in many
different languages, and since we want to avoid project-specific
statements (that's one reason we have badges as part of site links).
Whether disambiguation pages should have more than a single sitelink at
all is another question. In my view, if we are talking about a "page",
it is not the same page in French as it is in English (most properties
that pages could naturally have, such as authors, language, creation
date, etc. apply to a single page only). However, I can see that it is
practical to group such pages nonetheless.
Conclusion
==========
It would be nice if somebody could analyse this problem in more detail
(how many of our "disambiguation page" items have statements that are
obviously not about a page but about a car make, animal, etc.). We might
need some manual effort to clean this up (basically, a kind of
un-merging game).
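One way to start such an analysis can be sketched in Python, assuming entity records in the (simplified) Wikibase JSON layout used by wbgetentities and the dumps; `suspicious_properties` is a hypothetical helper name, not an existing tool:

```python
# Numeric ID of Q4167410 ("Wikipedia disambiguation page")
DISAMBIG = 4167410

def is_disambig_item(entity):
    """True if the entity has an instance of (P31) claim pointing to Q4167410."""
    for claim in entity.get("claims", {}).get("P31", []):
        value = claim.get("mainsnak", {}).get("datavalue", {}).get("value", {})
        if value.get("numeric-id") == DISAMBIG:
            return True
    return False

def suspicious_properties(entity):
    """Properties other than P31 on an alleged disambiguation item --
    candidates for the un-merging clean-up described above."""
    if not is_disambig_item(entity):
        return []
    return [p for p in entity.get("claims", {}) if p != "P31"]

# Toy record: a "disambiguation page" item that also carries a
# manufacturer (P176) statement -- clearly about a car, not a page.
toy = {"claims": {
    "P31":  [{"mainsnak": {"datavalue": {"value": {"numeric-id": 4167410}}}}],
    "P176": [{"mainsnak": {"datavalue": {"value": {"numeric-id": 246}}}}],
}}
print(suspicious_properties(toy))  # ['P176']
```

Run over a full JSON dump, a count of non-empty results would give a first estimate of how many items need un-merging.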
The immediate conclusion is that we need to be much more careful when
importing this type of information from a single Wikipedia, since it is
(by its very nature) project-specific and not universal across languages.
Cheers,
Markus
https://www.wikidata.org/wiki/Wikidata:WikiProject_Structured_Data_for_Comm…
The aims of the WikiProject Structured Data for Commons are:
* To develop templates that draw directly on Wikidata (and in future
also on Commons Wikibase), that will act as drop-in replacements for
templates currently in use on Commons.
* To develop new templates that can bring new functionality to
Commons filepages (e.g. "topics" listings)
* To support the cataloguing of particularly idiosyncratic templates
currently in use on Commons (e.g. institutional credit/backlink
templates, and other source templates), and try to produce more
generalised, standardised forms that can draw on Wikidata.
* To work with other WikiProjects on Wikidata to understand, document
and develop the data models on Wikidata, and make sure that they are
sufficient to accommodate the needs of GLAM organisations and others
currently or in future uploading or maintaining metadata on Commons.
* To start porting such existing data, where it can be represented in
structured form and it is appropriate to do so, from Commons to Wikidata
* To examine the divide between what should be stored on Wikidata and
what should be stored on the proposed Commons Wikibase.
* To support, as a user-space community, the work of the staffers
developing Commons Wikibase and other aspects of the Foundation
initiative for Structured Data for Commons in any way we can.
Sign up now!
Talk-page comments on, or wholesale re-editing of, this essay at
https://commons.wikimedia.org/wiki/Commons:Wikidata/How_GLAMs_can_help_the_…
are also very welcome.
-- J.
Hey folks :)
As announced last week, we have just deployed a number of new features:
* Wikinews is now able to manage its sitelinks via Wikidata.
https://www.wikidata.org/wiki/Wikidata:Wikinews for
questions/coordination/...
* Wikidata is now also its own client. This means you can, for
example, add a sitelink to Wikidata:Help on the item for all main
help pages. You can also make use of the data in items on other
pages within Wikidata via Lua. (Arbitrary access has been enabled
for Wikidata for this, but when data in an item changes we are not
yet able to purge the pages using that data.)
* Sitelinks for projects with just one sitelink in the group (like
Commons, Wikidata and in the future Meta for example) are now grouped
together in one sitelink group.
* Badges can be stored on Wikidata right next to the sitelink. For
now we have badges for featured and good articles; more can be added
on request later. Thanks to Bene* and lazowik for this feature.
* Redirects between items can now be created. When two items are
merged, one of them can be turned into a redirect. This way our
identifiers can be considered much more stable by third parties, and
it is no longer necessary to delete duplicate items, which will
reduce the workload of our admins considerably.
* We have a new datatype: monolingual text. It allows you to make
statements with a string and an associated language.
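For the new monolingual text datatype, the value is a small JSON object pairing the string with a language code. A minimal sketch under that assumption (`monolingual_value` is a hypothetical helper, not part of any API):

```python
import json

def monolingual_value(text, language):
    """Build the datavalue for a monolingualtext snak: the string
    plus the language code it is in."""
    return {"text": text, "language": language}

# API modules such as wbcreateclaim take the value serialized as JSON:
value = json.dumps(monolingual_value("Ludwig van Beethoven", "de"))
print(value)
```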
Known issues/limitations:
* Redirects can so far only be created via the API
* Arbitrary access on Wikidata to the data on Wikidata itself is only
possible via Lua. The parser function still needs to be adapted.
* Badges cannot yet be shown on the Wikipedias etc. This will follow next week.
* Diffs for badges changes have a link to a wrong target
(https://bugzilla.wikimedia.org/show_bug.cgi?id=69758)
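Since redirects are API-only for now, here is a hedged sketch of the parameters for the wbcreateredirect module; the item IDs are arbitrary examples, `redirect_params` is a hypothetical helper, and a real edit additionally needs a valid edit token and must be sent as a POST:

```python
from urllib.parse import urlencode

def redirect_params(from_id, to_id, token):
    """Parameter set for the wbcreateredirect API module, which turns
    the (emptied or duplicate) item from_id into a redirect to to_id."""
    return {
        "action": "wbcreateredirect",
        "from": from_id,
        "to": to_id,
        "token": token,
        "format": "json",
    }

# The request would be POSTed to https://www.wikidata.org/w/api.php
query = urlencode(redirect_params("Q111", "Q42", "dummy-token"))
```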
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
There was an RFC last year about interwiki linking to redirect pages
on the various Wikipedias:
https://www.wikidata.org/wiki/Wikidata:Requests_for_comment/A_need_for_a_re…ion_regarding_article_moves_and_redirects#new_Proposal_zero
Any idea what happened to that? Was a bug opened eventually, or was
there another discussion where it was decided not to do it? Or is it
still a work in progress?
Thank you,
Derric Atzrott
Hey folks :)
Just an update on badges support on Wikidata. We will be rolling out
support for badges on Wikidata on August 19th. At this point you will
be able to store the information that a given article is a good or
featured article on English Wikipedia for example. More badges can be
added on request. One week later we will enable showing those badges
on Wikipedia/Wikisource/Wikiquote in the language links in the
sidebar.
If your Wikipedia wants them removed from the wikitext, please get in
touch with Amir; he has a bot to do it for you.
You can try out the Wikidata-side of it on our test system:
https://test.wikidata.org/wiki/Q296
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
Apologies for cross-posting.
This is a friendly reminder about the weekly Google Hangout to prepare
for the LIDER Hackathon in Leipzig (September 1st). The preparation
Hangouts take place every Tuesday at 2 pm Leipzig time until the event.
Links to join can be found here:
http://mlode2014.nlp2rdf.org/hackathon/
You can still submit topics for hacking. Please add them to this
document:
https://docs.google.com/document/d/13riJU5LY50Q6AeHzlkIqln9enlq9a1EDsOyp6C4…
or send an email to Bettina Klimek <klimek(a)informatik.uni-leipzig.de>
Currently we have the confirmed topics below. Furthermore, we have
experts available who will help you get started with Linked Data and
RDF and help you bring your own tools into the Semantic Web world.
T7: [Confirmed] Roundtrip conversion from TBX2RDF and back
The idea is to work on a roundtrip conversion between the TBX standard
for representing terminology and RDF, building on the existing code at
Bitbucket:
https://bitbucket.org/vroddon/tbx2rdf
Potential industry partner: TILDE (Tatiana)
Source code: https://bitbucket.org/vroddon/tbx2rdf
TBX Standard: http://www.ttt.org/oscarstandards/tbx/
Contact person: Philipp Cimiano, John McCrae, Victor Rodriguez-Doncel
T8: [Confirmed] Converting multilingual dictionaries as LD on the Web
The experience of creating the Apertium RDF
<http://linguistic.linkeddata.es/apertium/> dictionaries will be
presented. Taking a bilingual dictionary represented in LMF/XML as the
starting point, a mapping into RDF was made using tools such as Open
Refine <http://openrefine.org/>. From each bilingual dictionary, three
components (graphs) were created in RDF: two lexicons and a translation
set. The vocabularies used were lemon <http://lemon-model.net/> for
representing lexical information and the translation module
<http://linguistic.linkeddata.es/def/translation/> for representing
translations. Once they were published on the Web, some immediate
benefits arose, such as: automatic enrichment of the monolingual
lexicons each time a new dictionary is published (thanks to URI reuse),
simple graph-based navigation across the lexical information and, more
interestingly, simple querying across (initially) independent
dictionaries.
The task could be either to reproduce part of the Apertium generation
process, for those willing to learn about lemon and about techniques for
representing translations in RDF, or to repeat the process with other
input data (bilingual or multilingual lexica) provided by participants.
Contact person: Jorge Gracia
T9: [Confirmed] Based on the NIF-LD output of Babelfy we can try to
deploy existing RDF visualizations out of the box and query the
output with SPARQL
Babelfy <http://babelfy.org/> is a unified, multilingual, graph-based
approach to Entity Linking and Word Sense Disambiguation. Based on a
loose identification of candidate meanings, coupled with a
densest-subgraph heuristic that selects high-coherence semantic
interpretations, Babelfy is able to annotate free text with both
concepts and named entities drawn from BabelNet
<http://www.babelnet.org/>'s sense inventory.
The task consists of converting text annotated by Babelfy into RDF
format. In order to accomplish this, participants will start from free
text, will annotate it with Babelfy and will eventually make use of the
NLP2RDF NIF module <http://site.nlp2rdf.org/>. Data can also be
displayed using visualization tools such as RelFinder
<http://www.visualdataweb.org/relfinder/relfinder.php>.
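For orientation, a hand-written sketch of what such NIF output might look like; the document URI, offsets and the BabelNet synset ID here are made up for illustration:

```turtle
@prefix nif:    <http://persistence.uni-leipzig.org/nlp2rdf/ontologies/nif-core#> .
@prefix itsrdf: <http://www.w3.org/2005/11/its/rdf#> .

# The whole input text, addressed by character offsets.
<http://example.org/doc#char=0,21>
    a nif:Context ;
    nif:isString "Jaguars are big cats." .

# One annotation produced by the entity linker.
<http://example.org/doc#char=0,7>
    a nif:Phrase ;
    nif:referenceContext <http://example.org/doc#char=0,21> ;
    nif:anchorOf "Jaguars" ;
    nif:beginIndex 0 ;
    nif:endIndex 7 ;
    itsrdf:taIdentRef <http://example.org/babelnet/synset-placeholder> .
```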
Contact person: Tiziano Flati (flati(a)di.uniroma1.it
<mailto:flati@di.uniroma1.it>), Roberto Navigli (navigli(a)di.uniroma1.it
<mailto:navigli@di.uniroma1.it>)
--
Sebastian Hellmann
AKSW/NLP2RDF research group
Institute for Applied Informatics (InfAI) and DBpedia Association
Events:
* *Sept. 1-5, 2014* Conference Week in Leipzig, including
** *Sept 2nd*, MLODE 2014 <http://mlode2014.nlp2rdf.org/>
** *Sept 3rd*, 2nd DBpedia Community Meeting
<http://wiki.dbpedia.org/meetings/Leipzig2014>
** *Sept 4th-5th*, SEMANTiCS (formerly i-SEMANTICS) <http://semantics.cc/>
Come to Germany as a PhD student: http://bis.informatik.uni-leipzig.de/csf
Projects: http://dbpedia.org, http://nlp2rdf.org,
http://linguistics.okfn.org, https://www.w3.org/community/ld4lt
<http://www.w3.org/community/ld4lt>
Homepage: http://aksw.org/SebastianHellmann
Research Group: http://aksw.org
Thesis:
http://tinyurl.com/sh-thesis-summary
http://tinyurl.com/sh-thesis
Hoi,
Amir has created functionality that compares data about "humans" from
en.wp, de.wp and it.wp, and it only shows differences where they
exist. It compares those Wikipedias with the information in Wikidata.
The idea is that the report will be updated regularly.
The problem we face is: what should it actually look like? Should it
just splatter the info on a page, or is more needed? At this time we
just have raw data [1].
Please help us with something that works easily for now. Once we have
something, it can be prettified and more functional.
Thanks,
GerardM
[1] http://paste.ubuntu.com/8079742/
Hi Micru and glad to meet you,
On 8/16/14, 2:00 PM, wikidata-l-request(a)lists.wikimedia.org wrote:
> Hi Marco,
>
> Thanks for getting in touch. Looking at the list there seems to be many
> wrong mappings.
Sure: the list is the result of the first automatic matching attempts,
and currently the only human validator is the GSoC student.
> Is there any way that we can collaborate to increase the number of matches
> or does it have to do with qualifiers?
Of course, and we definitely should collaborate!
Both the Wikidata and the DBpedia communities can strongly benefit from
each other.
The DBpedia ontology is based on a wiki as well, i.e., the mappings wiki
[1].
All you have to do is to create an account and request editor rights on
our mailing list [2] (please subscribe first).
> And do you plan to document 1:1
> matches on your OntologyProperty namespace?
Sure, we will add an 'owl:equivalentProperty' assertion pointing to the
matching Wikidata property.
BTW, it is already partially implemented for classes too, e.g. 'Person' [3].
This is reflected in the data. See for instance the 'rdf:type' property
in this Italian chapter example [4], which has links to Wikidata classes.
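For illustration, such assertions might look like the following in Turtle; the dbo:birthDate / P569 pairing is an assumed example, not taken from the actual mappings wiki:

```turtle
@prefix owl:      <http://www.w3.org/2002/07/owl#> .
@prefix dbo:      <http://dbpedia.org/ontology/> .
@prefix wikidata: <http://www.wikidata.org/entity/> .

# Property mapping: DBpedia's birthDate aligned with Wikidata's
# date of birth (P569).
dbo:birthDate  owl:equivalentProperty  wikidata:P569 .

# Class mapping, as discussed for OntologyClass:Person:
dbo:Person  owl:equivalentClass  wikidata:Q5 .
```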
Cheers!
[1] http://mappings.dbpedia.org
[2] dbpedia-discussion(a)lists.sourceforge.net
[3] http://mappings.dbpedia.org/index.php/OntologyClass:Person
[4] http://it.dbpedia.org/resource/Joey_Ramone
>
> Thanks and regards,
> Micru
--
Marco Fossati
http://about.me/marco.fossati
Twitter: @hjfocs
Skype: hell_j