Here at the Wikimania conference, there is a lot of talk
about Wikidata and Wikibase coming to Commons, but it is
not clear how or when. Since it is all still very abstract, it
is not clear whether one should be happy or scared.
I see that interwiki links for Commons categories can in
some cases be covered by Wikidata. I just removed all the
interwiki links and "Sister Wikipedia" links from the
Category:Sweden. The result was not obvious at first, as
all interwiki links seem to still be there, now pulled
from Wikidata. But now they go to Category:Sweden on
Wikipedia and not to the article Sweden.
https://commons.wikimedia.org/w/index.php?title=Category%3ASweden&diff=1311…
Is this something we should start to do on a large scale?
Or something we should avoid at all costs, because we
prefer to link Commons categories to Wikipedia articles?
It is confusing that even though Wikidata claims to represent
concepts, the country Sweden has two different items:
one for the Wikipedia article, https://www.wikidata.org/wiki/Q34,
and one for the category, https://www.wikidata.org/wiki/Q4368475,
both representing the same country.
Can I find out how many pages on Commons pull interwiki
links from Wikidata? Did this count increase by one,
when I made that edit?
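One way to answer this would be a sketch like the following, which assumes Commons exposes the `wikibase_item` page prop the way other Wikidata client wikis do, and uses the standard MediaWiki `list=pageswithprop` API module to count pages connected to a Wikidata item:

```python
# Sketch: count Commons pages that carry the "wikibase_item" page prop
# (an assumption: Commons sets this prop like other Wikidata client wikis).
import json
import urllib.parse
import urllib.request

API = "https://commons.wikimedia.org/w/api.php"

def build_query_url(continue_from=None):
    """Build an API URL listing pages that have the wikibase_item prop."""
    params = {
        "action": "query",
        "list": "pageswithprop",
        "pwppropname": "wikibase_item",
        "pwplimit": "max",
        "format": "json",
    }
    if continue_from:
        params["pwpcontinue"] = continue_from
    return API + "?" + urllib.parse.urlencode(params)

def count_pages(fetch=lambda url: urllib.request.urlopen(url).read()):
    """Page through the results and count them. `fetch` is injectable so
    the paging logic can be exercised without network access."""
    total, cont = 0, None
    while True:
        data = json.loads(fetch(build_query_url(cont)))
        total += len(data["query"]["pageswithprop"])
        cont = data.get("continue", {}).get("pwpcontinue")
        if not cont:
            return total
```

Running the count before and after such an edit would show whether it increased by one.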
The category page also contains short descriptions in a
handful of languages (en, de, fr, it, ja, nb, sv, fi, uk),
but these could also be better served from Wikidata.
What do I write in the Category:Sweden page to pull the
short text descriptions from Wikidata?
Wikidata currently has short text descriptions for the category
item in only four languages (da, de, nb, sv), and they differ
from the short text descriptions on Commons. For example,
in nb (Norwegian bokmål), Wikidata says "Kategori:Sverige"
whereas Commons only says "Sverige". Shouldn't all
categories that pull their interwiki links from Wikidata
also pull the short text descriptions from Wikidata?
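For comparing the two sets of descriptions, a sketch using the Wikidata `wbgetentities` API (the item id Q4368475 is the category item mentioned above; the parsing assumes the API's usual JSON shape):

```python
# Sketch: fetch the short descriptions Wikidata holds for an item,
# e.g. the Sweden category item (Q4368475), to compare them with the
# descriptions kept locally on the Commons category page.
import json
import urllib.parse
import urllib.request

API = "https://www.wikidata.org/w/api.php"

def descriptions_url(item_id):
    """Build a wbgetentities URL asking only for the item's descriptions."""
    params = {
        "action": "wbgetentities",
        "ids": item_id,
        "props": "descriptions",
        "format": "json",
    }
    return API + "?" + urllib.parse.urlencode(params)

def parse_descriptions(raw, item_id):
    """Extract {language code: description} from a wbgetentities response."""
    entity = json.loads(raw)["entities"][item_id]
    return {lang: d["value"] for lang, d in entity.get("descriptions", {}).items()}

# Live usage (network):
# parse_descriptions(urllib.request.urlopen(descriptions_url("Q4368475")).read(), "Q4368475")
```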
Since this category represents a geographic entity, its
center coordinate should also be given as an "Object
location", which is currently missing.
https://commons.wikimedia.org/wiki/Template:Object_location
But perhaps that information should rather be inserted
into Wikidata? Even better than the center coordinate would
be the border outline from OpenStreetMap. In OSM this is
relation 52822, which also provides the name in many
different languages,
www.openstreetmap.org/relation/52822
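The multilingual names mentioned above live in the relation's `name:*` tags, which the public OSM v0.6 API returns as XML. A sketch for reading them out:

```python
# Sketch: extract the multilingual "name:*" tags from an OSM relation,
# e.g. relation 52822 (Sweden), via the public OSM v0.6 API.
import urllib.request
import xml.etree.ElementTree as ET

OSM_URL = "https://www.openstreetmap.org/api/0.6/relation/52822"

def names_by_language(xml_text):
    """Extract {language code: name} from a relation's name:* tags."""
    root = ET.fromstring(xml_text)
    names = {}
    for tag in root.iter("tag"):
        key = tag.get("k", "")
        if key.startswith("name:"):
            names[key.split(":", 1)[1]] = tag.get("v")
    return names

# Live usage (network):
# names_by_language(urllib.request.urlopen(OSM_URL).read().decode())
```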
--
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
Hi everyone,
I'm starting a new project. Let's get all paintings on Wikidata! I could
use some help. I already imported 4 museums in the Netherlands, but
that's of course just a start. More information at
https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings .
Maarten
Dear colleagues and fellow enthusiasts of the semantic web,
we are pleased to announce that the SEMANTiCS conference on Sep 4-5,
2014 in Leipzig, will become a major industry conference on semantic web
and linked data applications in Europe.
With partners such as PoolParty, STI2, Eccenca, IntraFind, LOD2 Project,
MarkLogic, Ontos and Wolters Kluwer as well as more than 50% of our
currently registered attendees being from the industry sector, SEMANTiCS
has lifted the primarily academic topic of the semantic web to the next
level of business application.
From our keynote speakers Sofia Angeletou (BBC), Thomas Kelly
(Cognizant), Phil Archer (W3C) and Orri Erling (OpenLink) to 40+
speakers on 5 parallel tracks and special events like the vocabulary
carnival, H2020 networking event and conference dinner, SEMANTiCS offers
a wide variety of industry insights and networking chances. You can see
our programme here: http://www.semantics.cc/programme/
Hence, this year's SEMANTiCS conference is your chance to get in touch
with potential business clients and industry partners to push your own
projects and developments in the semantic web sector.
You can still submit to the Vocabulary Carnival:
http://www.semantics.cc/vocarnival/ as well as the H2020 networking
session. Furthermore, we will collect and print your H2020 organisation
profile description in the program guide, so you can be approached at
the conference for potential projects.
Being a fellow enthusiast in this future-defining field, we'd like to
offer you a special discount of 20% on your ticket to the conference.
Simply go to www.semantics.cc/registration/discount and claim your
discount with the following promo code: “semantic-web-fellow”
This offer is valid until 15th of August.
For further information on the programme and our keynote speakers, please
visit www.semantics.cc
Feel free to forward this email to anyone interested.
See you in Leipzig,
Sebastian Hellmann
on behalf of all conference committee members
--
Sebastian Hellmann
AKSW/NLP2RDF research group
Institute for Applied Informatics (InfAI) and DBpedia Association
Events:
* *Sept. 1-5, 2014* Conference Week in Leipzig, including
** *Sept 2nd*, MLODE 2014 <http://mlode2014.nlp2rdf.org/>
** *Sept 3rd*, 2nd DBpedia Community Meeting
<http://wiki.dbpedia.org/meetings/Leipzig2014>
** *Sept 4th-5th*, SEMANTiCS (formerly i-SEMANTICS) <http://semantics.cc/>
Come to Germany as a PhD student: http://bis.informatik.uni-leipzig.de/csf
Projects: http://dbpedia.org, http://nlp2rdf.org,
http://linguistics.okfn.org, https://www.w3.org/community/ld4lt
<http://www.w3.org/community/ld4lt>
Homepage: http://aksw.org/SebastianHellmann
Research Group: http://aksw.org
Thesis:
http://tinyurl.com/sh-thesis-summary
http://tinyurl.com/sh-thesis
Hello everyone!
To help you plan your Wikimania 2014 visit and to ensure you have all
things Wikidata, Lydia asked me to compile a list of all Wikidata-related
events currently planned for Wikimania. It also includes a list of the
Wikimedia Deutschland staff attending, all of whom you should say hi to
and thank for their awesome work on this project.
You can see the list at
https://wikimania2014.wikimedia.org/wiki/User:John_F._Lewis/Wikidata_Events
If I missed anything - please add! Also add your name to the community
members list if you are attending so everyone can meet everyone.
Thanks,
John Lewis
Hi everybody,
We are happy to announce an experimental RDF dump of Wikimedia Commons. A complete first draft is now available online at http://nl.dbpedia.org/downloads/commonswiki/20140705/, and will eventually be accessible from http://commons.dbpedia.org. A small sample dataset, which may be easier to browse, is available on GitHub at https://github.com/gaurav/commons-extraction/tree/master/commonswiki/201401…
The following datasets showcase some of the improvements that we’ve been working on over the last two months:
- File information (*-file-information.*) is a completely new dataset that contains information on the files in the Commons, including file and thumbnail URLs, file extensions, file type classes and MIME types.
- DBpedia’s Mappings Extractor (*-mappingbased-properties.*) uses templates stored on the Mapping server (http://mappings.dbpedia.org/) to create RDF for information-rich templates. This system still has some important limitations, such as not being able to process embedded templates (e.g. license templates inside {{Information}}), but top-level templates are completely configurable. The existing mappings are available at http://mappings.dbpedia.org/index.php/Mapping_commons
- This includes 363 license templates that indicate licensing for Commons files under public domain, Creative Commons and other open access licenses. These were created by bots and still require verification before use. They are listed at http://mappings.dbpedia.org/index.php/Category:Commons_media_license
- The DBpedia Geoextractor (*-geo-coordinates.*) now extracts geographical coordinates from Commons files using the {{Location}} template.
- The DBpedia SKOS Extractor (*-skos-categories.*) now identifies relationships between Commons categories, building a SKOS-based description of the entire Commons category tree.
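For a first look at any of the datasets above, a minimal reader may help. This sketch assumes the dump files are N-Triples (one `<s> <p> <o> .` statement per line, as is typical for DBpedia dumps) and tallies which predicates a file uses; the example statement below is illustrative, not taken from the actual dump:

```python
# Sketch: tally the predicates used in an N-Triples dump file, assuming
# the simple one-statement-per-line form "<s> <p> <o> ." (no blank nodes).
import collections
import re

# Matches subject IRI, predicate IRI, and the raw object (IRI or literal).
TRIPLE = re.compile(r'^<([^>]+)>\s+<([^>]+)>\s+(.+?)\s*\.\s*$')

def predicate_counts(lines):
    """Count how often each predicate IRI appears in the given lines."""
    counts = collections.Counter()
    for line in lines:
        m = TRIPLE.match(line)
        if m:
            counts[m.group(2)] += 1
    return counts

# A made-up example statement in the expected shape:
sample = [
    '<http://commons.dbpedia.org/resource/File:Example.jpg> '
    '<http://www.w3.org/1999/02/22-rdf-syntax-ns#type> '
    '<http://dbpedia.org/ontology/File> .',
]
# predicate_counts(open("commonswiki-file-information.nt")) would tally a full file.
```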
Please have a look and let us know what you think. We’ll be working on a number of open tasks over the next three weeks, listed at https://github.com/gaurav/extraction-framework/issues?state=open -- if you see something wrong with what we’ve done above, or have an issue you’d particularly like us to tackle, please report it there or drop me an e-mail!
This work is sponsored by the Google Summer of Code program
(https://www.google-melange.com/gsoc/project/details/google/gsoc2014/gaurav/…).
Thanks!
cheers,
The DBpedia Commons extraction team:
Gaurav Vaidya
Dimitris Kontokostas
Andrea Di Menna
Jimmy O’Regan
Hello,
I would like this mailing list's newsletter to be translatable and
delivered the same way as Tech News. I asked about it on its
subscription page's talk page, where Lydia said she doesn't believe
there would be any volunteers (besides me, at the moment) to
translate it.
So if you would like to translate the newsletter once that becomes
possible, could you leave a note at
[[:m:Talk:Global_message_delivery/Targets/Wikidata#Multilingual
distribution
<https://meta.wikimedia.org/wiki/Talk:Global_message_delivery/Targets/Wikida…>]]
about your willingness, so we can get Lydia convinced :) Poking friends
who might do it is welcome as well ;)
It would also be nice if Odder and/or Guillom could leave a note about
the technical side of such a delivery, like the one the latter kindly
left on the Education newsletter's talk page :)
Yours sincerely,
--Base