Dear users, developers and all people interested in semantic wikis,
We are happy to announce SMWCon Fall 2013, the 8th Semantic MediaWiki Conference:
* Dates: October 28th to October 30th 2013 (Monday to Wednesday)
* Location: A&O Berlin Hauptbahnhof, Lehrter Str. 12, 10557 Berlin, Germany
* Conference wikipage: https://semantic-mediawiki.org/wiki/SMWCon_Fall_2013
* Participants: Everybody interested in semantic wikis, especially in
Semantic MediaWiki, e.g., users, developers, consultants, business
SMWCon Fall 2013 will be supported by the Open Semantic Data
Association e. V. Our platinum sponsor will be WikiVote Ltd.
Following the success of recent SMWCons, we will have one tutorial day
and two conference days.
Participating in the conference: To help us plan, you can already
informally register on the wikipage, although a firm registration will
be needed later.
Contributing to the conference: If you want to present your work at
the conference, please go to the conference wikipage and add your talk
there. To create an attractive program for the conference, we will
later ask you for further information about your proposals.
Tutorials and presentations will be video and audio recorded and will
be made available for others after the conference.
==Among others, we encourage contributions on the following topics==
===Applications of semantic wikis===
* Semantic wikis for enterprise workflows and business intelligence
* Semantic wikis for corporate or personal knowledge management
* Exchange on business models with semantic wikis
* Lessons learned (best/worst practices) from using semantic wikis
* Semantic wikis in e-science, e-learning, e-health, e-government
* Semantic wikis for finding a common vocabulary among a group of people
* Semantic wikis for teaching students about the Semantic Web
* Offering incentives for users of semantic wikis
===Development of semantic wikis===
* Semantic wikis as knowledge base backends / data integration platforms
* Comparisons of semantic wiki concepts and technologies
* Community building, feature wishlists, roadmapping of Semantic MediaWiki
* Improving user experience in a semantic wiki
* Speeding up semantic wikis
* Integrations and interoperability of semantic wikis with other
applications and mashups
* Modeling of complex domains in semantic wikis, using rules, formulas etc.
* Access control and security aspects in semantic wikis
* Multilingual semantic wikis
If you have questions, you can contact me (Yury Katkov, Program Chair),
Benedikt Kämpgen (General Chair) or Karsten Hoffmeyer (Local Chair)
by e-mail (Cc).
Hope to see you in Berlin!
Yury Katkov, Program Chair
Suppose I publish a web page about a notable person, building or other
entity; and that the subject has a Wikidata entry.
What's the best meta header to assert that the page is about the same
subject as the Wikidata entry?
I'm thinking of something like:
<link rel="foo" href="https://www.wikidata.org/wiki/Q42">
<meta name="DC.bar" content="https://www.wikidata.org/wiki/Q42">
but what values of foo or bar?
Given the likely ubiquity of Wikidata in the near future, should we
mint a Wikidata-specific relation, or a more generic:
<link rel="datasource" ...>?
Are there any such headers already in the wild? Should Wikipedia
articles and pages on sister projects include such headers?
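For illustration only (this is not from the thread, and whether it best answers the question is open): one option that already exists in the wild is schema.org's `sameAs` property, whose documented purpose is to point at a URL that unambiguously identifies the item, e.g. a Wikidata page. Expressed as embedded JSON-LD in the page head, assuming the page is about Douglas Adams (Wikidata item Q42):

```html
<!-- Sketch: schema.org markup linking this page's subject to a Wikidata item.
     The sameAs property is a real schema.org term; the surrounding page
     structure here is illustrative. -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Person",
  "name": "Douglas Adams",
  "sameAs": "https://www.wikidata.org/wiki/Q42"
}
</script>
```

The same assertion can also be made with RDFa, e.g. `<link property="sameAs" href="https://www.wikidata.org/wiki/Q42">` inside an element typed with a schema.org vocabulary.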
I am happy to announce the very first release of Wikidata Toolkit,
the Java library for programming with Wikidata and Wikibase. This
initial release can download and parse Wikidata dump files for you, so
as to process all Wikidata content in a streaming fashion. An example
program is provided. The library can also be used with MediaWiki
dumps generated by other Wikibase installations (if you happen to work
in EAGLE ;-).
Maven users can get the library directly from Maven Central (see );
this is the preferred method of installation. There is also an
all-in-one JAR at GitHub and of course the sources.
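As a sketch of what the Maven setup looks like (the exact group and artifact IDs below are assumptions, not taken from this announcement; check Maven Central for the authoritative coordinates):

```xml
<!-- pom.xml fragment; groupId and artifactId are assumed, verify on Maven Central -->
<dependency>
  <groupId>org.wikidata.wdtk</groupId>
  <artifactId>wdtk-dumpfiles</artifactId>
  <version>0.1.0</version>
</dependency>
```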
Version 0.1.0 is of course alpha, but the code that we have is already
well-tested and well-documented. Improvements that are planned for the
next release include:
* Faster and more robust loading of Wikibase dumps
* Support for various serialization formats, such as JSON and RDF
* Initial support for Wikibase API access
Nevertheless, you can already give it a try now. Later releases are
also planned to support more advanced processing after loading,
especially storing and querying the data.
Feedback is welcome. Developers are also invited to contribute via github.
(You'll also need to install the third-party dependencies manually when
building from source.)
I'm sorry - I really, properly am - for this spamming, but it's also
something that might interest the Wikidata development team.
A couple of other users and I (if selected) are going to give a
presentation at Wikimania 2014 about a project conducted by Wikimedia
Italy and the Europeana network of Ancient Greek and Latin Epigraphy
(EAGLE). The full description is here:
TL;DR: Wikimedia Italy and EAGLE are using Wikibase extensions to
build up a database about Ancient Greek and Latin epigraphy,
getting the data from various universities and institutions... and the
thing is working! :) Of course, the data are released under CC0, and
there are also plans to donate them to the Wikimedia community once
the Commons-Wikidata integration is completed.
This should also be the first project outside the WMF perimeter to use
Wikibase in this way (GerardM, please correct me if I'm wrong).
If you're interested in it, you might want to take a peek at it. :)
Sorry again for spamming!
Luca "Sannita" Martinelli
Some weeks ago, Anja asked me to send an email to this list with requirements from DBpedia regarding the PubSubHubbub feed.
We are really happy that finally somebody started working on this.
The main thing DBpedia needs is software to create an up-to-date mirror of each language version of Wikipedia. All other requirements can be deduced from this one. It would be bad for us if this were out of scope or not working correctly at the end of the project.
All the best,
Sent from my Android device with K-9 Mail. Please excuse my brevity.
I have applied for GSoC 2014 with MediaWiki, aiming to create a plugin
that can annotate statements on various websites and feed them to
Wikidata as statements, with references taken from the website URL and
author in the case of Google Books, Wikisource, etc.
Project Proposal is currently hosted at :
I need to get more feedback from the community, such as what extra
features are expected from the tool that could make this project more
useful. I hope the project is considered useful by the community.
So kindly take a look at the proposal and provide any valuable
comments you have.
This mail has been crossposted to wikitech-l and wikidata-tech
Hey everyone :)
Since Adam's internship with us is coming to an end we are looking for
a new amazing intern or working student to help us out around
Wikidata. You can find all the details at
http://wikimedia.de/wiki/Working_student_Wikidata_(f/m). If you have
any questions please let me know.
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.