Dear users, developers and all people interested in semantic wikis,
We are happy to announce SMWCon Fall 2013 - the 8th Semantic MediaWiki
Conference:
* Dates: October 28th to October 30th 2013 (Monday to Wednesday)
* Location: A&O Berlin Hauptbahnhof, Lehrter Str. 12, 10557 Berlin, Germany
* Conference wikipage: https://semantic-mediawiki.org/wiki/SMWCon_Fall_2013
* Participants: Everybody interested in semantic wikis, especially in
Semantic MediaWiki, e.g., users, developers, consultants, business
representatives, researchers.
SMWCon Fall 2013 will be supported by the Open Semantic Data
Association e. V. [1]. Our platinum sponsor will be WikiVote ltd,
Russia [2].
Following the success of recent SMWCons, we will have one tutorial day
and two conference days.
Participating in the conference: To help us plan, you can already register
informally on the conference wiki page; a firm registration will be
required later.
Contributing to the conference: If you want to present your work at the
conference, please go to the conference wiki page and add your talk
there. To create an attractive program, we will later ask you for
further details about your proposals.
Tutorials and presentations will be recorded (video and audio) and made
available after the conference.
==We encourage contributions on the following topics, among others==
===Applications of semantic wikis===
* Semantic wikis for enterprise workflows and business intelligence
* Semantic wikis for corporate or personal knowledge management
* Exchange on business models with semantic wikis
* Lessons learned (best/worst practices) from using semantic wikis or
their extensions
* Semantic wikis in e-science, e-learning, e-health, e-government
* Semantic wikis for finding a common vocabulary among a group of people
* Semantic wikis for teaching students about the Semantic Web
* Offering incentives for users of semantic wikis
===Development of semantic wikis===
* Semantic wikis as knowledge base backends / data integration platforms
* Comparisons of semantic wiki concepts and technologies
* Community building, feature wishlists, roadmapping of Semantic MediaWiki
* Improving user experience in a semantic wiki
* Speeding up semantic wikis
* Integrations and interoperability of semantic wikis with other
applications and mashups
* Modeling of complex domains in semantic wikis, using rules, formulas etc.
* Access control and security aspects in semantic wikis
* Multilingual semantic wikis
If you have questions, you can contact me (Yury Katkov, Program Chair),
Benedikt Kämpgen (General Chair) or Karsten Hoffmeyer (Local Chair)
by e-mail (Cc).
Hope to see you in Berlin!
Yury Katkov, Program Chair
[1] http://www.opensemanticdata.org/
[2] http://wikivote.ru
Suppose I publish a web page about a notable person, building or other
entity, and the subject has a Wikidata entry.
What's the best meta header to assert that the page is about the same
subject as the Wikidata entry?
I'm thinking of something like:
<link rel="foo" href="https://www.wikidata.org/wiki/Q42">
or
<meta name="DC.bar" content="https://www.wikidata.org/wiki/Q42">
but what values of foo or bar?
Given the likely ubiquity of Wikidata in the near future, should we mint:
<link rel="wikidata"
or a more generic:
<link rel="datasource" ?
Are there any such headers already in the wild? Should Wikipedia
articles and pages on sister projects include such headers?
--
Andy Mabbett
@pigsonthewing
http://pigsonthewing.org.uk
Hey folks :)
From 9th to 11th of May there will be a [[mw:Zürich Hackathon
2014|MediaWiki hackathon in Zurich]]. Scholarships are available, and
some of them are given out by Wikimedia Germany for MediaWiki developers
who are from Germany, work on Wikidata, or work on the migration of
tools from Toolserver to Tool Labs. You can apply for a scholarship
directly via the registration form for the hackathon at
https://docs.google.com/forms/d/1nlrQ7cox36xaNK1u9iKP-thogY5TVrilOGJR79DqQ9…
The registration deadline is 16th of March. I’m looking forward to
seeing many of you there to work on Wikidata together. The whole
Wikidata dev team will attend.
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
(Society for the Promotion of Free Knowledge). Registered in the
register of associations of the Amtsgericht Berlin-Charlottenburg under
number 23855 Nz. Recognized as charitable by the Finanzamt für
Körperschaften I Berlin, tax number 27/681/51985.
Hello everybody! Glad to join the team!
I'm new to Wikidata, and I have some questions:
1. I thought Wikidata gathered its data from Wikipedia infoboxes, but I
find that statements and infoboxes do not correspond exactly: some
infobox data is missing from the statements, and the statements also
contain data that is not in the infobox. How does Wikidata actually
create a new item? What is the detailed process? If there are new or
updated articles on Wikipedia, how does Wikidata pick up these changes?
2. I think there is more valuable information in Wikipedia that is not
yet included in Wikidata, e.g. categories and infobox data. How can I
help enrich Wikidata? I know I may need to follow the Wikidata bot
process; any directions?
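For context on the first question: an item's statements and sitelinks live in Wikidata itself and are edited there, independently of any infobox; they can be inspected through the Wikibase API (action=wbgetentities). A minimal Python sketch, using a truncated sample of the response shape rather than a live request:

```python
import json

# Truncated sample of a wbgetentities response for Q42 (Douglas Adams).
# A live request would look like:
# https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q42&format=json
sample = json.loads("""
{
  "entities": {
    "Q42": {
      "labels": {"en": {"language": "en", "value": "Douglas Adams"}},
      "claims": {
        "P31": [
          {"mainsnak": {"snaktype": "value", "property": "P31",
                        "datavalue": {"type": "wikibase-entityid",
                                      "value": {"entity-type": "item",
                                                "numeric-id": 5}}}}
        ]
      },
      "sitelinks": {"enwiki": {"site": "enwiki", "title": "Douglas Adams"}}
    }
  }
}
""")

entity = sample["entities"]["Q42"]
# Statements ("claims") are keyed by property ID and maintained on
# Wikidata directly, so they need not mirror any Wikipedia infobox.
properties = sorted(entity["claims"])
# Sitelinks connect the item to pages on Wikipedia and sister projects.
enwiki_title = entity["sitelinks"]["enwiki"]["title"]
print(properties, enwiki_title)
```

This also shows why infoboxes and statements can diverge: the sitelink only ties the item to the article, while the statements are a separate, independently edited data set.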
Any suggestions or guidance are welcome!
BRs,
Sherlock
Hi,
I am working on a project for a catalog of MediaWiki extensions. The
catalog will be like a store where users can find the extension best
suited to their needs, and we were wondering whether we could use
Wikidata as a backend for the attributes.
If that is possible, I need some assistance in deciding on the list of
attributes that Wikidata can provide. Please suggest how to proceed.
A proposal page for the project can be found here:
http://www.mediawiki.org/wiki/MediaWiki/Gallery_of_extensions
Some of the attributes we have in mind for extensions are:
# Name
# Description
# Image
# Version
# Category
# Version supported
# Added on
...and more.
Regards,
Aditya.
https://www.mediawiki.org/wiki/User:Adi.iiita
While reading this [1] and observing how other websites load large pages
without freezing my computer, I was wondering why progressive loading
of items is not an option for Wikidata.
Is it too difficult to implement? Impossible with MediaWiki? Or just
undesirable for some other reason?
Thanks!
Micru
[1] docforge.com/wiki/Web_application/Progressive_loading
Hi all!
I had to move (via category.py) ~3200 categories on it.wiki. As you
know, moving a category actually means replacing and deleting it, so I
don't think bots will be able to update the relevant Wikidata entries.
Any suggestions on how to update them? Would anyone be able to update
them from a list of all those replacements?
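One possible route (a sketch, not a tested workflow): since the items still carry sitelinks pointing at the old category titles, each sitelink could be repointed with the wbsetsitelink API action, which can select the item by its current sitelink (site + title) without knowing the Q-id. The helper below only builds the request parameters; an actual run would need a POST with a logged-in session and a CSRF edit token (e.g. via pywikibot or a small API client). The category titles in the example are hypothetical:

```python
def wbsetsitelink_params(old_title, new_title, site="itwiki"):
    """Build parameters for a wbsetsitelink API call that repoints the
    sitelink currently recorded as old_title (on `site`) to new_title.

    The entity is selected by its current sitelink (site/title), so the
    item ID need not be known in advance. A real request must be a POST
    and must include a CSRF token obtained from the API.
    """
    return {
        "action": "wbsetsitelink",
        "site": site,            # selects the item via its current sitelink
        "title": old_title,      # ...which still points at the old title
        "linksite": site,
        "linktitle": new_title,  # the new category title
        "format": "json",
    }

# Hypothetical example: one entry from a list of (old, new) replacements
params = wbsetsitelink_params("Categoria:Vecchio nome",
                              "Categoria:Nuovo nome")
```

Driving this from the same replacement list that category.py produced would let one pass fix all ~3200 entries.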
Vito
Hey folks :)
We have just deployed new code. The most important changes are:
* We cut down the loading time of items significantly. Have a look at this
graph: https://commons.wikimedia.org/wiki/File:Wikidata_Item_Loading_Time.png
We're not done with performance improvements yet, but this is already
a huge step.
* Wikisource now has access to the data on Wikidata via the property
parser function and Lua.
* We have extended the Lua interface significantly to make it easier
to use Wikidata's data in Lua. Hoo will publish more info about this
soon. If you can't wait, the documentation is up at
https://www.mediawiki.org/wiki/Extension:WikibaseClient/Lua (To use it
on Wikipedia, you will have to wait until Thursday.)
* Bugfixes
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Hi all,
The new Wikidata Toolkit project started this week, so here is a
brief update. You are invited to participate and to comment. Early users
will enjoy special first-level support: if you have an interesting
use case for our project, we will help you get started and prioritize
your requirements.
Wikidata Toolkit is going to be a Java library to work with Wikidata
(and Wikibase in general). It will allow you to load, query, and analyse
the data, and to export it to a variety of formats.
Key facts:
* Homepage: https://www.mediawiki.org/wiki/Wikidata_Toolkit
* Code: https://github.com/Wikidata/Wikidata-Toolkit
* Planned milestones:
https://github.com/Wikidata/Wikidata-Toolkit/issues/milestones
* Project plan (high level):
https://meta.wikimedia.org/wiki/Grants:IEG/Wikidata_Toolkit
Our current team consists of four people: Fredo Erxleben, Julian Mendez,
Michael Günther, and myself. Development is supported by the Wikimedia
Foundation and by the German Research Foundation. The project is
initially planned for six months.
You are invited to contribute code, feature requests, and use cases. If
you have a concrete task that you would like to solve, please let us
know, and we will see what we can do. Our first release, 0.1, should be
able to create data exports in RDF (and maybe other formats) from the
internal Wikidata dump files. If you need Wikidata in other formats,
please let us know.
Our later releases will be able to load data and add query capabilities
that allow you to analyse the data. Again, your requirements are welcome.
Cheers,
Markus
--
Markus Kroetzsch
Faculty of Computer Science
Technische Universität Dresden
+49 351 463 38486
http://korrekt.org/