*keeping the cross-listing*
On 22.06.2014 14:10, Amir E. Aharoni wrote:
> What makes more sense at this point is to start adding a paragraph
> about Wikidata to the Tech News as they are rather than having a whole
> separate translatable newsletter. In the Tech News there's a whole
> paragraph every week about the VisualEditor, and there are short
> updates about CirrusSearch and MediaViewer almost every week. A weekly
> update about Wikidata can be there, too. If a whole paragraph is too
> long, a couple of the most important points would be a good beginning.
>
> It takes me less than ten minutes a week to translate each whole Tech
> News issue. Adding a few sentences about Wikidata should add no more
> than two minutes to the time that the translators already invest. I
> think that it's perfectly doable.
>
>
> --
> Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> http://aharoni.wordpress.com
> “We're living in pieces,
> I want to live in peace.” – T. Moore
>
>
> 2014-06-22 13:32 GMT+03:00 Bohdan Melnychuk <base-w@yandex.ru>:
>
> Hello,
>
>     I want the subject's mailing list to be translatable and delivered
>     the same way as the Tech News. I've asked about it on its
>     subscription page's talk page, where I ran into Lydia, who doesn't
>     believe that there would be any volunteers (except for me at the
>     moment) to translate it.
>
>     So if you would like to translate the newsletter once that becomes
>     possible, could you leave a note at
>     [[:m:Talk:Global_message_delivery/Targets/Wikidata#Multilingual
>     distribution]] about your willingness, so we can get Lydia
>     convinced :) Poking friends who might do it is welcome as well ;)
>
>     Also, it would be nice if Odder and/or Guillom could leave a note
>     about the technical side of such a delivery, the same way the
>     latter was so kind to do on the Education newsletter's talk page :)
>
> Yours sincerely,
> --Base
>
Hey folks :)
I just uploaded the first drafts for the new user interface design:
https://www.wikidata.org/wiki/Wikidata:UI_redesign_input It's not
finished yet, but I wanted to hear what you think about the direction
we're taking. Please leave any feedback you have on that page so we can
keep it all in one place.
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
I'm not sure who maintains Reasonator, but I figure they probably subscribe to
this list.
It would be really awesome if Reasonator didn't ignore the "of" (P642)
qualifier on "position held" (P39).
See: http://tools.wmflabs.org/reasonator/?&q=17152496
It just says that Randy McClement (Q17152496) was Mayor, even though the
Wikidata item records that he was Mayor of Frederick. I'm sure there are
lots of other instances where this issue comes up.
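For whoever picks this up: here is a minimal Python sketch of where the
data lives in the Wikidata API (the item and property IDs are the ones
from this mail; everything else is illustrative, not Reasonator's actual
code). The P642 qualifier sits right next to the main P39 snak on each
claim:

import requests

API = "https://www.wikidata.org/w/api.php"
entity = requests.get(API, params={
    "action": "wbgetentities",
    "ids": "Q17152496",
    "props": "claims",
    "format": "json",
}).json()["entities"]["Q17152496"]

for claim in entity["claims"].get("P39", []):
    # The main snak holds the position itself (e.g. "mayor").
    position = claim["mainsnak"]["datavalue"]["value"]["id"]
    # Qualifiers hang off the same claim; P642 ("of") says what the
    # position is a position of (e.g. Frederick).
    for qualifier in claim.get("qualifiers", {}).get("P642", []):
        print(position, "of", qualifier["datavalue"]["value"]["id"])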
Thank you,
Derric Atzrott
Computer Specialist
Alizee Pathology
Dear users, developers and all people interested in semantic wikis,
We are happy to announce SMWCon Fall 2013 - the 8th Semantic MediaWiki
Conference:
* Dates: October 28th to October 30th 2013 (Monday to Wednesday)
* Location: A&O Berlin Hauptbahnhof, Lehrter Str. 12, 10557 Berlin, Germany
* Conference wikipage: https://semantic-mediawiki.org/wiki/SMWCon_Fall_2013
* Participants: Everybody interested in semantic wikis, especially in
Semantic MediaWiki, e.g., users, developers, consultants, business
representatives, researchers.
SMWCon Fall 2013 will be supported by the Open Semantic Data
Association e. V. [1]. Our platinum sponsor will be WikiVote ltd,
Russia [2].
Following the success of recent SMWCons, we will have one tutorial day
and two conference days.
Participating in the conference: To help us plan, you can already
register informally on the conference wikipage, although a firm
registration will be needed later.
Contributing to the conference: If you want to present your work at the
conference, please go to the conference wikipage and add your talk
there. To create an attractive program, we will later ask you for
further information about your proposal.
Tutorials and presentations will be recorded (video and audio) and made
available after the conference.
==Among others, we encourage contributions on the following topics==
===Applications of semantic wikis===
* Semantic wikis for enterprise workflows and business intelligence
* Semantic wikis for corporate or personal knowledge management
* Exchange on business models with semantic wikis
* Lessons learned (best/worst practices) from using semantic wikis or
their extensions
* Semantic wikis in e-science, e-learning, e-health, e-government
* Semantic wikis for finding a common vocabulary among a group of people
* Semantic wikis for teaching students about the Semantic Web
* Offering incentives for users of semantic wikis
===Development of semantic wikis===
* Semantic wikis as knowledge base backends / data integration platforms
* Comparisons of semantic wiki concepts and technologies
* Community building, feature wishlists, roadmapping of Semantic MediaWiki
* Improving user experience in a semantic wiki
* Speeding up semantic wikis
* Integrations and interoperability of semantic wikis with other
applications and mashups
* Modeling of complex domains in semantic wikis, using rules, formulas etc.
* Access control and security aspects in semantic wikis
* Multilingual semantic wikis
If you have questions, you can contact me (Yury Katkov, Program Chair),
Benedikt Kämpgen (General Chair) or Karsten Hoffmeyer (Local Chair) by
e-mail (Cc).
Hope to see you in Berlin!
Yury Katkov, Program Chair
[1] http://www.opensemanticdata.org/
[2] http://wikivote.ru
FYI: this project claims to use Wikidata (among other resources) for
multilingual word-sense disambiguation. One of the first third-party
uses of Wikidata that I am aware of (but other pointers are welcome if
you have them). Wiktionary and OmegaWiki are also mentioned here.
Cheers,
Markus
-------- Original Message --------
Subject: Babelfy: Word Sense Disambiguation and Entity Linking Together!
Resent-Date: Mon, 16 Jun 2014 10:34:07 +0000
Resent-From: semantic-web@w3.org
Date: Mon, 16 Jun 2014 09:43:12 +0200
From: Andrea Moro <andrea8moro@gmail.com>
To: undisclosed-recipients:;
======================================================
Babelfy: Word Sense Disambiguation and Entity Linking together!
http://babelfy.org
======================================================
As an output of the "MultiJEDI" Starting Grant <http://multijedi.org>,
funded by the European Research Council and headed by Prof. Roberto
Navigli, the Linguistic Computing Laboratory <http://lcl.uniroma1.it> of
the Sapienza University of Rome is proud to announce the first release
of Babelfy <http://babelfy.org>.
Babelfy [1] is a joint, unified approach to Word Sense Disambiguation
and Entity Linking for arbitrary languages. The approach is based on a
loose identification of candidate meanings coupled with a densest
subgraph heuristic which selects high-coherence semantic
interpretations. Its performance on both disambiguation and entity
linking tasks is on a par with, or surpasses, that of task-specific
state-of-the-art systems.
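To give a rough intuition for the densest-subgraph idea mentioned above,
here is a toy Python sketch of a standard greedy-peeling heuristic. It
only illustrates the general technique on made-up data; it is not
Babelfy's actual algorithm or semantic network:

import networkx as nx

def densest_subgraph(graph):
    """Greedy peeling: repeatedly drop the lowest-degree node, keeping
    the subgraph with the best edges-per-node density seen so far."""
    g = graph.copy()
    best, best_density = set(g), g.number_of_edges() / len(g)
    while len(g) > 1:
        node, _ = min(g.degree, key=lambda pair: pair[1])
        g.remove_node(node)
        density = g.number_of_edges() / len(g)
        if density > best_density:
            best, best_density = set(g), density
    return best

# Nodes stand for candidate meanings of words in a sentence; edges stand
# for semantic relatedness. The data below is entirely made up.
g = nx.Graph([
    ("bank#finance", "money#currency"),
    ("bank#finance", "loan#credit"),
    ("money#currency", "loan#credit"),
    ("bank#river", "water#liquid"),
])
print(densest_subgraph(g))  # the high-coherence cluster of meanings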
Babelfy draws primarily on BabelNet (http://babelnet.org), a very large
encyclopedic dictionary and semantic network. BabelNet 2.5 covers 50
languages and provides both lexicographic and encyclopedic knowledge for
all the open-class parts of speech, thanks to the seamless integration
of WordNet, Wikipedia, Wiktionary, OmegaWiki, Wikidata and the Open
Multilingual WordNet.
Features in Babelfy:
* 50 languages covered!
* Available via easy-to-use Java APIs.
* Disambiguation and entity linking are performed using BabelNet, thereby
implicitly annotating according to several different inventories such as
WordNet, Wikipedia, OmegaWiki, etc.
Babelfy the world (be there and get a free BabelNet t-shirt!):
* Monday, June 23 - ACL 2014 (Baltimore, MD, USA) - TACL paper
presentation <http://www.transacl.org/wp-content/uploads/2014/05/54.pdf>
* Tuesday, August 19 - ECAI 2014 (Prague, Czech Republic) - Multilingual
Semantic Processing with BabelNet <http://www.ecai2014.org/tutorials/>
* Sunday, August 24 - COLING 2014 (Dublin, Ireland) - Multilingual Word
Sense Disambiguation and Entity Linking
<http://www.coling-2014.org/tutorials.php>
[1] Andrea Moro, Alessandro Raganato, Roberto Navigli. Entity Linking
meets Word Sense Disambiguation: a Unified Approach
<http://www.transacl.org/wp-content/uploads/2014/05/54.pdf>.
Transactions of the Association for Computational Linguistics (TACL), 2,
pp. 231-244 (2014).
Hi all,
We are now offering regular RDF dumps for the content of Wikidata:
http://tools.wmflabs.org/wikidata-exports/rdf/
RDF is the W3C's Resource Description Framework, which can be used to
exchange data on the Web. The Wikidata RDF exports consist of several
files that contain different parts and views of the data and which can
be used independently. Details on the available exports and the RDF
encoding used in each can be found in the paper "Introducing Wikidata to
the Linked Data Web" [1].
The available RDF exports can be found in the directory
http://tools.wmflabs.org/wikidata-exports/rdf/exports/. New exports are
generated regularly from current data dumps of Wikidata and will appear
in this directory shortly afterwards.
All dump files have been generated using Wikidata Toolkit [2]. There are
some important differences in comparison to earlier dumps:
* Data is split into several dump files for convenience. Pick whatever
you are most interested in.
* All dumps are generated using the OpenRDF library for Java (better
quality than ad hoc serialization; much slower too ;-).
* All dumps are in N3 format, the simplest RDF serialization format
there is.
* In addition to the faithful dumps, some simplified dumps are also
available (one statement = one triple; no qualifiers or references); see
the loading sketch after this list.
* Links to external datasets are added to the data for Wikidata
properties that point to datasets with RDF exports. That's the "Linked"
in "Linked Open Data".
Suggestions for improvements and contributions on GitHub are welcome.
Cheers,
Markus
[1] http://korrekt.org/page/Introducing_Wikidata_to_the_Linked_Data_Web
[2] https://www.mediawiki.org/wiki/Wikidata_Toolkit
--
Markus Kroetzsch
Faculty of Computer Science
Technische Universität Dresden
+49 351 463 38486
http://korrekt.org/