Dear users, developers and all people interested in semantic wikis,
We are happy to announce SMWCon Fall 2013 - the 8th Semantic MediaWiki
Conference:
* Dates: October 28th to October 30th 2013 (Monday to Wednesday)
* Location: A&O Berlin Hauptbahnhof, Lehrter Str. 12, 10557 Berlin, Germany
* Conference wikipage: https://semantic-mediawiki.org/wiki/SMWCon_Fall_2013
* Participants: Everybody interested in semantic wikis, especially in
Semantic MediaWiki, e.g., users, developers, consultants, business
representatives, researchers.
SMWCon Fall 2013 will be supported by the Open Semantic Data
Association e. V. [1]. Our platinum sponsor will be WikiVote ltd,
Russia [2].
Following the success of recent SMWCons, we will have one tutorial day
and two conference days.
Participating in the conference: To help us plan, you can already
register informally on the wikipage; a firm registration will be
required later.
Contributing to the conference: If you want to present your work at
the conference, please go to the conference wikipage and add your talk
there. To help us create an attractive program for the conference, we
will later ask you for further information about your proposals.
Tutorials and presentations will be recorded (video and audio) and
made available after the conference.
==We encourage contributions on the following topics, among others==
===Applications of semantic wikis===
* Semantic wikis for enterprise workflows and business intelligence
* Semantic wikis for corporate or personal knowledge management
* Exchange on business models with semantic wikis
* Lessons learned (best/worst practices) from using semantic wikis or
their extensions
* Semantic wikis in e-science, e-learning, e-health, e-government
* Semantic wikis for finding a common vocabulary among a group of people
* Semantic wikis for teaching students about the Semantic Web
* Offering incentives for users of semantic wikis
===Development of semantic wikis===
* Semantic wikis as knowledge base backends / data integration platforms
* Comparisons of semantic wiki concepts and technologies
* Community building, feature wishlists, roadmapping of Semantic MediaWiki
* Improving user experience in a semantic wiki
* Speeding up semantic wikis
* Integrations and interoperability of semantic wikis with other
applications and mashups
* Modeling of complex domains in semantic wikis, using rules, formulas etc.
* Access control and security aspects in semantic wikis
* Multilingual semantic wikis
If you have questions, you can contact me (Yury Katkov, Program Chair),
Benedikt Kämpgen (General Chair) or Karsten Hoffmeyer (Local Chair)
by e-mail (Cc).
Hope to see you in Berlin!
Yury Katkov, Program Chair
[1] http://www.opensemanticdata.org/
[2] http://wikivote.ru
Hoi,
I have been blogging a lot the last two days with DBpedia in mind. My
understanding is that at DBpedia a lot of effort went into making something
of a cohesive model of properties. Now that the "main type GND" is about to
be deleted, it makes sense to adopt much of the work that has been done at
DBpedia.
The benefits are:
- we will get access to academically reviewed data structures
- we do not have to wait and ponder; we can get into the business of
enriching the data content of DBpedia
- we can easily compare the data in DBpedia and Wikidata
- more importantly, DBpedia has spent effort on connecting to other
resources
Yes, we can import data from DBpedia and we can import data from Wikipedia.
Actually, we can do both. The one thing to consider is that we need
data before we can curate it. With more data available, it becomes
more relevant to invest time in tools that compare data. We can start
doing this now and, over time, this will become more relevant. But right
now we need more properties and the associated data.
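A data-comparison tool of the kind suggested above could start very simply: diff the property/value pairs that two sources hold for the same entity. The following is a minimal sketch; the sample records for Berlin are invented for illustration, and a real tool would load them from the DBpedia and Wikidata APIs or dumps.

```python
# Sketch of a comparison tool: given two property/value maps for the same
# entity, report where the sources agree, disagree, or are missing data.

def compare(dbpedia, wikidata):
    """Return (agree, disagree, only_dbpedia, only_wikidata) dicts."""
    agree, disagree = {}, {}
    for prop in dbpedia.keys() & wikidata.keys():
        if dbpedia[prop] == wikidata[prop]:
            agree[prop] = dbpedia[prop]
        else:
            disagree[prop] = (dbpedia[prop], wikidata[prop])
    only_dbpedia = {p: v for p, v in dbpedia.items() if p not in wikidata}
    only_wikidata = {p: v for p, v in wikidata.items() if p not in dbpedia}
    return agree, disagree, only_dbpedia, only_wikidata

# Invented sample records for illustration only.
dbpedia_berlin = {"country": "Germany", "population": 3375222}
wikidata_berlin = {"country": "Germany", "population": 3500000,
                   "mayor": "Klaus Wowereit"}
agree, disagree, only_db, only_wd = compare(dbpedia_berlin, wikidata_berlin)
```

Even this trivial diff surfaces the cases that matter for curation: conflicting values to reconcile and properties one project has that the other lacks.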
What do you think?
Thanks,
GerardM
http://ultimategerard.blogspot.com
Heya folks :)
Here's this week's summary of what happened around Wikidata for you:
https://meta.wikimedia.org/wiki/Wikidata/Status_updates/2013_08_30
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Technical Projects
Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
Hello All,
I was looking at the Wikidata dump, specifically this one:
wikidatawiki-20130818-pages-meta-hist-incr.xml.bz2
Then I came across this statement:
{
  "m": [
    "value",
    158,
    "string",
    "Great Seal of the United States (obverse).svg"
  ],
  "q": [],
  "g": "q30$D680D948-C2C1-493F-88AC-E4E2FB3764D2",
  "rank": 1,
  "refs": []
},
The property P158 is the seal image:
http://www.wikidata.org/wiki/Property:P158
Shouldn't its datatype be "Commons media file" rather than "string"?
I'm not sure whether it is always represented this way and I am
misreading it, or whether the statement data is inconsistent with the
property datatypes.
Another question: should I generally rely on the datatypes written in
the JSON dumps, or should I build an index of Wikidata properties and
their datatypes to avoid such situations?
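For what it's worth, the index approach could be a small lookup table consulted before trusting the low-level value type in a snak. This is only a sketch: the index contents below are hand-written assumptions for the example, whereas a real index would be built once from the property pages or an API query.

```python
# Hypothetical sketch: resolve the declared datatype of a property instead
# of relying on the low-level value type ("string") stored in the dump.

# Index of property ID -> declared datatype. In practice this would be
# built from the live wiki; these entries are illustrative assumptions.
PROPERTY_DATATYPES = {
    "P158": "commonsMedia",   # seal image
    "P18": "commonsMedia",    # image
    "P31": "wikibase-item",   # instance of
}

def expected_datatype(snak):
    """Return the declared datatype for the property used in a snak.

    Dumps store Commons media values with the low-level value type
    "string"; the property index says how to interpret the value.
    """
    prop_id = "P%d" % snak[1]   # snak looks like ["value", 158, "string", ...]
    return PROPERTY_DATATYPES.get(prop_id, "unknown")

snak = ["value", 158, "string", "Great Seal of the United States (obverse).svg"]
print(expected_datatype(snak))  # commonsMedia, despite "string" in the dump
```

With a table like this, "string" in the dump stops being ambiguous: it is just the serialization, and the index supplies the semantics.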
thanks
Regards
-------------------------------------------------
Hady El-Sahar
Research Assistant
Center of Informatics Sciences | Nile University<http://nileuniversity.edu.eg/>
Heya folks :)
Wikivoyage now has access to the data on Wikidata - meaning they got
phase 2. With this, the first sister project is fully supported \o/ The
next sister projects will follow. More information on this soon.
We've also just updated the software here. This brings a number of bug
fixes and other improvements. The ones that are probably most
important for you:
* The copyright warning no longer pops up again when you change your language
* There is now a new special page listing items/properties without a
description in a given language. Thanks to Bene* for the patch.
* The message that pops up when you want to add a link to an item
that is already in use on another item has been improved. Thanks to
Nemo bis for the patch.
* Broken links to set sitelinks for Wikivoyage in the non-JavaScript
version have been fixed. ([[bugzilla:51914]], [[bugzilla:52095]])
* The automatic comments for edits have been improved. They are more
detailed now.
* API: You can now provide custom edit summaries for edits via the API.
* API: You can undo a revision via the API.
* API: Bot edits via the setclaim API module are now marked as such
([[bugzilla:50933]]).
* API: Precision and globe parameters are now required for geocoordinate data.
* Starting in a few days, Wikidata edits that show up in recent
changes and watchlists on the client (Wikipedia/Wikivoyage) will be
marked with a "D". Thanks to umherirrender for the patch.
Unfortunately we were not able to put the URL datatype on the test
system in time for this deployment. It didn't get enough testing, so we
couldn't deploy it today with a good conscience. We know you're waiting
for this, but it's better to give it a bit more testing and roll it out
in two weeks with the next deployment. The URL datatype is now live
on test.wikidata.org for you to try. Please give it a try and
report any bugs you encounter.
Please let me know if there are any issues.
Oh, and one more thing: Abraham, Denny and I sat down for an evening
trying to capture what Wikidata is about in a video. Hope you like
it :) http://www.wikidata.org/wiki/File:Wikidata%27s_World.webm
Cheers
Lydia
Hello All,
As part of a GSoC 2013 project integrating Wikidata into DBpedia, we
are in the phase of mapping Wikidata properties to DBpedia ones.
This is important because, once all mappings are done, we can extract a
complete DBpedia facts dump with DBpedia properties from the Wikidata one.
So, we need a little help from the community with the mapping of Wikidata
properties; there are around 544 Wikidata properties in use that need
mapping.
You can check the most important ones, as well as properties that have
not been mapped yet, in this Google
spreadsheet<https://docs.google.com/spreadsheet/ccc?key=0AiphXcnJyGG-dHFHY2JaMUlaOUFBaD…>.
This wikipage<https://github.com/hadyelsahar/extraction-framework/wiki/Mapping-WikData-to…>
has detailed steps for adding the mappings.
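The extraction step this enables can be pictured as a simple rewrite of property IDs. The mapping entries below are assumptions chosen for the example (they are not taken from the project spreadsheet), but P19 and P569 are real Wikidata properties and `dbo:` is the DBpedia ontology prefix.

```python
# Illustrative sketch of the mapping step: translate Wikidata property IDs
# in extracted triples to DBpedia ontology properties, and collect the
# properties that still lack a mapping.

# Hand-maintained mapping table (example entries, not the real spreadsheet).
WD_TO_DBPEDIA = {
    "P19": "dbo:birthPlace",
    "P569": "dbo:birthDate",
}

def map_triples(triples):
    """Rewrite (subject, wikidata-property, object) triples.

    Returns the mapped triples plus the set of properties that were
    encountered but have no DBpedia mapping yet.
    """
    mapped, unmapped = [], set()
    for s, p, o in triples:
        if p in WD_TO_DBPEDIA:
            mapped.append((s, WD_TO_DBPEDIA[p], o))
        else:
            unmapped.add(p)
    return mapped, unmapped

triples = [("Q42", "P19", "Q350"), ("Q42", "P570", "2001-05-11")]
mapped, unmapped = map_triples(triples)
```

The `unmapped` set is exactly what the community help is for: every property it reports is one that still needs a row in the mapping spreadsheet.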
thanks
Regards
Hello All ,
There's a script error on the Wikidata:List of
properties<http://www.wikidata.org/wiki/Wikidata:List_of_properties/all> page;
it prevents seeing the property names and descriptions.
I checked Bugzilla and did not find it as a known issue (as far as
I searched).
thanks
Regards
Heya folks :)
Denny and I will be doing another office hour for all things Wikidata
after Wikimania. Everyone is welcome to ask questions about Wikidata.
We'll be doing this on IRC in #wikimedia-office, starting with a quick
update on the current state of Wikidata and its development. It'll be
on the 26th of August at 16:00 UTC. For your timezone, see
http://www.timeanddate.com/worldclock/fixedtime.html?msg=Wikidata+office+ho…
Hope to see many of you there.
Cheers
Lydia
I'd like to hear from the people on this list about a proposal to create a dedicated namespace to host open (tabular) data and make these datasets persistently identifiable, version-controlled and easily embeddable into other wikis.
While this use case is currently not within the scope of Wikidata (and could potentially live on other Wikimedia wikis, like Meta or Commons), I'd appreciate input from the Wikidata community on this draft:
https://meta.wikimedia.org/wiki/DataNamespace
Some interesting discussion on the talk page:
http://meta.wikimedia.org/wiki/Talk:DataNamespace
Dario