As a 15-year-old high school student who likes programming, I often develop
software to help me (and fellow students) study. In two days' time we have a
history exam with a ton of dates to learn, so I began thinking of a way to
represent them on a time axis via a web page.
And Wikidata came to mind. I guess it could be really useful, given the
property system (it could take into account single events as well as periods
such as wars). I'll try to do it (obviously I'm in the middle of term exams,
though it could be an incredible way to procrastinate...), but anyway, is
there any interest in this? I guess it could replace the current rather
awkward-looking axes.
From a technical viewpoint, would it be feasible to build it as an extension
to MediaWiki that displays HTML5? I have some experience with MediaWiki's
PHP, but not much.
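For what it's worth, the dates for such an axis could come straight from Wikidata's time values; here is a rough sketch of extracting a plottable year, assuming values shaped like Wikibase's time serialization (a leading sign, possible zero-padding, e.g. "+1939-09-01T00:00:00Z"):

```python
import re

def year_of(wikibase_time):
    """Extract the signed year from a Wikibase-style time string.

    Assumes values like "+1939-09-01T00:00:00Z" or "-0044-03-15T00:00:00Z"
    (leading sign, optional zero-padding); BCE years come back negative.
    """
    m = re.match(r'^([+-])0*(\d+)-', wikibase_time)
    if m is None:
        raise ValueError("unrecognized time value: %r" % wikibase_time)
    year = int(m.group(2))
    return -year if m.group(1) == '-' else year
```

Years extracted this way could then be mapped to x-coordinates on the HTML5 axis, with a war rendered as the span between its start and end values.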
Thanks in advance, and sorry for my non-native English,
Joan Creus.
During an IRC discussion, I was told that a page in namespace 0 like
Q219937<http://www.wikidata.org/wiki/Q219937> does
not necessarily have a one-to-one relationship with an entity like "Bonnie
and Clyde".
The wbgetentities<http://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q219937&format=j…>
API call gives this:
"entities": {
    "q219937": {
        "pageid": 214789,
        "ns": 0,
        "title": "Q219937",
        "lastrevid": 7969610,
        "modified": "2013-02-27T09:17:25Z",
        "id": "q219937",
        "type": "item",
        "aliases": {
            ......
How is it possible to have more than one entity in one wiki page
titled Q219937, if the entity ID is the same as the page title? In what
cases would that be used? Is that extra complexity needed?
In the case of Bonnie and Clyde (one wiki page in language A vs. two
wiki pages in language B), Wikidata can have three entities with links to
static redirects, apparently solving the need for one-to-many.
I am only considering item entities (ns:0), since query pages will
obviously have more than one entity associated with them.
Thanks!
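One practical wrinkle visible in the response quoted above: the entity ID and JSON key are lowercase ("q219937") while the page title is uppercase ("Q219937"), so code matching pages to entities should normalize case. A minimal sketch, using only the fields copied from that response:

```python
# Fields below are copied from the wbgetentities response quoted above.
response = {
    "entities": {
        "q219937": {
            "pageid": 214789,
            "ns": 0,
            "title": "Q219937",
            "id": "q219937",
            "type": "item",
        }
    }
}

def entity_for_title(response, title):
    """Find the entity whose page title matches, comparing
    case-insensitively (IDs are lowercase, titles uppercase)."""
    for entity in response["entities"].values():
        if entity["title"].lower() == title.lower():
            return entity
    return None
```

This treats page title and entity ID as the same identifier up to case, which is exactly the one-to-one assumption the question asks about.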
ESWC 2013: Call for Participation
Montpellier, May 26th - 30th
http://2013.eswc-conferences.org
ESWC 2013 is the major Europe-based conference on Semantic Technologies
and the Semantic Web. It is the ideal venue for discussing the latest
scientific results and technology innovations and is equally relevant to
researchers, academics, developers and practitioners in the field of
Semantic Technologies. The motto of this year's edition is "Semantics
and Big Data".
Registration is open; early registration rates are available until March
17th!
http://2013.eswc-conferences.org/registration
We hope to see you at ESWC 2013 in Montpellier!
=== Program ===
== Keynote Speakers ==
ESWC 2013 will feature the following three keynote speakers:
- Enrico Motta (Knowledge Media Institute, Open University)
- David Karger (MIT)
- Manfred Hauswirth (Digital Enterprise Research Institute, National
University of Ireland, Galway)
== Accepted Papers ==
A list of accepted papers can be found here:
http://2013.eswc-conferences.org/program/accepted-papers
== Workshops and Tutorials ==
The main conference program will be complemented by 13 workshops and 7
tutorials:
http://2013.eswc-conferences.org/program/workshops
http://2013.eswc-conferences.org/program/tutorials
== Demonstration and poster presentations ==
If you want to present a demo on your work or a poster at the
conference, the call for posters and demonstrations is still open until
March 7th:
http://2013.eswc-conferences.org/important-dates/call-demos
http://2013.eswc-conferences.org/important-dates/call-posters
== Semantic Mashup Challenge ==
The AI Mashup Challenge accepts and awards mashups that use Semantic Web
technology, including but not restricted to machine learning and data
mining, machine vision, natural language processing, reasoning, and
ontologies in the context of the Semantic Web. Mashups will be presented
in a special session during the ESWC 2013 conference. The call is still
open: http://2013.eswc-conferences.org/program/co-located-events
=== Registration ===
Early registration rates are available until March 17th
http://2013.eswc-conferences.org/registration
=== Student Travel Support ===
Some travel support is available for undergraduate and graduate
students; please contact the general chair to apply for this:
http://2013.eswc-conferences.org/registration
=== Organizers ===
General Chair : Philipp Cimiano (Bielefeld University, DE)
Program Chairs : Valentina Presutti (STLab, ISTC-CNR, IT) & Oscar Corcho
(UPM, ES)
Local Chairs : Clement Jonquet & François Scharffe (LIRMM, Université
Montpellier 2, FR)
ESWC 2013 is brought to you by STI International in collaboration with
University Montpellier
http://news.ycombinator.com/item?id=5328472
For those unfamiliar, this is a mostly U.S.-dominated site for hackers,
startup people, etc. so the discussion doesn't tend to shy away from the
technical. I'm sure a few Wikidata devotees are Hacker Newsies themselves.
Steven
Hi mediawiki developers,
We (Google) are trying to keep our internal mirror of wikidata.org
up to date in real time, so that our indexing pipeline can get the latest
interlanguage information in real time.
I noticed that wikidata.org is also powered by standard MediaWiki software,
where the standard query API works, e.g. a
revision query:
http://www.wikidata.org/w/api.php?action=query&prop=revisions&format=xml&rv…
and
recentchanges:
http://wikidata.org/w/api.php?action=query&list=recentchanges&format=xml&rc…
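The full query strings above are elided; for reference, a polling URL of that general shape can be assembled like this (the specific rc* parameters here are illustrative, not the exact truncated query):

```python
from urllib.parse import urlencode

# Illustrative parameters for polling recent changes; the exact rc*
# options in the truncated URL above are not known.
params = {
    "action": "query",
    "list": "recentchanges",
    "format": "xml",
    "rclimit": 50,
}
url = "http://www.wikidata.org/w/api.php?" + urlencode(params)
```

A real-time mirror would repeat this call, remembering the timestamp or rcid of the last change seen and continuing from there.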
My questions are:
- Are the APIs above ("action=query&prop=revisions" and
"action=query&list=recentchanges") the supported way to retrieve
wikidata.org content in real time?
- Is there any documentation of the JSON format in the response? It looks
to me like "links" are interwiki/interlanguage links (or "sitelinks" in
wikidata.org's terminology), but I would feel more comfortable if I saw
some official documentation of this.
- There are some IDs like "dewiki", "enwiki", etc., which I guess can be
interpreted as the corresponding languages "de" and "en" respectively. But
is there a reliable map from these *wiki IDs to language codes? Some even
use a three-letter prefix, e.g. gotwiki, xmfwiki.
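Pending an authoritative answer, one heuristic is to strip the "wiki" suffix from the database name; this is a guess, and it breaks for non-language wikis such as "commonswiki", so treat it as a sketch only (the sitematrix API module is a more reliable source for the real mapping):

```python
# Known non-language database names; incomplete, for illustration only.
NON_LANGUAGE_WIKIS = {"commonswiki", "specieswiki", "metawiki", "wikidatawiki"}

def dbname_to_lang(dbname):
    """Guess a language code from a *wiki database name by stripping the
    "wiki" suffix. Returns None when the name is known not to denote a
    language edition. Underscores become hyphens, since database names
    use "_" where language codes use "-" (e.g. zh_yuewiki -> zh-yue)."""
    if dbname in NON_LANGUAGE_WIKIS or not dbname.endswith("wiki"):
        return None
    return dbname[: -len("wiki")].replace("_", "-")
```

Under this heuristic, "dewiki" maps to "de" and the three-letter cases fall out for free ("gotwiki" to "got", "xmfwiki" to "xmf").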
Thanks
--
Jiang BIAN
This email may be confidential or privileged. If you received this
communication by mistake, please don't forward it to anyone else, please
erase all copies and attachments, and please let me know that it went to
the wrong person. Thanks.
Thanks for all the replies. The new dump seems to have been generated;
however, it failed, see http://dumps.wikimedia.org/wikidatawiki/20130228/.
I'm wondering how people handle such a failure. Can it be fixed soon, or do
we have to wait another 11-12 days for the next dump?
Thanks.
Thanks for the feedback. I think we will push forward and work directly on
Wikidata. This is a conversion from database(s) -> Wikipedia to
database(s) -> Wikidata -> Wikipedia.
All of our code is, and will remain, open source. We'd be happy to share
and to build on what other bot developers are doing. Is there a list of
Wikidata bot code repos somewhere?
I am considering the task of converting the templates in the gene
articles on Wikipedia (http://en.wikipedia.org/wiki/Portal:Gene_Wiki) to
use/create Wikidata assertions. This involves an extensive update of the
template structure, as well as of the code for the bot that keeps them in
sync with external public databases (https://bitbucket.org/sulab/pygenewiki).
More specifically I'm thinking about working with a Google Summer of Code
student on this project.
Given a time frame of now through August, would it make sense for us to
pursue this objective directly in the context of Wikidata (through the
public API)? Or would it be better for us to install our own instance of
the Wikibase software (kept in sync with code updates) and develop the new
Gene Wiki bot code locally, with the aim of switching to the public API
later? Or is it too early to consider this project?
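Either way, writes would eventually go through the Wikibase API's wbcreateclaim module; a sketch of what such a request body might look like (the item Q4115189, property P999, and target item Q42 here are all hypothetical placeholders, and a real edit must also be sent as a POST with an edit token obtained from the API):

```python
import json

# Hypothetical placeholders: P999 standing in for whatever property a
# Gene Wiki template field maps to, Q42 as the claim's target item.
payload = {
    "action": "wbcreateclaim",
    "entity": "Q4115189",  # item being edited (hypothetical)
    "property": "P999",    # hypothetical property ID
    "snaktype": "value",
    # Wikibase expects the value as a JSON-encoded string.
    "value": json.dumps({"entity-type": "item", "numeric-id": 42}),
    "format": "json",
}
```

Building the payload is identical whether it is aimed at wikidata.org or at a private Wikibase install, which is one argument for developing locally first and repointing the bot later.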
I want to get involved and support Wikidata with this important data, but
I'm hesitant to ramp up development (especially with a student) in a
moving-target situation.
Any thoughts?
thanks!
-Ben
Hi!
Argh... Today the update of the demo servers failed on test-repo due to a
Puppet problem. The client thus doesn't work either. Sorry for that -
I'm trying to figure out why this happened in order to get the demo servers
back, and I'll let you know. It might be that this won't happen until
tomorrow (as in European-timezone tomorrow).
Best,
Silke
--
Silke Meyer
System Administrator and Project Assistant, Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. (030) 219 158 260
http://wikimedia.de
Wikimedia Deutschland - Society for the Promotion of Free Knowledge e.V.
Registered in the register of associations of the district court
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by the
tax office for corporations I Berlin, tax number 27/681/51985.