Before porting World University and School
(http://worlduniversity.wikia.com/wiki/World_University - like Wikipedia
with MIT OCW) to Wikidata, I'd like to learn of wikis which work on
obscure handheld browsers, particularly on older internet phones. For
example, Wikipedia doesn't work on my Palm Treo mobile phone running
Palm OS 4.0. For WUaS to be helpful as a wiki in all 3,000-8,000
languages, easy usage on all or many older mobile phone browsers will
be key.
Does anyone know of a list of diverse wikis I can try on my old Palm OS
Treo internet phone (wikis at least as readable as New York Times
articles, which aren't wiki, are on my Palm OS), and which might in turn
work with Wikidata, Wikibase, and Semantic MediaWiki?
WUaS is planning for ALL ~200 countries and 3,000-8,000 languages, as a wiki.
All the best,
Founder & President
World University and School
(like Wikipedia with MIT Open Course Ware)
P.O. Box 442,
(86 Ridgecrest Road),
Canyon, CA 94516
415 480 4577
Google+ main WUaS page:
Please contribute, and invite friends to contribute, tax-deductibly, via
PayPal and credit card:
World University and School is a 501(c)(3) tax-exempt educational
organization.
World University and School is sending you this because of your interest in
free, online, higher education. If you don't want to receive these, please
reply with 'remove' in the subject line. Thank you.
Date: Fri, 25 May 2012 20:28:58 +0200
From: emijrp <emijrp(a)gmail.com>
To: "Discussion list for the Wikidata project."
Subject: Re: [Wikidata-l] Who to talk to about integrating WorldCat
2012/5/25 Lydia Pintscher <lydia.pintscher(a)wikimedia.de>
>> A decision hasn't been made. We're looking at cc-0 for the beginning
>> for the data but the community might decide to change this later.
>I think that CC-0 is a good choice.
The answer is that WorldCat is published under the Open Data Commons
Attribution license (ODC-BY). For more information see
http://opendatacommons.org/category/odc-by/ .
I know CC0 is a tempting choice, but I hope a choice is made that can
accommodate willing partners.
Emilio J. Rodríguez-Posada. E-mail: emijrp AT gmail DOT com
Pre-doctoral student at the University of Cádiz (Spain)
Projects: AVBOT <http://code.google.com/p/avbot/> |
WikiEvidens <http://code.google.com/p/wikievidens/> |
WikiTeam <http://code.google.com/p/wikiteam/>
Personal website: https://sites.google.com/site/emijrp/
Hello Wikidata Wizards,
Phoebe Ayers from the Board recommended I talk to you. My name is Max
Klein and I am the Wikipedian in Residence for OCLC. OCLC runs
WorldCat.org, the world's largest holder of library data, with 264
million bibliographic records about books, journals, and other library
items. We would really like to partner with you as Wikidata is being
built, incorporating our data into your project.
What we can offer:
* WorldCat.org metadata http://www.worldcat.org/ .
o Typically, for any work we have most of the following: title,
authors, publisher, formats, summaries, editions, subjects, languages,
intended audience, all associated ISBNs, length, and abstract.
* APIs to this data http://oclc.org/developer/
o And some other cool APIs, like xISBN, which returns all the ISBNs of
all the editions of a book given any single one (see the sketch after
this list).
* Library finding tools
o When viewing a record on our site, we show you the closest library
that holds that work, and links to reserve it for pick-up.
* The Virtual International Authority File (VIAF)
http://viaf.org/, which is an authoritative disambiguation file
o That means that we have certified data for disambiguating authors
* WorldCat Identities, an analytics site
o For each author it gives you metadata and analytics: alternative
names, significant dates, publication timelines, genres, roles, related
authors, and tag clouds of associated subjects.
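As a concrete illustration of the xISBN lookup mentioned above, here is
a minimal Python sketch. The endpoint URL and response shape are my
assumptions based on OCLC's public developer pages, not details given
in this message; check http://oclc.org/developer/ for the actual
contract.

    import json
    import urllib.request

    # Minimal sketch of an xISBN lookup: given one ISBN, return the
    # ISBNs of all known editions of the same work. The endpoint and
    # JSON layout here are assumptions, not part of this message.
    def editions_for_isbn(isbn):
        url = ("http://xisbn.worldcat.org/webservices/xid/isbn/%s"
               "?method=getEditions&format=json" % isbn)
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        # Assumed response: {"stat": "ok", "list": [{"isbn": [...]}, ...]}
        return [i for entry in data.get("list", [])
                for i in entry.get("isbn", [])]

    print(editions_for_isbn("0596002815"))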
What's in it for us:
* We are a not-for-profit member cooperative. Our mission is
"Connecting people to knowledge through library cooperation."
* Since I work at the research group, for now this is just exploratory.
o If at some point this goes live - and you want to - we'd like to
integrate the "find it at a library near me" feature, which means
click-throughs for us.
There are a lot of possibilities, and I'd like to hear your input. These
are the first few that I've come up with.
* Making infoboxes for each book or author that contain all of this
metadata
o Ready to incorporate into all language projects.
* Using authority files to disambiguate works or link them to their
authors
o Solving DABs (disambiguation pages)
* Using our analytics (e.g. author timelines) as Wikidata data
types to transclude.
o Curating articles with easy-to-include dynamic analytics
* Populating or creating works/author pages with their
algorithmically-derived history and details.
o Extremely experimental semantic work.
I'm roaring and ready to get this collaboration going. I know Wikidata
is at an early stage, and we are willing to accommodate you.
Send me any feedback or ideas,
Wikipedian in Residence
I made some effort to draft a Wikidata article on the Hungarian
Wikipedia. Although it is not quite up to date, it is a useful overview
for the Hungarian-language community. As far as I remember, this was
the first article on Wikidata in a national language other than German,
preceding the current system. I created a soft redirect to it on Meta,
which was eradicated by Fuzzybot, and now I cannot edit it at all. I
don't want to translate that whole text week after week again, and
seemingly nobody else does it, but could we get back the link to
huwiki, please? I think a long English text that is identical to the
English version, is very up to date, and meanwhile states that it *is*
the Hungarian version is much less useful for Hungarian speakers than a
link to the Hungarian article that backlinks to the original for those
who want to read the current version in English. This solution is very
uniform and easy to handle by bot, but it is very aggressive: whoever
made it had no regard for previous content, and I am really
disappointed to face this. So what is the way to have at least a
sentence at the top of this nice text that leads to the useful version
and will not be ruined by the bot again and again?
I have created a first preliminary draft of how data items from the Wikidata
repository may be accessed and rendered on the client wiki, e.g. to make infoboxes.
It would be great if you could have a look and let us know about any
unclear points, omissions, or other flaws - and of course about your
ideas of how to do this.
Getting this right is an important part of implementing phase 2 of the Wikidata
project, and so I feel it's important to start drafting and discussing early.
Having a powerful but not overly complex way to create infoboxes etc.
from Wikidata items is very important for the acceptance of Wikidata on
the client wikis, I believe.
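To make the discussion concrete, here is a rough Python sketch of the
kind of repository access a client wiki would need for an infobox:
fetch an item and pull out a label. The endpoint, the API module name,
and the response layout are all my assumptions for illustration;
pinning down the real interface is exactly what the draft is about.

    import json
    import urllib.parse
    import urllib.request

    # Hypothetical demo endpoint; the real repository URL and API
    # module are assumptions for illustration only.
    API = "https://wikidata-test-repo.wikimedia.de/w/api.php"

    def item_label(entity_id, lang="en"):
        query = urllib.parse.urlencode({
            "action": "wbgetentities",  # assumed module name
            "ids": entity_id,
            "format": "json",
        })
        with urllib.request.urlopen(API + "?" + query) as resp:
            data = json.load(resp)
        # An infobox would render the label plus selected property values.
        return data["entities"][entity_id]["labels"][lang]["value"]

    print(item_label("Q1"))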
Daniel Kinzler, Software Architect
Wikimedia Deutschland e.V. | Eisenacher Straße 2 | 10777 Berlin
http://wikimedia.de | Tel. (030) 219 158 260
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as a non-profit
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
I just wanted to let you know that the next Wikidata office hours will
be on Tuesday and Wednesday next week. Denny and I will be around on
IRC in #wikimedia-wikidata to answer any questions you might have and
to discuss. I assume there will be a few more questions than usual now
that we have a demo system. Logs will be published afterwards.
English: May 29 at 16:30 UTC
German: May 30 at 16:30 UTC
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata
Wikimedia Deutschland e.V.
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as a non-profit
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
I'm about to head off to Berlin, but I've been quietly toiling away on
this project for a while (technically since 2005) and I figured it's
time to start making a little noise.
The project is the JsonData extension for MediaWiki, which allows for
validated editing of JSON files using a form inside the MediaWiki
editor. The form code has been around a long time (since 2005...gah,
it's almost as old as a second grader!). When I wrote it, I always
envisioned using it on a wiki, and never finished the job until
recently. I also have never had a strong desire to maintain a schema
format of my own, so over this past weekend, I converted it to use a
subset of Kris Zyp's JSON Schema draft-03.
Validation is done both server side (PHP) and client side (JavaScript).
The server-side and client-side libraries are three-clause BSD. The
rest of the MediaWiki extension is GPLv2.
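For a feel of what draft-03 validation involves, here is a small Python
sketch using the third-party jsonschema library, whose Draft3Validator
implements Kris Zyp's draft-03. The schema and documents are made up
for illustration; the extension itself does this in PHP and JavaScript.

    from jsonschema import Draft3Validator

    # A made-up draft-03 schema; note that in draft-03 "required" is a
    # boolean on the property itself, unlike later drafts.
    schema = {
        "type": "object",
        "properties": {
            "title": {"type": "string", "required": True},
            "year": {"type": "integer", "minimum": 1450},
        },
    }

    validator = Draft3Validator(schema)

    good = {"title": "Example", "year": 2012}
    bad = {"year": "not a number"}

    print(list(validator.iter_errors(good)))  # empty list: valid
    for error in validator.iter_errors(bad):
        print(error.message)  # missing title; year is not an integer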
The UI is still quite 2005-ish, unfortunately. There are other
draft-03-compatible editors out there (one recently announced, built on
jquery-ui) which I may take a crack at.
More information about the project can be found here:
I've got a test install there, so feel free to play around.
I haven't made a big point of getting this checked into Gerrit, but I
plan on doing that.
This is a personal project (during my own time) and not at all
something I'm doing on behalf of the WMF. It's also not (yet?)
affiliated with the Wikidata project, and I'm happy to change names to
avoid confusion.
p.s. the security question answer, should you hit it, is "wikitech-l".
It's amazing how quickly wikis start attracting spam.