Heya folks,
We've just installed on the demo system all the extensions that are in
use on enwp, so that we notice problems with any of them early. The
other Wikipedias of course have other extensions installed that might
also be relevant. Please let me know if you know of any extensions
that we really should include in our testing and that are not yet
installed.
http://wikidata-test-client.wikimedia.de/wiki/Special:Version
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata
Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as a non-profit
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Yes, although my point is that pywikipedia will probably not be made compatible with Wikidata anytime soon.
----- Original Message -----
From: Samat
Sent: 08/13/12 09:13 PM
To: Discussion list for the Wikidata project.
Subject: Re: [Wikidata-l] thoughts about a plan for enabling interlanguage linking
On Mon, Aug 13, 2012 at 5:48 PM, Snaevar <snaevar-wiki(a)gmx.com> wrote:
...
Bináris and Andre Engels have volunteered to make pywikipedia compatible with Wikidata. Bináris said on the 24th of July, in response to Lydia's request to translate the Wikibase extensions into Hungarian, that he is busy. Since no progress has been made on that on translatewiki, I am going to assume that he is still busy.
Do you mean these [1] messages?
I think we can find volunteers for the translation into Hungarian.
[1]: https://translatewiki.net/w/i.php?title=Special%3ATranslate&taction=transla…
Samat
Hallo,
Preamble 1: This email probably falls under this FAQ question:
Q: How will Wikidata change the way articles are edited?
A: That’s part of what we have to figure out during the development,
together with the community.
Preamble 2: It's possible that there's an answer to this issue
already, but I couldn't find it.
A popular example of using Wikidata is that it makes maintaining
articles about cities easier: when a city's mayor changes, the
information only needs to be updated once.
The problem is that the mayor's name can be written differently in
other languages. I didn't actually try running it myself, but as far
as I understand, Wikidata supports translating names. But what happens
when the mayor changes? It is likely that the name will be updated in
the language spoken in that city. At that point articles in Wikipedia
in other languages will probably show the name in the language of the
city, which may be unreadable.
Let's take Haifa as an example. Its previous mayor was:
he: עמרם מצנע
en: Amram Mitzna
ru: Амрам Мицна
hr: Amram Micna
etc.
Now it changes to:
he: יונה יהב
And then suddenly all the articles about Haifa in all the languages
will show the mayor's name as "יונה יהב", which most people won't be
able to read. Maybe the Wikidata community will develop some kind of a
policy that will discourage adding names in local scripts without any
translation to a more common script. Maybe at some point software
should even show a warning if somebody tries to do it.
The scenario can be even simpler: Somebody will vandalize Wikidata and
change the mayor's name to some nonsense.
The most practical way to solve this is to show a change to a piece of
data that affects a Wikipedia article in the watchlist, as if it were
a change to the article itself. Is it possible? If not, is it planned?
It's a problem with Commons, too: an image that is used in an article
can change on Commons and the change won't appear in the watchlist.
But I expect that this will happen a lot more often with Wikidata
items and that the changes will be a lot more subtle and harder to
notice: it's easy to notice that an image changed, but it's harder to
notice a change in a number or in a mayor's name.
Another question is: What is the fallback mechanism if a name was not
translated? The usual MediaWiki fallback rules can be reused, but
there's a twist, because in Wikidata the usual fallback language may
be unavailable. So in this case it will probably be:
my language -> my fallback language -> English -> the language in
which it is written
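
To make the idea concrete, here is a minimal sketch of such a fallback
chain in Python. The data layout and the function are made up for
illustration; they are not the actual Wikidata data model or API.

```python
# Hypothetical label store for the item about Haifa's previous mayor:
# language code -> label. Made up for illustration only.
labels = {
    "he": "עמרם מצנע",
    "en": "Amram Mitzna",
    "ru": "Амрам Мицна",
    "hr": "Amram Micna",
}

def resolve_label(labels, user_lang, fallback_lang, source_lang):
    """Try my language, then my fallback language, then English,
    then the language in which the value was originally written."""
    for lang in (user_lang, fallback_lang, "en", source_lang):
        if lang in labels:
            return labels[lang]
    return None  # no label available at all

# A Croatian reader gets the Croatian label:
print(resolve_label(labels, "hr", "sr", "he"))  # Amram Micna
# A reader whose languages have no label falls back to English:
print(resolve_label(labels, "fi", "sv", "he"))  # Amram Mitzna
```

If only a Hebrew label exists, as in the scenario above where the new
mayor was entered only as "יונה יהב", every reader ends up at the last
step and sees the Hebrew name.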
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
Hallo,
It's my first email on this list, so in case you don't know me: I am
Amir, I'm from Israel, I've been a Wikipedian since 2004, I write mostly in
Hebrew and English, I care strongly about language issues in software
in general and about right-to-left support in particular, and I work
in the WMF's localization team.
Now, about the subject: you probably know that "i18n" is
"internationalization" and "l10n" is "localization". "m17n" is a less
common term, which means "multilingualization" - making software able
to work in many languages at once. This email is about one of the
easiest and most important ways to make Wikidata support many
languages on one page everywhere.
I've been testing the Wikidata demo for a few days now, with the aim
of getting it deployed in the Hebrew Wikipedia very soon. The first
thing that I noticed is that even though everybody understands that
Wikidata is supposed to be massively multilingual, little or no use is
made of the lang and dir attributes in the HTML that Wikidata
generates. The most immediate example is
http://wikidata-test-repo.wikimedia.de/wiki/Data:Q2?uselang=en
It basically lists the word "Helium" in many languages, but as far as
the browser is concerned, almost all of it is written in English,
because the root <html> element says lang="en". The only exceptions
are the interlanguage links in the sidebar, where the lang attributes
are used properly, but that's a regular MediaWiki feature.
The lang attribute, and also the dir attribute (direction: "ltr" or
"rtl"), very much needs to be specified explicitly on every element
whose content language is known to be different from the content
language of the enclosing element. Many developers may think
that this attribute doesn't do anything, but actually it does a lot:
* correct text-to-speech and speech-to-text handling
* correct font rendering (relevant for Serbian [1], for some languages
of India etc.)
* selecting the correct spell checking dictionary
* selecting the right language for machine translation
* adjusting the line-height
* selecting the web font (in MediaWiki's WebFonts extension)
* etc.
So please, use it whenever you can.
Always use the dir attribute in these circumstances, too. It must be
specified explicitly even though "ltr" is the default, because if the
user interface is right-to-left, it will propagate to elements in
other languages, too, so you would get right-to-left English. (I consider
this a bug in the HTML standard... but it's a topic for a different
email).
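
Just as an illustration (this is not the actual Wikibase code; in
MediaWiki itself the Language class would supply the language code and
direction), marking up each label explicitly could look roughly like
this in Python-generated HTML:

```python
import html

# Hypothetical per-language labels with their directions, for illustration only.
# language code -> (label, direction)
labels = {
    "en": ("Helium", "ltr"),
    "he": ("הליום", "rtl"),
    "ru": ("Гелий", "ltr"),
}

def label_span(lang, text, direction):
    """Wrap a label in a span that tells the browser its language and direction."""
    return '<span lang="{}" dir="{}">{}</span>'.format(
        lang, direction, html.escape(text))

for lang, (text, direction) in labels.items():
    print(label_span(lang, text, direction))
# e.g. <span lang="he" dir="rtl">הליום</span>
```

With markup like this the browser can pick the right font, spell-checking
dictionary and text direction for each label, instead of treating
everything as English.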
In the case of the page that I mentioned above, it should be quite
trivial to fix, because MediaWiki's Language class provides very easy
functions for this. I also opened bug 39257 [2] about it. I am
repeating it here on the mailing list just to ask the developers
to do it everywhere. If you are a developer and you run into any
problems with using these attributes, please contact me in any way
that is convenient for you.
Thank you!
[1] See https://sr.wikipedia.org/wiki/User:Amire80
[2] https://bugzilla.wikimedia.org/show_bug.cgi?id=39257
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
Hi,
Some inline comments.
----- Original Message -----
From: Amir E. Aharoni
Sent: 08/12/12 11:56 AM
To: wikidata-l
Subject: [Wikidata-l] thoughts about a plan for enabling interlanguage linking
> Hi, I'm not sure that it was discussed already. If it was discussed, please point me there. I read the Technical proposal and the translatable pages on Meta and couldn't find it there. Is there any concrete plan to start the migration of the current interlanguage links to Wikidata storage? For example:
> * The interwiki bots will definitely have to be modified for the Wikidata age. Did anybody start a conversation with the operators of these bots?
MerlIwBot is the only bot that is compatible with Wikidata. Lydia has started a conversation with the operators of the bots who have volunteered to transition interwiki links to Wikidata.
> * How will interwiki conflicts be handled? Judging by earlier discussions, I suppose that Wikidata will just let pages with conflicts work as they always did, but this is an important issue that should be answered in the FAQ.[1]
Correct. This issue was discussed at:
http://meta.wikimedia.org/wiki/Talk:Wikidata/Development/Storyboard_for_lin…
> * How will bots identify pages that use Wikidata?
Is this http://meta.wikimedia.org/wiki/User:MerlIwBot/WikiData what you are looking for? It might be related to what you are talking about. I am going to leave this question mostly unanswered for others.
> Maybe the answer to all of the above questions is: "The Wikidata developers intentionally want to leave these things to the editors community". If it is, then this is fine. It is even hinted in the Technical proposal. But it should be written more explicitly, so that the communities will actually start to work on that. Again, if somebody already started it, please point me there. Thank you!
> [1] https://meta.wikimedia.org/wiki/Wikidata/FAQ
Hi,
I'm not sure that it was discussed already. If it was discussed,
please point me there. I read the Technical proposal and the
translatable pages on Meta and couldn't find it there.
Is there any concrete plan to start the migration of the current
interlanguage links to Wikidata storage?
For example:
* The interwiki bots will definitely have to be modified for the
Wikidata age. Did anybody start a conversation with the operators of
these bots?
* How will interwiki conflicts be handled? Judging by earlier
discussions, I suppose that Wikidata will just let pages with
conflicts work as they always did, but this is an important issue that
should be answered in the FAQ.[1]
* How will bots identify pages that use Wikidata?
Maybe the answer to all of the above questions is: "The Wikidata
developers intentionally want to leave these things to the editors
community". If it is, then this is fine. It is even hinted in the
Technical proposal. But it should be written more explicitly, so that
the communities will actually start to work on that. Again, if
somebody already started it, please point me there.
Thank you!
[1] https://meta.wikimedia.org/wiki/Wikidata/FAQ
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
Heya folks :)
We updated the demo system again. It'd be great if you could give it
some testing. In particular, the client and the repository working
together nicely is something that needs testing now that we're
getting closer to deployment.
http://wikidata-test.wikimedia.de
We've also gone through all our bugs today and marked more tasks that
volunteers can help with. If you're working on one of these please
leave a comment in the bug report so everyone knows what's going on.
https://bugzilla.wikimedia.org/buglist.cgi?keywords=need-volunteer&keywords…
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata
Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as a non-profit
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Heya folks
This hasn't made it to this list yet, sorry. The Hungarian Wikipedia
community has stepped up to be the first one to use Wikidata in
production once we're ready to deploy the code. They rock :D
We're keeping track of the deployment plans here:
http://meta.wikimedia.org/wiki/Wikidata/Notes/Deployment
Cheers
Lydia
---------- Forwarded message ----------
From: Bináris <wikiposta(a)gmail.com>
Date: Tue, Jul 24, 2012 at 1:52 PM
Subject: Re: [Wikitech-l] Deployment of Wikidata
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
2012/7/14 Lydia Pintscher <lydia.pintscher(a)wikimedia.de>
>
> Just to keep everyone updated: We have discussed this here at
> Wikimania together with a few of the admins of the Hungarian
> Wikipedia. Things are looking good and the next step is to take this
> to the Hungarian Wikipedia to figure out if the community is ok with
> this and then go for it if approved.
>
> The poll is here:
http://hu.wikipedia.org/wiki/Wikip%C3%A9dia:Kocsmafal_%28m%C5%B1szaki%29#Te…
Result: after some explanations we have 26 supporters in addition to the 3
initiators, and nobody opposing (22 of them came in the first 24 hours).
As I wrote earlier, this is an enthusiastic community. So huwiki is looking
forward to testing Wikidata. :-) Let's do it!
--
Bináris
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata
Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as a non-profit
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Hello all,
we have finally published our first draft of how Wikidata phase II
data will be published in RDF. Comments are very welcome, either on
this mailing list or on the respective talk page.
<http://meta.wikimedia.org/wiki/Wikidata/Development/RDF>
The document is, due to the subject matter, quite technical. If you
are looking for a less technical introduction, feel free to read
<http://meta.wikimedia.org/wiki/Wikidata/Notes/Data_model_primer>
instead.
Cheers,
Denny
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as a non-profit
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.