[Wikizh-l] Fwd: [Wikitech-l] Interwiki link tools

shi zhao shizhao at gmail.com
Fri May 18 07:06:12 UTC 2007


---------- Forwarded message ----------
From: Stian Haklev <shaklev at gmail.com>
Date: 2007-5-18 1:23 PM
Subject: [Wikitech-l] Interwiki link tools
To: wikitech-l at lists.wikimedia.org

Hi everyone,

as a multilingual user of Wikipedia, I am always looking for new ways of
using and accessing the content from different language wikis. I am
Norwegian, but I read Chinese and I have several friends who often edit
there. However, if I want to look up an article on the Chinese Wikipedia,
it's a hassle since I don't necessarily know how they write "Oslo" or
"Plato" in Chinese... so I made this little tool:

http://70.47.70.10/redir/en/zh/Oslo

it will automatically get you the page on the Chinese Wikipedia that is
interlinked from the article about Oslo on the English Wikipedia. (This
works for all language combinations - given that there are interwiki
links - and all articles. Use _ instead of spaces.)

http://70.47.70.10/redir/no/ar/Demokrati

it only downloads the first article, does a quick regexp, and forwards
you to the second article. I have posted about this on the Norwegian and
Indonesian community pages. I'm not sure where the right place is to list
this tool on the English wiki / MediaWiki?
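
The core of it is roughly this - a simplified sketch rather than the
exact script (error handling is left out, and the regexp naively grabs
the first link to the target-language Wikipedia from the rendered HTML):

#!/usr/bin/env ruby
# Rough sketch of the /redir logic: fetch the source article, find an
# interwiki link to the target language, and redirect there.
require 'cgi'
require 'net/http'
require 'uri'

cgi = CGI.new
# PATH_INFO looks like /en/zh/Oslo -> from="en", to="zh", title="Oslo"
from, to, title = cgi.path_info.split('/').reject { |s| s.empty? }

html = Net::HTTP.get(URI("http://#{from}.wikipedia.org/wiki/#{title}"))

if html =~ %r{href="(http://#{to}\.wikipedia\.org/wiki/[^"]+)"}
  print "Status: 302 Found\r\n", "Location: #{$1}\r\n\r\n"
else
  print "Content-Type: text/plain\r\n\r\n",
        "No #{to} interwiki link found for #{from}:#{title}\n"
end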

--

Another thing that has struck me is that, being able to read several
languages, I often don't mind much which wiki a certain article comes
from - I would just like to read the longest or most complete article on
a given topic. In practice, however, I often find myself "constrained" to
the English Wikipedia because it is generally the best, or to the
Indonesian one if I am reading about Indonesian topics, etc.

so this URL

http://70.47.70.10/bigger/en/no,sv,id,ms/Poland

will automatically redirect you to whichever of the articles in no, sv,
id or ms - among those linked from the English Poland page - is _longest_
(pure byte size, not very scientific, but still).

This is insanely useful (it's mainly meant to be a Firefox keyword
bookmark - plug in the languages you are comfortable reading, call the
keyword "big", and whenever you want an article, type "big Television"
and you will get the longest article in one of the languages you know).
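
For the curious, the logic is more or less this - again a simplified
sketch, not the exact script (it assumes the interwiki targets can be
scraped from the rendered English page with a regexp):

#!/usr/bin/env ruby
# Rough sketch of the /bigger logic: find the interwiki targets for the
# requested languages, measure each one's size, redirect to the biggest.
require 'cgi'
require 'net/http'
require 'uri'

cgi = CGI.new
# PATH_INFO looks like /en/no,sv,id,ms/Poland
src, langs, title = cgi.path_info.split('/').reject { |s| s.empty? }

src_url  = "http://#{src}.wikipedia.org/wiki/#{title}"
src_html = Net::HTTP.get(URI(src_url))

# For each requested language, pull the interwiki URL out of the source
# page and download the target just to measure it (the wasteful part).
candidates = langs.split(',').map do |lang|
  next unless src_html =~ %r{href="(http://#{lang}\.wikipedia\.org/wiki/[^"]+)"}
  url = $1
  [url, Net::HTTP.get(URI(url)).bytesize]
end.compact

target = candidates.max_by { |_, size| size }
print "Status: 302 Found\r\n",
      "Location: #{target ? target.first : src_url}\r\n\r\n"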

However - right now it works by downloading a full page from each of the
Wikipedias to find the size. So in the above example it would download
five full pages before redirecting the user to one of them (causing that
one to be downloaded again)... I'm obviously not happy about wasting
Wikipedia bandwidth, so I wonder if there is a better way of doing this -
getting just the size of a given page... I will not post this anywhere
else until I have some guidance from this list.
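
One idea I have not tried yet - so take it as a sketch that assumes
api.php exposes the page length through prop=info - would be to ask the
API for the size instead of fetching the whole article:

# Sketch: ask api.php for a page's length instead of downloading it.
# (Assumes prop=info returns a length="..." attribute; untested.)
require 'cgi'
require 'net/http'
require 'uri'

def page_length(lang, title)
  url = URI("http://#{lang}.wikipedia.org/w/api.php?action=query&prop=info" \
            "&format=xml&titles=#{CGI.escape(title)}")
  xml = Net::HTTP.get(url)
  xml =~ /length="(\d+)"/ ? $1.to_i : 0
end

# The /bigger tool could then compare lengths without fetching any
# article, using the interwiki titles it already extracted from the
# source page.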

Both are written in 20 and 43 lines of Ruby CGI code respectively, btw. I
can post the full code if anyone wants it.

Stian
_______________________________________________
Wikitech-l mailing list
Wikitech-l at lists.wikimedia.org
http://lists.wikimedia.org/mailman/listinfo/wikitech-l


-- 
Chinese wikipedia: http://zh.wikipedia.org/
My blog: http://talk.blogbus.com
CNBlog: http://blog.cnblog.org/weblog.html
Social Brain: http://www.socialbrain.org/default.asp
cnbloggercon: http://www.cnbloggercon.org/

[[zh:User:Shizhao]]