Hi,
At Turkish Wikipedia (tr) we are discussing how we should use
redirects.
For historical reasons (I won't go into the gory details of
character-set standardization history), the Turkish character set
is covered by Latin-5. Even though all popular operating systems
support this character set, the user has to configure it.
Regular (US) QWERTY keyboards don't have these specialized Turkish
characters.
For these reasons, many Turkish speakers living abroad tend to
substitute similar-looking characters for the proper Turkish ones
(omitting umlauts, for example).
What we want to do is pre-process wiki URLs so that we can redirect
a request even if it was not typed using proper Turkish characters.
I don't know if we can add a module like that, one which would
process only requests coming to tr.wikipedia.org; a rough sketch of
the character mapping it would need follows.
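A minimal sketch, assuming a PHP pre-processing step on the requested
title (the function name foldTurkishTitle is invented here, not
existing MediaWiki code):

<?php
// Fold Turkish-specific letters down to their ASCII lookalikes, so a
// title typed on a US keyboard can be matched against a folded index
// of existing tr.wikipedia.org titles and redirected to the real page.
function foldTurkishTitle( $title ) {
    $map = array(
        'ç' => 'c', 'Ç' => 'C',
        'ğ' => 'g', 'Ğ' => 'G',
        'ı' => 'i', 'İ' => 'I',
        'ö' => 'o', 'Ö' => 'O',
        'ş' => 's', 'Ş' => 'S',
        'ü' => 'u', 'Ü' => 'U',
    );
    return strtr( $title, $map );
}
?>

Both the incoming title and each stored title would be folded; when
the folded forms match and the literal title does not exist, the
module issues a redirect.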
Thanks
-volkan
Now that we have an object cache, I think we can move toward getting rid
of Language??.php entirely: the $wgAllMessages arrays at first, and the
rest of what's in there later.
Here's a rough implementation strategy:
- Multilang support is maintained on Meta, but instead of Language??.php
files, people edit MediaWiki: pages
- The user can set in the preferences which language they want to use (as
per Nikola's patch). Bonus points: Detect language prefs from browser and
set default UI language accordingly.
- For the different languages we append a suffix to the title, e.g.
MediaWiki:Gnunote.en. The suffix is added automatically when no suffix is
provided, so MediaWiki:Gnunote would go to Gnunote.en if you have that set
in your prefs (see the sketch after this list)
- We get rid of the message arrays entirely. Instead, the installation
package contains a dump of the MediaWiki namespace from Meta. This is
loaded into the DB on install. $wgUseDatabaseMessages is required, not
optional.
Note that the language suffixes would only work for the MediaWiki:
namespace, not for the Template: namespace -- mixing languages here would
be a bad idea.
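A minimal sketch of that suffix rule (the function name
resolveMessageTitle is invented here, and the suffix check is
deliberately naive):

<?php
// Append the user's UI language to a MediaWiki-namespace title when
// the caller didn't give one, so that MediaWiki:Gnunote resolves to
// MediaWiki:Gnunote.en for a user whose preference is 'en'.
function resolveMessageTitle( $title, $userLang ) {
    // Keep an explicit suffix like ".de" if one is already there.
    if ( preg_match( '/\.[a-z]{2,3}$/', $title ) ) {
        return $title;
    }
    return $title . '.' . $userLang;
}
?>

resolveMessageTitle( 'Gnunote', 'en' ) gives 'Gnunote.en', while
resolveMessageTitle( 'Gnunote.de', 'en' ) keeps 'Gnunote.de'.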
How does that sound? Have I forgotten something important?
Having done this, we could then think about throwing all languages into
one DB (with different tables like CUR.de, RECENTCHANGES.en etc.), as a
first step toward true multilang integration.
Regards,
Erik
>-----Original Message-----
>From: wikitech-l-bounces(a)Wikipedia.org
>[mailto:wikitech-l-bounces@Wikipedia.org] On behalf of Gabriel Wicke
>Sent: Monday, 10 May 2004 00:45
>To: wikitech-l(a)Wikipedia.org
>Subject: [Wikitech-l] old revisions on test.wikipedia.org
>
>
>There's a weird phenomenon on test.wikipedia.org: 1427 old revisions
>hold only the text 'abc'. If it was just a flawed manual MySQL query
>done around May 4, then we can be sure it's not a bug; if not, we'd
>have to do a fair bit of investigation.
>
>This problem didn't appear on any other wiki running the CVS code,
>but knowing is better than guessing in this case. It would be a
>serious bug/security problem otherwise.
>
It's not a bug; I told Tim. I broke the test history while testing a PHP script to uncompress the old tables (needed if we want de, nl and the other wikis to move to UTF-8).
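The script is along these lines, as a rough sketch only (it assumes
the old table stores gzdeflate()d text flagged with 'gzip' in
old_flags, which is how the CVS code of the time compresses
revisions):

<?php
// Assumes an already-open MySQL connection to the wiki database.
$res = mysql_query( "SELECT old_id, old_text, old_flags FROM old" );
while ( $row = mysql_fetch_assoc( $res ) ) {
    if ( strpos( $row['old_flags'], 'gzip' ) === false ) {
        continue; // this revision is already uncompressed
    }
    $text = gzinflate( $row['old_text'] );
    mysql_query( sprintf(
        "UPDATE old SET old_text = '%s', old_flags = '' WHERE old_id = %d",
        mysql_real_escape_string( $text ), $row['old_id']
    ) );
}
?>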
Shiahulud
What I propose is that someone among you who has bots to command should
consider implementing the following:
a) Look up what every language name is in every language
b) Look up the ===Translations=== heading (or a similar expression) so
that the bot can spot the translations area in each language
c) Map the language names to language codes
d) Parametrise a bot to crawl the wiktionaries, linking the Translations
of [[:en:Word]] to *Finnish: [[:fi:sana|sana]], *Swedish:
[[:sv:ord|ord]] and so forth, and vice versa from other languages to
English
So, by locating "*Language" or *[[Language]] in a section titled
Translations, the bot would know to look for the word "sana" on
http://fi.wiktionary.org and, upon success (the target exists), link to
it and hide the namespace prefix.
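As a rough sketch of the rewriting step (the function name and the
shape of the $languageCodes map are invented here; a real bot would
also need to fetch pages and check that the target exists):

<?php
// Turn "*Finnish: sana" or "*[[Finnish]]: sana" lines inside a
// ===Translations=== section into "*Finnish: [[:fi:sana|sana]]".
function linkTranslations( $section, $languageCodes ) {
    $lines = explode( "\n", $section );
    foreach ( $lines as $i => $line ) {
        // Skip lines that already carry an interwiki link.
        if ( preg_match( '/^\*\s*\[?\[?([A-Za-z]+)\]?\]?\s*:\s*(.+)$/',
                $line, $m )
            && isset( $languageCodes[$m[1]] )
            && strpos( $m[2], '[[:' ) === false ) {
            $code = $languageCodes[$m[1]]; // e.g. 'Finnish' => 'fi'
            $word = trim( $m[2] );
            // A real bot would first confirm that the page "$word"
            // exists on $code.wiktionary.org before rewriting.
            $lines[$i] = "*{$m[1]}: [[:$code:$word|$word]]";
        }
    }
    return implode( "\n", $lines );
}
?>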
Could someone with knowledge of wikibots assess the feasibility of such
a scheme? Once a full round has been done on all wiktionaries, the bot
should focus on new entries, linking them as they appear.
I see no harm in this kind of automation; it would actually help weed
out stupid jokes and give more precision on nuances, as the thesauri
could be surfed across languages.
-juxho
Hi,
I mentioned this here 2½ days ago, but nobody has done anything about
it. Tokipona inter-wiki links have stopped working. Please could someone
fix this?
Thanks,
Timwi
> >>(b) there should be "rename file" function akin to "move page".
>
> >Now that is a very good idea.
>
> I fail to see how they would really differ.
>
> Ec
They wouldn't, but "move page" doesn't work for files. If you want to
rename an image, you currently have to save it to your computer,
re-upload it under a new name, change any articles that point to the old
name, and put the old image up on images for deletion. Not very
efficient. :)
Fabi.
> For those who don't already know what's so bad about redirects: (1) Even
> just one redirect doubles(!) the server's response time.
If redirects are that costly, would it be useful to run a bot, say once a
month, to replace links to redirects with their final destination?
The en: Wikipedia contains over 5 million internal links. Over 500,000,
or 10%, are links to redirects. (The stats show 155K redirect links, but
that is the number of redirect pages.)
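Finding those links is a single query; a rough sketch (the schema
details -- links.l_to holding the target's cur_id and cur_is_redirect
flagging redirects -- are assumptions about the current tables, so
check them against the live schema first):

<?php
// Assumes an already-open MySQL connection to the en: database.
$res = mysql_query(
    "SELECT COUNT(*) AS n FROM links, cur
     WHERE links.l_to = cur.cur_id AND cur.cur_is_redirect = 1" );
$row = mysql_fetch_assoc( $res );
echo "Internal links pointing at redirects: {$row['n']}\n";
?>

A bot would then take the distinct referring pages for those rows and
edit each one, replacing the link with the redirect's target.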
Erik Zachte
The new frameset un-trapper in wikibits.js prevents linking to Wiki in a
framed setting. My site www.wordwebonline.com provides an interface to
several references via a small top frame, and has just been spoilt by the
new frame killer. Would it be possible to allow frames, or perhaps to
have the frame killer only on the main page? (I'm indexing the wiki, so I
only link directly to wiki pages when they exist.) As it stands, I will
probably have to stop linking to the wiki, which would be a shame.
If this isn't the right place to ask, please let me know.
thanks.
Antony
Attached is a simple one-liner fix for is_a(). The one-liner comes
from http://us2.php.net/is_a.
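For reference, its shape is roughly the compatibility fallback carried
on that manual page (a guess at the attachment, not a copy of it):

<?php
// Defines is_a() on PHP versions older than 4.2.0, where the built-in
// does not exist yet. PHP 4's get_class() returns the class name
// lowercased, hence the strtolower() comparison.
if ( !function_exists( 'is_a' ) ) {
    function is_a( $object, $class ) {
        return get_class( $object ) == strtolower( $class )
            || is_subclass_of( $object, $class );
    }
}
?>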
Thanks,
Asheesh Laroia.
--
Only kings, presidents, editors, and people with tapeworms have the right
to use the editorial "we".
-- Mark Twain