Hello,
Could someone with CVS access please put the LanguageEs.php file that is
on meta onto the stable branch of CVS? And after that, could someone
with server access actually install it?
I asked Brion a couple of weeks ago, but either because he is, as
always, terribly busy, or because it was not in French or Esperanto, he
hasn't answered. (I could have tried Spanish...)
AstroNomer/AstroNomo
Brion wrote:
>Okay, I moved the bits around so it's happy again.
It may just be a temporary glitch, but the foundation's webpages are timing
out for me right now (12:50 AM Pacific time). Wikipedia is up and so is
meta, so it's not that...
-- mav
Brion wrote:
>I hacked up a session data handler to optionally keep
>login session data in memcached rather than the local
>filesystem. Combined with a change to the cookie
>settings, the English Wikipedia login sessions can
>now be shared between en.wikipedia.org on larousse and
>en2.wikipedia.org on pliny, without having to separately log
>in on each server.
Sweet! I especially like the part about load balancing between en2 and en.
That may also explain these error messages on the foundation's webpages:
Warning: session_start(): Cannot send session cookie - headers already sent by (output started at /usr/local/apache/htdocs/foundation/extract.php:13) in /usr/local/apache/common/php/Setup.php on line 53

Warning: session_start(): Cannot send session cache limiter - headers already sent (output started at /usr/local/apache/htdocs/foundation/extract.php:13) in /usr/local/apache/common/php/Setup.php on line 53
http://wikimediafoundation.org/fundraising
-- Daniel Mayer (aka mav)
I hacked up a session data handler to optionally keep login session
data in memcached rather than the local filesystem. Combined with a
change to the cookie settings, the English Wikipedia login sessions can
now be shared between en.wikipedia.org on larousse and
en2.wikipedia.org on pliny, without having to separately log in on each
server.
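For the curious, the shape of such a handler is roughly this (a minimal
sketch, not the actual MediaWiki code; $memc is assumed to be a
connected memcached client object exposing get(), set() and delete()):

----- SKETCH (PHP) -----
<?php
// Minimal sketch of a memcached-backed PHP session handler.
// $memc: assumed connected memcached client with get/set/delete.

function memsess_open( $save_path, $session_name ) { return true; }
function memsess_close() { return true; }

function memsess_read( $id ) {
    global $memc;
    $data = $memc->get( "session:$id" );
    // A failsafe would detect a dead server here and fall back to
    // local files or the database (see the problem list below).
    return ( $data === false ) ? '' : $data;
}

function memsess_write( $id, $data ) {
    global $memc;
    // Expire after an hour of non-usage, as described above.
    return $memc->set( "session:$id", $data, 3600 );
}

function memsess_destroy( $id ) {
    global $memc;
    $memc->delete( "session:$id" );
    return true;
}

function memsess_gc( $maxlifetime ) {
    return true; // nothing to do: memcached expires entries itself
}

session_set_save_handler( 'memsess_open', 'memsess_close', 'memsess_read',
    'memsess_write', 'memsess_destroy', 'memsess_gc' );
?>
----- END SKETCH -----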
This makes it not too painful to use a rewrite rule for simple load
balancing, bouncing some portion of the page space to en and some
portion to en2. Yay!
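As a rough illustration of the idea (a hypothetical rule, not
necessarily the one deployed), a split on the first letter of the title
could look like:

----- SKETCH (Apache config) -----
# Hypothetical mod_rewrite split: bounce titles starting N-Z (or n-z)
# over to en2 and keep the rest on en.
RewriteEngine On
RewriteRule ^/wiki/([N-Zn-z].*)$ http://en2.wikipedia.org/wiki/$1 [R,L]
----- END SKETCH -----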
There may still be some issues dealing with uploaded files, but
disabling uploads on en2, ensuring that Special:Upload is redirected to
en, and that all /upload/* files get loaded off en, should keep things
pretty consistent in the meantime.
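On en2, that could be a pair of redirect rules along these lines (again
a sketch; the paths are illustrative):

----- SKETCH (Apache config) -----
# Sketch of the en2 rules: uploads happen on, and are served from, en.
RewriteEngine On
RewriteRule ^/wiki/Special:Upload$ http://en.wikipedia.org/wiki/Special:Upload [R,L]
RewriteRule ^/upload/(.*)$ http://en.wikipedia.org/upload/$1 [R,L]
----- END SKETCH -----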
Potential problems:
* Memcached is designed to be 'lossy'; if it fills up it throws away
the least recently used data to make room for new data. This shouldn't
be a problem for session handling, hopefully; sessions are set to expire
after an hour of non-usage anyway.
* Expiration times haven't been thoroughly tested yet.
* There's currently no failsafe; if memcached dies, the session data
falls into a black hole and no one can log in. (Except perhaps if they
click 'remember my password'.) It should be possible to detect that the
server couldn't be contacted and fall back to local storage, or the
database, or something; a rough sketch of that fallback follows this list.
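For the read path, such a fallback might look like this (illustrative
only; the helper name is hypothetical, and the assumed client returns
false both on a cache miss and on a dead server):

----- SKETCH (PHP) -----
<?php
// Illustrative fallback: if memcached returns nothing, try the
// ordinary file-based session store before giving up.
function memsess_read_with_fallback( $id ) {
    global $memc;
    $data = $memc->get( "session:$id" );
    if ( $data !== false ) {
        return $data;
    }
    $file = session_save_path() . "/sess_$id";
    if ( file_exists( $file ) ) {
        return file_get_contents( $file );
    }
    return '';
}
?>
----- END SKETCH -----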
-- brion vibber (brion @ pobox.com)
I have installed mediawiki-20030829 at
http://libertarianwiki.org/wiki/wiki.phtml
When I edit a page I get the following messages:
______________________
Warning: open_basedir restriction in effect. File is in wrong directory in /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/GlobalFunctions.php on line 172

Warning: open_basedir restriction in effect. File is in wrong directory in /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/GlobalFunctions.php on line 172

Warning: Cannot add header information - headers already sent by (output started at /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/GlobalFunctions.php:172) in /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/OutputPage.php on line 318

Warning: Cannot add header information - headers already sent by (output started at /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/GlobalFunctions.php:172) in /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/OutputPage.php on line 319

Warning: Cannot add header information - headers already sent by (output started at /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/GlobalFunctions.php:172) in /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/OutputPage.php on line 320

Warning: Cannot add header information - headers already sent by (output started at /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/GlobalFunctions.php:172) in /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/OutputPage.php on line 322

Warning: Cannot add header information - headers already sent by (output started at /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/GlobalFunctions.php:172) in /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/OutputPage.php on line 338

Warning: Cannot add header information - headers already sent by (output started at /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/GlobalFunctions.php:172) in /home/.paccowasher/pstudier/wiki.paulstudier.com/wiki/OutputPage.php on line 339
_________________________
If I ignore these, I can still do some editing. Any ideas?
Paul Studier <Studier(a)PaulStudier.com>
When you work, you create. When you win, you just take from the loser.
For an explanation, see http://paulstudier.com/win
Hello,
I am known as Hashar on IRC and fr.wikipedia.
I am currently developing a PHP script that aims to help me update
interwiki links (also known as interlanguage links). I am posting here
so the community knows what I am doing, and mainly to prevent any crash
of the server that might be caused by my script.
This is what I am doing:
The script runs on my local workstation using PHP and my cable
connection in France. It operates as follows:
1/ ask user for an article to check
2/ retrieve article and parse for interwiki links
3/ retrieve one of the interwiki link and parse it for interwiki links
4/ repeat 3 until all interwiki links from step 2 have been parsed.
5/ display, for each Wikipedia, the number of interwiki links and a list of them.
It doesn't handle redirects yet.
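The core of steps 2-4 is just fetching a page's wikitext and collecting
its interwiki links, roughly like this (a simplified sketch of the
approach, not the script itself; the action=raw URL form and the regular
expression are assumptions):

----- SKETCH (PHP) -----
<?php
// Simplified sketch: fetch an article's raw wikitext and pull out
// its interwiki links. URL form and regexp are assumptions.
function get_interwiki_links( $lang, $title ) {
    $url = "http://$lang.wikipedia.org/w/wiki.phtml?title="
        . urlencode( $title ) . "&action=raw";
    $lines = file( $url );
    if ( $lines === false ) {
        return array(); // fetch failed
    }
    $text = implode( '', $lines );

    // Match links such as [[en:Spain]] or [[eo:Hispanio]].
    preg_match_all( '/\[\[([a-z-]{2,10}):([^\]]+)\]\]/', $text, $m );

    $links = array();
    for ( $i = 0; $i < count( $m[1] ); $i++ ) {
        $links[ $m[1][$i] ] = $m[2][$i];
    }
    return $links; // e.g. array( 'en' => 'Spain', 'de' => 'Spanien' )
}
?>
----- END SKETCH -----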
This is the output of a test I ran this morning (while Americans are
sleeping and Europeans are not yet awake, so as to minimize risks). I
was browsing the site while it was running and didn't notice any slowdown.
----- SCRIPT OUTPUT -----
Enter the name of an article on the fr wikipédia. Wikilinks will be checked on
(ar|ms|bs|cs|cy|da|de|el|en|es|eo|fr|fy|hi|hr|he|ko|hu|ml|nah|nl|ja|pl|ro|
ru|sq|sk|sl|sr|sv|tr|zh) wikipedias.
> Espagne [OK]
Current wikilink(s) for this article:
fr (11 interwiki):
[[da:Spanien]] [[de:Spanien]] [[en:Spain]] [[es:España]] [[eo:Hispanio]]
[[nl:Spanje]] [[ja:スペイン]] [[pl:Hiszpania]]
[[ro:Spania]] [[sv:Spanien]] [[zh:西班牙]]
Number and wikilinks on the linked wikis:
----------------------------------------------------------------------
da (11 interwiki):
[[en:Spain]] [[de:Spanien]] [[eo:Hispanio]] [[es:España]] [[fr:Espagne]]
[[ja:スペイン]] [[nl:Spanje]] [[pl:Hiszpania]]
[[ro:Spania]] [[sv:Spanien]] [[zh:西班牙]]
de (11 interwiki):
[[da:Spanien]] [[en:Spain]] [[eo:Hispanio]] [[es:España]] [[fr:Espagne]]
[[ja:スペイン]] [[nl:Spanje]] [[pl:Hiszpania]]
[[ro:Spania]] [[sv:Spanien]] [[zh:%E8%A5%BF%E7%8F%AD%E7%89%99]]
es (10 interwiki):
[[da:Spanien]] [[de:Spanien]] [[en:Spain]] [[eo:Hispanio]] [[fr:Espagne]]
[[ja:スペイン]] [[nl:Spanje]] [[pl:Hiszpania]]
[[sv:Spanien]] [[zh:西班牙]]
eo (8 interwiki):
[[de:Spanien]] [[en:Spain]] [[fr:Espagne]] [[es:España]]
[[ja:スペイン]] [[nl:Spanje]] [[pl:Hiszpania]]
[[ro:Spania]]
ja (0 interwiki):
doesn't have any interwiki links.
----- END SCRIPT OUTPUT -----
NB: I removed a lot of entries to make things clearer.
From this script output we can see that there are at most 11 interwiki
links. The eo wiki is missing 3 of them, the es wiki is missing one, and
ja doesn't seem to be linked (must be a script bug :p).
Before I work more on this idea, what's your reaction? Do you find it useful?
Should it be distributed to server operators around the Wikipedias?
Follow up either here or on my talk page.
:0°
--
Ashar Voultoiz
[[Hashar]] @ fr.wikipedia.org
Brion raised the question the other day about the possibility of
storing the images directly in the database. Part or all of the
motivation was to make it easy for multiple webservers to serve
consistent images on the site in our new configuration.
I've been thinking of other ways to do this.
1. NFS -- all the webservers could have read/write access to an NFS
partition, probably on the new machine. This is easy to set up, but
there are questions about the security and stability of NFS, and at
least a few years ago, it was considered by some to be bad mojo to try
to serve web content off of NFS-mounted partitions -- performance is
bad.
2. AFS - Andrew File System, or DRBD - Distributed Replicated Block
Device --- These sound to me like things that hold forth great
promise, but I also regard them as fairly esoteric technologies.
We're probably better off doing something more boring.
Am I wrong? Too conservative? Not up to date?
3. Apache reverse proxying -- this is a boring and good solution that
I'm confident would work well, but it does have some drawbacks.
Essentially, the way it works is this: for image uploads and
downloads, we use mod_rewrite to transparently reverse proxy the
requests to apache running on the backend (database) machine.
One possible drawback is the overhead of apache running on the
backend, and the fact that it might become a bottleneck. However,
since almost all the backend machine would be doing is serving static
requests, and since those could be shunted to something faster than
apache, it really wouldn't be all that hard.
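In concrete terms, the frontend rule could look something like this (a
sketch; the backend hostname is a placeholder, and mod_proxy is required
for the [P] flag):

----- SKETCH (Apache config) -----
# Sketch: transparently proxy image traffic to the backend machine.
RewriteEngine On
RewriteRule ^/upload/(.*)$ http://backend.internal/upload/$1 [P,L]
ProxyPassReverse /upload/ http://backend.internal/upload/
----- END SKETCH -----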
--------
Hybrid approaches are possible -- the webservers could NFS mount the
/images/ directory from the db machine but only use the NFS
mountpoints for writing -- for reads, we'd go through the reverse
proxying mechanism.
--Jimbo
>-----Original Message-----
>From: Jimmy Wales [mailto:jwales@bomis.com]
>Sent: Monday, 20 October 2003 23:31
>To: wikitech-l(a)wikipedia.org
>Subject: [Wikitech-l] Time estimate for new server
>
>
>Penguincomputing just told me that the production lead time on the new
>server will be 7-10 days from now. So I would anticipate that the new
>server will be shipped, oh, I guess next Wednesday or so. Possibly
>Jason will install it next Friday.
>
Hello Jimmy,
Have you got any news about the new db server? When we get it, we'll be able to do the press release :-)
But will it be a 350,000-article press release or something else?
Bye
Constans, Camille wrote:
>Have you got any news about the new db server?
>When we get it, we'll be able to do the press release :-)
Last I heard it was going to hopefully be installed sometime late this week.
>But will it be a 350,000-article press release or something else?
If things get really fast with the new server then why not wait for half a
million articles? By the time we reach that (early next year) we should be
starting to tax our new server setup and at least need some minor upgrades
all around (lots and lots of RAM) and an additional $2,000-$3,000 webserver
(or a server to perform full text searches on - whatever is needed most).
We've waited this long and the major point of the press release in the first
place was to get enough money through donations to get a big bad database
server. Thanks to that $5,000 donation we achieved that goal well before we
were ready to distribute the press release.
-- Daniel Mayer (aka mav)
If we go this way, why not wait until we have 500,000 articles, so we
have a really great announcement? ;-)
Traroth
-----Original Message-----
From: Constans, Camille (C.C.) [mailto:cconsta4@ford.com]
Sent: Tuesday, 4 November 2003 08:25
To: 'Wikimedia developers'
Subject: RE: [Wikitech-l] Time estimate for new server and 350 000 press release
Hello Jimmy,
Have you got any news about the new db server? When we get it, we'll be
able to do the press release :-)
But will it be a 350,000-article press release or something else?
Bye