Old Church Slavonic (diff; hist) . . El C (Talk) (he:סלאבונית כנסייתית עתיקה
I can't reproduce it from Gmail, but note that the ")" sign appears after "he:",
not after the right-to-left Hebrew text.
After a few days of experimenting with configuration, and about a day
running one server as a test, we've now got 13 web servers online and
serving pages out of the batch delivered last week.
These are spiffy Opteron boxes, nice and fast, and should help take the
load off the other apache boxen quite nicely.
For now the Opteron apaches are running with APC as the opcode cache
rather than the Turck MMCache we're using on the Pentiums. We used APC
a couple of years ago and it was a bit flakier then, but the current
version seems to be much more responsive, with performance in line with
the competition,* and it is being actively maintained to work with
current versions of PHP.
* http://meta.wikimedia.org/wiki/Turck_vs_APC_benchmark
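For reference, a minimal sketch of the ini settings involved in enabling APC (directive names are from APC's documented configuration; the sizes and values here are illustrative assumptions, not our production settings):

```ini
; Load and enable the APC opcode cache.
extension=apc.so
apc.enabled=1
; Shared memory segment for cached opcodes, in MB (tune to taste).
apc.shm_size=64
; Seconds a cache entry may sit unused before it becomes evictable.
apc.ttl=3600
```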
-- brion vibber (brion @ pobox.com)
Dear All,
I would like to know whether it's possible to change a page's background
color, and how.
For example, an article page is white, while the discussion page of that
article is blue on the English Wikipedia and yellow on the French
Wikipedia.
Is it also possible to use a different color for administrative pages?
Many thanks in advance,
PRoth
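A sketch of one way this is commonly done, assuming a skin whose body tag carries a per-namespace class (as MonoBook-style skins do, e.g. ns-1 for talk pages); the selectors and colors below are illustrative, not an official recipe:

```css
/* Tint talk (discussion) pages; ns-1 is the talk namespace's
   body class in MonoBook-style skins. */
body.ns-1 #content { background-color: #eaf3ff; }

/* Project/administrative pages (ns-4) can get their own color too. */
body.ns-4 #content { background-color: #fff8e1; }
```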
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
MediaWiki 1.4.10 and 1.3.16 are security maintenance releases. A bug in
edit submission handling could cause corruption of the previous revision
in the database if an abnormal URL was used, such as those used by some
spambots.
Affected releases:
* 1.4.x <= 1.4.9; fixed in 1.4.10
* 1.3.x <= 1.3.15; fixed in 1.3.16
1.5 release candidates are not affected by this problem.
We strongly recommend that all publicly editable wikis upgrade
immediately. 1.4 releases can be manually patched by changing this bit
in EditPage.php:
  function importFormData( &$request ) {
      if( $request->wasPosted() ) {

to:

  function importFormData( &$request ) {
      if( $request->getVal( 'action' ) == 'submit' &&
          $request->wasPosted() ) {
1.3 releases can be manually patched by changing this bit in EditPage.php:
  if( $this->tokenOk( $request ) ) {
      $this->save = $request->wasPosted() && !$this->preview;
  } else {

to:

  if( $this->tokenOk( $request ) ) {
      $this->save = $request->getVal( 'action' ) == 'submit' &&
          $request->wasPosted() && !$this->preview;
  } else {
Release notes:
http://sourceforge.net/project/shownotes.php?release_id=358163
http://sourceforge.net/project/shownotes.php?release_id=358162
Download:
http://prdownloads.sourceforge.net/wikipedia/mediawiki-1.4.10.tar.gz?downlo…
http://prdownloads.sourceforge.net/wikipedia/mediawiki-1.3.16.tar.gz?downlo…
MD5 checksum:
mediawiki-1.4.10.tar.gz 2376f043109066d19830d05b6682c64b
mediawiki-1.3.16.tar.gz 7dae5d937c6803d970e803ddece750dc
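As a quick sketch of checking these sums before installing (shown with a stand-in file so the commands are self-contained; for the real release, substitute the tarball name and checksum above):

```shell
# Generate a checksum file for a stand-in tarball, then verify it.
# For a real download you would instead save the published line, e.g.:
#   echo "2376f043109066d19830d05b6682c64b  mediawiki-1.4.10.tar.gz" > md5sums.txt
printf 'stand-in contents' > mediawiki-demo.tar.gz
md5sum mediawiki-demo.tar.gz > md5sums.txt
md5sum -c md5sums.txt   # prints "mediawiki-demo.tar.gz: OK"
```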
Before asking for help, try the FAQ:
http://meta.wikimedia.org/wiki/MediaWiki_FAQ
Low-traffic release announcements mailing list:
http://mail.wikipedia.org/mailman/listinfo/mediawiki-announce
Wiki admin help mailing list:
http://mail.wikipedia.org/mailman/listinfo/mediawiki-l
Bug report system:
http://bugzilla.wikimedia.org/
Play "stump the developers" live on IRC:
#mediawiki on irc.freenode.net
- -- brion vibber (brion @ pobox.com / brion @ wikimedia.org)
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.2.4 (Darwin)
Comment: Using GnuPG with Thunderbird - http://enigmail.mozdev.org
iD8DBQFDMbnuwRnhpk1wk44RAhV5AJ4/1UljYlTQ6paaSkdX/Bkz8Kw6hACfVDuq
Imq2VMNjyi2TRyziRRa3O0Q=
=0YtO
-----END PGP SIGNATURE-----
> Debian Woody is my mortal enemy. I thought it was dead finally, but no...
Ah, you see, to a distro-laggard like our company, Woody is still a
very viable option; in fact it's considered quite safe and stable,
whereas Sarge is considered a little too new and risqué to be
trusted yet with mission-critical stuff. That mentality will probably
only change when the updates for Woody stop (which I believe is
scheduled to happen on 1/05/2006, unless Etch is released before that
date, which honestly seems rather unlikely). So only after the 1st of
May 2006 will Woody finally, really be dead :-)
On the data import front, today I've tried installing MediaWiki
1.5RC4, and importing via importDump, but I didn't have much luck:
======================================================
# wget http://download.wikimedia.org/wikipedia/en/20050909_pages_public.xml.gz
# gzip --test ~nickj/wikipedia/20050909_pages_public.xml.gz
// no output or error, so presumably the gzip file is not corrupt
# md5sum ~nickj/wikipedia/20050909_pages_public.xml.gz
1de5093f1dd6c5afd4ed080474456d54
/home/nickj/wikipedia/20050909_pages_public.xml.gz
// this matches the sum in
http://download.wikimedia.org/wikipedia/en/20050909-md5sums
# gzip -dc ~nickj/wikipedia/20050909_pages_public.xml.gz | php
maintenance/importDump.php
100 (58.59332294078 pages/sec 58.59332294078 revs/sec)
200 (54.030806663729 pages/sec 54.030806663729 revs/sec)
300 (51.490838593282 pages/sec 51.490838593282 revs/sec)
400 (50.320459245887 pages/sec 50.320459245887 revs/sec)
500 (49.26519486778 pages/sec 49.26519486778 revs/sec)
600 (48.360482472114 pages/sec 48.360482472114 revs/sec)
700 (48.871787388943 pages/sec 48.871787388943 revs/sec)
800 (49.154366513935 pages/sec 49.154366513935 revs/sec)
900 (49.266177573961 pages/sec 49.266177573961 revs/sec)
1000 (49.018343826441 pages/sec 49.018343826441 revs/sec)
1100 (49.167214599006 pages/sec 49.167214599006 revs/sec)
1200 (49.605583964957 pages/sec 49.605583964957 revs/sec)
1300 (49.425412530694 pages/sec 49.425412530694 revs/sec)
1400 (49.357795012659 pages/sec 49.357795012659 revs/sec)
1500 (49.401458453695 pages/sec 49.401458453695 revs/sec)
1600 (49.248578795592 pages/sec 49.248578795592 revs/sec)
1700 (49.205397806241 pages/sec 49.205397806241 revs/sec)
1800 (49.139689484041 pages/sec 49.139689484041 revs/sec)
1900 (49.369342847918 pages/sec 49.369342847918 revs/sec)
2000 (49.706945229133 pages/sec 49.706945229133 revs/sec)
2100 (49.860871622316 pages/sec 49.860871622316 revs/sec)
2200 (49.935237390351 pages/sec 49.935237390351 revs/sec)
2300 (49.976472942288 pages/sec 49.976472942288 revs/sec)
2400 (49.965834881883 pages/sec 49.965834881883 revs/sec)
2500 (50.095592279086 pages/sec 50.095592279086 revs/sec)
2600 (49.913163596511 pages/sec 49.913163596511 revs/sec)
2700 (50.346513647263 pages/sec 50.346513647263 revs/sec)
2800 (50.554639314109 pages/sec 50.554639314109 revs/sec)
2900 (50.30025952798 pages/sec 50.30025952798 revs/sec)
3000 (50.235683978552 pages/sec 50.235683978552 revs/sec)
3100 (49.935743124336 pages/sec 49.935743124336 revs/sec)
3200 (49.898859597319 pages/sec 49.898859597319 revs/sec)
Content-type: text/html
#
// (i.e. spontaneously aborts after ~3200 pages and ~60 seconds).
======================================================
I'm beginning to suspect that some kind of higher being is determined
that under no circumstances will I be able to load this data into a
database ;-)
All the best,
Nick.
We've had our main fileserver (zwinger) accidentally rebooted a couple
of times during configuration of new servers today, resulting in downtime.
Everybody, PLEEEEEEEASE be careful when rebooting that you're typing in
the window you think you are.
And when the site *does* blow up in some new and inventive way, don't
forget to log it on the admin log at http://wp.wikidev.net/Server_admin_log
That is all.
-- brion vibber (brion @ pobox.com)
Hi,
I was referred to this list by April King regarding contract work for
setting up and configuring our internal corporate wiki. If anyone is
interested, would you please contact me at:
David Anderson
Toll Free1-800-859-7249 or 623-445-9500 (AZ Residents)
davidanderson(a)uticorp.com
Thank you,
David
Yesterday I upgraded my (tiny) 1.4.9 installation to 1.5rc4, which was
a relatively trouble-free affair (I used the web install with the same
values as on the original install, instead of running "php
update.php").
I spent quite a while trying to determine how to set up the navigation
toolbar; the only reference to it is in RELEASE-NOTES, which erroneously
mentions MediaWiki:navbar. Fortunately, after a (long) while I
discovered the MediaWiki_FAQ on the web, which nicely explains the
whole issue. Not a big deal.
More puzzling was the fact that I was getting errors about unknown
variables while loading an extension. From LocalSettings.php I load
extensions/MyExtensions.php, which starts with the following code:
  function __autoload( $class_name ) {
      require_once( $class_name . '.php' );
  }

  require_once( 'MyFunctions.php' );
  SpecialPage::addPage( new UnlistedSpecialPage( 'MyExtension' ) );
For some reason, that used to work OK on 1.4.9, but on 1.5rc4 I was
forced to add
require_once('SpecialPage.php');
before the call to addPage(). Is that expected?
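For reference, the usual 1.5-era pattern is to defer the registration via $wgExtensionFunctions so that it runs after MediaWiki's core classes (including SpecialPage) have been loaded; this is a sketch, and the setup function name here is made up:

```php
<?php
// In extensions/MyExtensions.php: register a setup callback instead
// of calling SpecialPage::addPage() at file-load time.
$wgExtensionFunctions[] = 'wfSetupMyExtension';

function wfSetupMyExtension() {
    // By the time this runs, the core has loaded SpecialPage,
    // so no explicit require_once( 'SpecialPage.php' ) is needed.
    SpecialPage::addPage( new UnlistedSpecialPage( 'MyExtension' ) );
}
```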
Now, the only thing I miss is a way to customize $wgBookstoreListEn
which doesn't involve hacking languages/language.php (because I must
remember to update it every time I upgrade, which is a PITA). Setting
it in LocalSettings.php does not work. Any ideas?
--
/L/e/k/t/u
Another note: we really need to be able to survive a Zwinger downtime
much better than we currently do. Sometimes it's an avoidable accident,
but it might be a hardware failure... and we'd like to get Zwinger
upgraded some day so we don't have to special-case installations on it
for Red Hat 9. Surviving that upgrade would be nice. ;)
Most of the files the web servers need to run (eg, the PHP scripts
themselves) are stored on each machine's local disk, and we push out
updates. That's good!
Uploaded files are on another server; downtime there can also be
bad, but at least a Zwinger outage shouldn't take those down as well.
However, we are reading a few bits off of Zwinger's NFS (some block lists
etc., some lock files) and sometimes writing (logs). Insofar as those are
currently used, they should either be migrated to a more survivable
situation or be able to fail gracefully. If it isn't already, NFS should
be set up in a way that fails cleanly after a short timeout.
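For the timeout part, a sketch of what that could look like in /etc/fstab (the mount point, export path, and numbers are assumptions chosen to illustrate the soft-mount options, not our actual configuration):

```
# Soft-mount: NFS operations give up with an I/O error after roughly
# (timeo/10) * retrans seconds instead of blocking forever.
zwinger:/export/home  /home  nfs  soft,intr,timeo=50,retrans=3  0  0
```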
Some other configuration files, such as php.ini, and various programs
and utilities (one of the perlbals?) are also currently pulled off of
NFS. These generally need to be fixed up so that things can continue
running while home dirs are down; pushing the files out on update, as
we do with the PHP scripts, is probably in order.
-- brion vibber (brion @ pobox.com)