Hello, I have a couple of MediaWiki installations on two different slices at Slicehost. Both slices run other websites with no speed problems, but the MediaWiki installs themselves run like dogs! http://wiki.medicalstudentblog.co.uk/ Any ideas what to look for, or ways to optimise them? I still can't get over that they need a 100 MB ini_set in the settings just to load, due to the messages or something.
Thank you, Dawson
On Tue, Jan 27, 2009 at 5:31 AM, Dawson costelloe@gmail.com wrote:
The mediawiki themselves run like dogs! Any ideas what to look for or ways to optimise them? [...]
If you haven't already, you should set up an opcode cache like APC or XCache, and a variable cache like APC or XCache (if using one application server) or memcached (if using multiple application servers). Those are essential for decent performance. If you want really snappy views, at least for logged-out users, you should use Squid too, although that's probably overkill for a small site. It also might be useful to install wikidiff2 and use that for diffs.
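For example, a minimal LocalSettings.php sketch of those options (the memcached address is an assumption, and the wikidiff2 line only works once that PHP extension is installed):

# Single application server, with APC or XCache installed:
$wgMainCacheType = CACHE_ACCEL;

# Multiple application servers: point at memcached instead:
# $wgMainCacheType = CACHE_MEMCACHED;
# $wgMemCachedServers = array( '127.0.0.1:11211' );

# Use the wikidiff2 extension for diffs:
$wgExternalDiffEngine = 'wikidiff2';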
Of course, none of this works if you don't have root access. (Well, maybe you could get memcached working with only shell . . .) In that case, I'm not sure what advice to give.
MediaWiki is a big, slow package, though. For large sites, it has scalability features that are almost certainly unparalleled in any other wiki software, but it's probably not optimized as much for quick loading on small-scale, cheap hardware. It's mainly meant for Wikipedia. If you want to try digging into what's taking so long, you can try enabling profiling:
http://www.mediawiki.org/wiki/Profiling#Profiling
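For example (a sketch only: the profiler class names vary between MediaWiki versions, so check the page above rather than taking this as gospel), in a 1.14-era tree you would edit StartProfiler.php in the wiki root:

# StartProfiler.php -- enable the simple text profiler instead of the stub:
require_once( dirname( __FILE__ ) . '/includes/ProfilerSimpleText.php' );
$wgProfiler = new ProfilerSimpleText;
# Output is dumped into the generated page; view the HTML source to read it.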
If you find something that helps a lot, it would be helpful to mention it.
Modified config file as follows:
$wgUseDatabaseMessages = false;
$wgUseFileCache = true;
$wgMainCacheType = "CACHE_ACCEL";
I also installed xcache and eaccelerator. The improvement in speed is huge.
2009/1/27 Aryeh Gregor <Simetrical+wikilist@gmail.com>:
If you haven't already, you should set up an opcode cache like APC or XCache, and a variable cache [...] It also might be useful to install wikidiff2 and use that for diffs. [...]
Thank you Platonides,
Seems now I get the error: "xcache.var_size is either 0 or too small to enable var data caching in /var/www/includes/BagOStuff.php on line 643"
Googling hasn't provided much info on how to fix this, anyone know?
2009/1/28 Platonides <Platonides@gmail.com>:
Dawson wrote:
Modified config file as follows:
$wgUseDatabaseMessages = false;
$wgUseFileCache = true;
$wgMainCacheType = "CACHE_ACCEL";
This should be $wgMainCacheType = CACHE_ACCEL; (the constant), not $wgMainCacheType = "CACHE_ACCEL"; (a string).
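The difference matters because CACHE_ACCEL is a PHP constant that MediaWiki defines in Defines.php, not a string. Roughly (constant values as of this era):

# MediaWiki's Defines.php includes, among others:
# define( 'CACHE_MEMCACHED', 2 );
# define( 'CACHE_ACCEL', 3 );

$wgMainCacheType = CACHE_ACCEL;   # the constant: evaluates to the integer 3
$wgMainCacheType = "CACHE_ACCEL"; # just a string, not a valid cache type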
On Wed, Jan 28, 2009 at 5:33 AM, Dawson costelloe@gmail.com wrote:
Seems now I get the error: "xcache.var_size is either 0 or too small to enable var data caching in /var/www/includes/BagOStuff.php on line 643"
Googling hasn't provided much info on how to fix this, anyone know?
Add this to php.ini:
xcache.var_size = 32M
Or pick whatever size you like, depending on how much RAM you have available. You can check the amount of RAM used (and other things) using the xcache-admin pages that should have been provided when you installed XCache. You might want to tweak the other XCache options too.
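For example, a php.ini sketch (the sizes here are assumptions; tune them to the RAM you have):

; XCache opcode cache size
xcache.size     = 64M
; XCache variable/data cache -- leaving this at 0 disables it and
; triggers the BagOStuff error above
xcache.var_size = 32M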
To use filecache, you need to set $wgShowIPinHeader = false;
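Putting the file-cache settings together in LocalSettings.php (the cache directory here is an assumption; any directory the web server can write to works):

$wgUseFileCache       = true;
$wgFileCacheDirectory = "$IP/cache"; # must be writable by the web server
$wgShowIPinHeader     = false;       # cached pages can't vary the IP shown per visitor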
Also, see http://www.mediawiki.org/wiki/User:Aaron_Schulz/How_to_make_MediaWiki_fast
-Aaron
On Tue, Jan 27, 2009 at 6:56 PM, Jason Schulz wrote:
Also, see http://www.mediawiki.org/wiki/User:Aaron_Schulz/How_to_make_MediaWiki_fast
The shell script you mention in step 2 has some stuff in it that makes it unusable outside Wikimedia:
- lots of hard-coded paths
- what is "/usr/local/bin/run-jobs"?
I'd put "0 0 * * * /usr/bin/php /var/www/wiki/maintenance/runJobs.php > /var/log/runJobs.log 2>&1" as a crontab entry in your guide, as it's a bit more compatible with non-wikimedia environments ;)
Marco
http://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/jobs-loop/run-jobs.c?r...
As mentioned, it is just a sample script. For sites with just one master/slave cluster, any simple script that keeps looping to run maintenance/runJobs.php will do.
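For example, a minimal looping runner of that sort (a sketch; the wiki path and batch size here are assumptions, adjust to your install):

#!/bin/sh
# Process queued jobs in batches, forever, pausing briefly between runs.
while true; do
    php /var/www/wiki/maintenance/runJobs.php --maxjobs 100
    sleep 10
done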
-Aaron
I find that generating the serialized message files helps a bit.
http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/serialized/README?vie...
And $wgCheckSerialized = false;