I was running MediaWiki on a shared host with traffic around 10K views a
day (a small-to-moderate wiki). I was forced to leave that setup because
of high CPU usage; I wasn't able to install Squid there or do anything
else to speed things up. I talked about that on this list before and I'm
thankful for the recommendations.
Now I'm on a VPS where Squid is running, and currently I don't have CPU
issues except when there's a traffic spike. So I've decided to look for a
dedicated server. I've seen on web hosting forums that (low-end?) dedicated
servers are available pretty cheaply ($100). Currently I'm paying $70 for
the VPS.
My key issue is that the webhost has to be willing to let me remain
anonymous, and because of this my options are limited. For example, they
have to accept PayPal. I haven't looked around yet at what options are
available, but I will look into that next, after this discussion.
To be prepared for the future, I want the server to be able to support 30K
views a day (three times the current traffic) and serve pages with no
noticeable delays. I hope a $100 server with Squid can do this for me.
Are there any server specs that I should look for? The first one would be
RAM. What's the minimum RAM I should have? Other desirable specs?
My second issue is the hit ratio for Squid: according to Squid's cache
manager, the request hit ratio is about 40% and the byte hit ratio is 20%.
The average time taken to serve a missed request is 0.7 seconds, while for
a hit it's only 0.02 seconds (35 times faster). So a higher hit ratio would
be really nice.
Looking at Squid's access logs, I also noticed that calls to load.php are
always misses. Can anything be done to fix that?
What can be done to optimize Squid for MediaWiki and increase the hit
ratio? I have 1.3 GB of RAM available; I told Squid it could use 130 MB,
though it goes over that, and total RAM usage usually stays around 40%.
I know 1.3 GB may be small, and I've heard some RAM should be left free
for system stability. I may have more RAM in the dedicated server when I
get it.
If anyone has a high hit ratio, I would be really thankful if you could
email me your squid.conf (with any sensitive information removed) so I can
compare it with my setup. Or you could tell me which settings I should
change or add.
thanks!
Dan
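For anyone comparing notes, a minimal squid.conf fragment along these lines is a common starting point; the sizes below are placeholders for a machine with roughly 1.3 GB of RAM, not tuned recommendations, and the commented LocalSettings.php lines are the usual companion settings so MediaWiki sends purges to the cache. (load.php is a separate matter: it emits its own Cache-Control headers and its responses can vary on cookies, so it's worth inspecting those response headers rather than squid.conf alone.)

```
# squid.conf (illustrative values only, not a recommendation)
cache_mem 128 MB                              # in-memory hot-object cache
maximum_object_size_in_memory 512 KB          # keep only small objects in RAM
cache_dir ufs /var/spool/squid 2000 16 256    # ~2 GB on-disk cache

# The usual companion settings on the MediaWiki side, in LocalSettings.php:
#   $wgUseSquid = true;
#   $wgSquidServers = array( '127.0.0.1' );
```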
Just a guess, but this switch statement is in a template that is two levels deep. In other words, a page calls template 1, which calls template 2. Would this cause a problem? I've tried numerous examples, and it works when it's the first template called, but not the second.
Al
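For reference, one common cause of this symptom is a parameter that isn't forwarded at each level; a minimal two-level sketch (all template and parameter names hypothetical) looks like this:

```
On the page:       {{Outer|case=foo}}

Template:Outer     {{Inner|case={{{case|}}}}}

Template:Inner     {{#switch: {{{case|}}}
                    | foo = Foo branch
                    | bar = Bar branch
                    | #default = no match
                   }}
```

If Template:Outer does not pass `case` through explicitly, the inner #switch only ever sees its default branch, even though the same #switch works fine in a template called directly from the page.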
Hi,
I am trying to get TeX working in MediaWiki 1.18.2. I have successfully
installed TeX Live on my Fedora server, have texvc compiled and working,
and have made the following additions to LocalSettings.php :-
$wgUseTex = true;
$wgTexvc = "/usr/bin/texvc";
$wgUploadPath = "$wgScriptPath/images";
$wgUploadDirectory = "$IP/images";
$wgMathPath = "$wgUploadPath/math";
$wgTmpPath = "$wgUploadPath/tmp";
$wgMathDirectory = "$wgUploadDirectory/math";
$wgTmpDirectory = "$wgUploadDirectory/tmp";
$wgUploadBaseUrl = false;
And run :-
php maintenance/update.php
But am getting the following error message :-
Failed to parse (PNG conversion failed; check for correct installation
of latex and dvipng (or dvips + gs + convert)): a=b+c-d
I had this working on my previous installation, MediaWiki 1.11.0, using
LaTeX.
Hope you can help and it's something obvious.
Many thanks in advance,
Aaron
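Two hedged guesses rather than a diagnosis: PHP variable names are case-sensitive, and the core math setting is spelled $wgUseTeX with a capital X, so a lowercase $wgUseTex is silently ignored; and since the error names the PNG step, it's worth confirming that latex and dvipng (or dvips, gs, and convert) are on the web server's PATH and that the math/tmp directories are writable by the web server user.

```php
// LocalSettings.php - note the capitalization; $wgUseTex (lowercase x)
// is a different, unused variable as far as MediaWiki is concerned.
$wgUseTeX = true;
$wgTexvc  = "/usr/bin/texvc";
```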
On Mon, Dec 24, 2012 at 5:00 AM, <mediawiki-l-request(a)lists.wikimedia.org> wrote:
> On Sun, Dec 23, 2012 at 11:19 PM, Benny Lichtner <bennlich(a)gmail.com>
> wrote:
> > Sticking a category link in a wiki page automatically adds that wiki page
> > to the corresponding category page. For example, inserting
> > [[Category:Puppies]] adds the page to the Puppies category.
> >
> > Is it possible to achieve the same result with something like
> > [[Category:{{#puppiesfunc}}]], where {{#puppiesfunc}} is replaced by
> > "Puppies"? I guess I'm wondering how the automatic population of category
> > pages works under the hood.
>
> Can you state the actual problem you're trying to solve instead of the
> way you want to solve it?
>
> Good idea.
> Anyway, maybe you're looking for addTrackingCategory()
>
> https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob;f=include…
>
I'm creating pages that represent stores, and the format of every store
page is identical, so each store page transcludes the same store template
(which is used approximately like this: {{ store_template | storeID }}).
But stores belong to different categories depending on the kinds of goods
they sell. I have a parser function that, given the storeID, can fetch the
right good type from a database, so I want to use that function in the
store_template to categorize every store page dynamically.
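For what it's worth, category membership falls out of ordinary parsing: the parser expands templates and parser functions first, then records every resulting [[Category:...]] link in the categorylinks table when the page is saved. So a construction like the following sketch (template and function names hypothetical) generally works, as long as the parser function returns plain text at parse time:

```
Template:Store_template contains:
  [[Category:{{#goodtype: {{{1|}}} }}]]

A store page contains:
  {{store_template|store-42}}
```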
Hello, me again /^)
I'm using DISPLAYTITLE to change the titles of my pages, but I would like the links to those pages that appear on the "What links here" page to change as well, so they match, i.e., display the same link text as the DISPLAYTITLE value set on the target page. Is that possible?
Thanks!
Al
I have added the following to my LocalSettings.php file :-
$wgDebugLogFile = "/var/www/log/WikiLog.txt";
and ran :-
php maintenance/update.php
Then tried adding :-
wfDebugLog("MathRenderer::__construct()");
to the MathRenderer constructor in Math.body.php.
But I am not getting any output in the log file from this.
Can anyone shed any light on what I am doing wrong?
Many thanks in advance,
Aaron
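One hedged guess about the empty log: wfDebugLog() takes a log *group* name as its first argument and the message as the second, so a one-argument call never writes the intended message (and group logs only reach $wgDebugLogFile indirectly). For a plain trace line, wfDebug() is the simpler tool. A sketch:

```php
// Writes into $wgDebugLogFile (note the trailing newline):
wfDebug( "MathRenderer::__construct()\n" );

// Or, with a named group as the first argument:
wfDebugLog( 'math', 'MathRenderer::__construct()' );
```

It's also worth checking that the web server user can write to /var/www/log/.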
Hello, I need something like ImageGallery, but when a user clicks on an
image, they should be redirected to the article that contains the image,
not to the image's page.
I believe I can use ImageGallery for this, via its add() method, whose
last parameter, "link", replaces the default link of the image:
http://svn.wikimedia.org/doc/classImageGallery.html#ae590cf859de45ef2884181…
But how do I randomly select article titles, and their image titles as
well? I did not find any info in the documentation, nor an appropriate
extension.
Thanks !
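One possible approach, sketched against the 1.x database layer (variable names hypothetical): pick a page the way Special:Random does, using the page_random column, then look up its images in the imagelinks table and hand each one to ImageGallery::add() with the article as the link target.

```php
$dbr = wfGetDB( DB_SLAVE );
// Special:Random's trick: page_random holds a uniform value in [0,1),
// so seek to the first article past a random point.
$row = $dbr->selectRow(
    'page',
    array( 'page_namespace', 'page_title' ),
    array(
        'page_namespace' => NS_MAIN,
        'page_is_redirect' => 0,
        'page_random >= ' . wfRandom(),
    ),
    __METHOD__,
    array( 'ORDER BY' => 'page_random' )
);
$title = Title::makeTitle( $row->page_namespace, $row->page_title );
// The imagelinks table lists the files used on that page; each file
// can then be added to the gallery with $title as the 'link' parameter.
```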
Kovács Zoltán sent me a note a while back asking about reducing MediaWiki's
memory footprint, and I would like your thoughts. (Note that there's an
update below the original email.)
-Sumana
-------- Original Message --------
Subject: MediaWiki memory footprint
Date: Mon, 19 Nov 2012 12:52:00 +0100
From: Kovács Zoltán <kovzol(a)matek.hu>
Dear Sumana,
I hope you are doing well. It was a pleasure to meet you at the GSoC
Mentor Summit. We talked about MediaWiki performance tuning and you
suggested I contact you if I needed further help.
We run MediaWiki 1.18.2 at wiki.geogebra.org. I am not sure, but it seems
MediaWiki has quite a big memory footprint (approx. 85 MB for each
visitor). We don't use any caching yet, but we plan to use Memcached or
Varnish. Do you have any guidelines on when to switch caching on (depending
on the number of visitors), and on which software does the best job for
MediaWiki? We are also interested in commercial support if the problem
persists.
Thank you for your kind help in advance,
best regards, Zoltan
--
Zoltán Kovács
Research Assistant at the Department of Mathematics Education
Johannes Kepler University Linz, Austria, http://www.jku.at/idm
Update from late December: "finally we managed to solve this problem by
removing the unneeded modules. (Now we don't use more than 60 MB of memory
for each Apache thread.) We also started to use memcached, but it did not
seem to help much.
"On the other hand, we are still interested in how other people deal with
such problems. So it would definitely be great if you could send my
email to the public mailing list."
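For the record, the memcached setup mentioned in the update is normally just a LocalSettings.php fragment along these lines (the server address is an assumption):

```php
$wgMainCacheType    = CACHE_MEMCACHED;
$wgParserCacheType  = CACHE_MEMCACHED;
$wgMemCachedServers = array( '127.0.0.1:11211' );
```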
Hi,
Is there a way, such as a tag, to specify a block of text to be excluded from UTF normalization? I tried <nowiki> tags, but that didn't work.
Thanks,
Al
Hello,
I need to make sure a backend Java process does the same UTF normalization that is applied to edit text. Grepping for 'normaliz' brings up a lot, and I'm not a PHP dev. Can someone point me to a key PHP module and/or function?
Thank you,
Al
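Not a direct pointer to the PHP code, but for comparison on the Java side: MediaWiki normalizes edit text to Unicode NFC (UtfNormal::cleanUp(), under includes/normal/, which additionally replaces ill-formed UTF-8 sequences). The composition step can be reproduced with java.text.Normalizer; the class below is a made-up harness, not MediaWiki code.

```java
import java.text.Normalizer;

public class NfcCheck {
    // Apply the same NFC (canonical composition) transform that
    // MediaWiki applies to edit text via UtfNormal::cleanUp().
    static String cleanUp(String s) {
        return Normalizer.normalize(s, Normalizer.Form.NFC);
    }

    public static void main(String[] args) {
        String decomposed = "e\u0301"; // 'e' + U+0301 combining acute accent
        String composed = "\u00e9";    // precomposed 'é'
        System.out.println(cleanUp(decomposed).equals(composed)); // prints "true"
    }
}
```

Note that this covers only the normalization step; MediaWiki's cleanUp() also repairs invalid byte sequences, which Java strings, being UTF-16, handle differently.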