I run a wiki an order of magnitude larger than yours on a 2GB Linode. You
should have no issue on a $20/mo 512MB Linode provided you're running a
modern PHP stack.
My recommendation is Nginx and PHP-FPM with APC, plus the built-in MediaWiki
file cache. If you're not getting the performance you want, you could also
run Varnish or set up a separate Linode for memcached. You could also place
the wiki behind Cloudflare if you're serving a lot of media files per page;
otherwise I don't think it would be beneficial.
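For reference, a minimal sketch of what that looks like in LocalSettings.php. This assumes a recent MediaWiki with the APC PHP extension installed; the cache directory path is just an example, so adjust it for your install:

```php
<?php
// LocalSettings.php excerpt (sketch) -- APC object cache + file cache.

// Use the PHP opcode/data cache (APC) as the main object cache.
$wgMainCacheType = CACHE_ACCEL;

// Serve pre-rendered pages to anonymous visitors from the file cache.
$wgUseFileCache = true;
$wgFileCacheDirectory = "$IP/cache"; // must be writable by the web server
$wgUseGzip = true; // store gzipped copies for clients that accept them
```

With that in place, most anonymous page views never touch the parser or the database, which is usually the biggest win on a small Linode.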
Best,
Chris
On Tue, Nov 27, 2012 at 9:40 PM, Dan Fisher <danfisher261(a)gmail.com> wrote:
I was running MediaWiki on a shared host and traffic was around 10K views a
day (a small-to-moderate wiki). I was forced to leave that setup because of
high CPU usage; I was not able to install Squid there or do anything else to
speed things up. I had talked about that before on this list and I'm
thankful for the recommendations.
Now I'm on a VPS where Squid is running and currently I don't have CPU
issues except when there's a traffic spike. So I've decided to look for a
dedicated server. I've seen on web hosting forums that (low-end?) dedicated
servers are available for pretty cheap ($100). Currently I'm paying $70 for
the VPS.
My key issue is that the webhost has to be willing to let me remain
anonymous, and because of this my options are limited. For example, they
have to accept PayPal. I haven't looked around yet at what options are
available, but I will look into that next, after this discussion.
To be prepared for the future, I want the server to be able to support 30K
views a day (three times the current traffic) and display pages with no
noticeable delays. I hope a $100 server with Squid can do this for me.
Are there particular server specs I should look for? The first would be
RAM: what's the minimum I should have? Any other desirable specs?
My second issue is the hit ratio for Squid. According to Squid's cache
manager, the request hit ratio is about 40% and the byte hit ratio is 20%.
The average time taken to serve a missed request is 0.7 seconds, while for a
hit it's only 0.02 seconds (35 times faster), so a higher hit ratio would be
really nice.
Looking at Squid's access logs, I also noticed that requests to load.php
are always misses. Can anything be done to fix that?
What can be done to optimize Squid for MediaWiki and increase the hit
ratio? I have 1.3GB of RAM available; I told Squid it can use 130MB
(cache_mem), although it goes over that, and total RAM usage usually stays
around 40%. I know 1.3GB may be small, and I've heard you need to leave
some RAM free to ensure system stability. I may have more RAM in the
dedicated server when I get it.
If anyone has a high hit ratio, I would be really thankful if you could
email me your squid.conf (with any sensitive information removed) so I can
compare it with my setup. Or you could tell me which settings I should
change or add.
thanks!
Dan
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l