I know https://www.mediawiki.org/wiki/Manual:Installation_requirements...
Let's say I have 20 medium-sized wiki pages, where "medium-sized" means a normal Wikipedia-style article, nothing too long.
I expect 10 to 100 viewers and at most 5 editors at the same time.
How much RAM am I going to need?
At the moment I have a VPS running Debian and Apache with 256 MB of RAM. I am still testing. When I open three different wiki pages in three different browser tabs, serving them takes ~128 MB of RAM, the server's memory fills up, and no further pages can be served.
So how much RAM will I use per wiki page served? It doesn't have to be exact or proven; just tell me your experience: how many users you have at a time while the server still works. If you have some numbers, I can upgrade my server plan.
Sorry, no numbers in my answer, but I can describe how I would approach it. Try JMeter or a web service like loadimpact.com to model the load on your website. At the same time, use some tools on the server (anything from Unix top to Zabbix, or whatever is included in your control panel) to measure the processor time and memory needed.
Then apply some tricks like PHP opcode caching and page caching, and run the tests again.
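As a rough sketch of the measuring side, you could sample how much resident memory the web server is using while the load test runs. This assumes Debian's Apache process name `apache2` and a Linux `ps` (procps); adjust the name for your setup:

```shell
# Sample the total resident memory (RSS) of Apache workers once per second
# while the load test is hitting the server. "apache2" is the Debian
# process name; substitute yours if different.
for i in 1 2 3; do
    ps -C apache2 -o pid=,rss= | awk '{sum+=$2} END {print "apache RSS:", sum/1024, "MB"}'
    sleep 1
done
```

Watching this alongside the load generator tells you roughly how much memory each concurrent request costs you.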
----- Yury Katkov
On Fri, Sep 7, 2012 at 1:36 AM, anotst01@fastmail.fm wrote:
[snip: original question]
MediaWiki-l mailing list MediaWiki-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
On Thu, Sep 6, 2012 at 2:36 PM, anotst01@fastmail.fm wrote:
[snip: original question]
Given how cheap RAM is now ($5/GB for typical server RAM, $10/GB for high quality ECC, $15/GB for 16 GB DIMMs... crucial.com current retail prices on all) ... why on earth stick with 256 MB for the server?
Last time I provisioned an all-new MediaWiki server for internal use somewhere, we used a 4 GB machine because we weren't buying anything smaller. Last time I bought a bunch of new servers, I provisioned them at 48 GB because RAM was the limit on performance until I got above 64 GB, and there was a huge DIMM price break at 48 GB (at the time, 16 GB DIMMs were 4x as expensive as the 8 GB units).
Seriously, max out your memory. It's the cheapest performance win you can find.
On Thu, Sep 6, 2012 at 5:07 PM, George Herbert george.herbert@gmail.com wrote:
On Thu, Sep 6, 2012 at 2:36 PM, anotst01@fastmail.fm wrote:
[snip: original question]
Given how cheap RAM is now ($5/GB for typical server RAM, $10/GB for high quality ECC, $15/GB for 16 GB DIMMs... crucial.com current retail prices on all) ... why on earth stick with 256 MB for the server?
It's a VPS.
It is very difficult, or impossible, to give hard numbers for this type of question. I've always found that more RAM is better, but it all depends on your expected load and how quickly you'd like your pages to load. The best way is to set up your server and wiki with some test content and load test them to get numbers on what load you are capable of serving. If it's not good enough, you can optimize, add caching layers, change settings, etc., and retest, which gives you hard numbers on what effect your changes had.

I've always used "ab" (ApacheBench), usually on the same server, to get a "good enough" value most of the time (it is usually optimistic). Something like:
ab http://mysite.com/wiki/Article
Be sure to test different pages, and keep in mind this simulates a logged-out user (you can simulate a logged-in user as well with some effort). There are numerous other test applications that do the same thing (e.g. siege).
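A minimal sketch of testing several pages in a loop. The hostname and page titles are placeholders, and the `ab` line is commented out so the sketch runs without a live server; uncomment it to do the real test:

```shell
# Benchmark a few representative pages (mysite.com and the page titles
# are placeholders for your own wiki).
for page in Main_Page Some_Article Special:RecentChanges; do
    url="http://mysite.com/wiki/$page"
    echo "would test: $url"
    # ab -kc 10 -t 30 "$url"
    # adding -C "wiki_session=..." (cookie name is hypothetical; copy the
    # real one from your browser) approximates a logged-in user
done
```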
Regarding the specifics of your question: when you say "10 to 100 viewers", do you mean per second, per day, at the same time, or something else? 100 viewers/day can be "easily" done in 256 MB, but 100 viewers/sec, not so much. I don't think concurrent users is a good metric for a MediaWiki site, even if you could define it exactly. Usually I look at my load tests, which give a rough number like "X pages/sec", and match that up with my expected or desired load. As long as X is significantly higher than what you want or need, you should be fine.
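As a back-of-envelope illustration of matching a daily page-view count to a pages/sec number (the half-million figure here is just an example):

```shell
# Convert a daily page-view count into an average request rate.
views_per_day=500000
avg_per_sec=$(( views_per_day / 86400 ))   # 86400 seconds in a day
echo "$avg_per_sec"    # prints 5: ~5 req/s on average; real traffic peaks well above this
```

If your load test sustains, say, 20 pages/sec, you have comfortable headroom over that average, but remember traffic is bursty, so leave a wide margin for peaks.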
As for setting up a 256 MB server, I would try something along these lines:
- Apache: very low MaxClients, probably 2-5, as each process can take 10-50 MB even with everything you don't need turned off.
- MySQL: relatively low memory settings (100 MB total), but make sure the query cache is on, with around 50 MB devoted to it.
- PHP opcode cache: make sure you have one (APC, eAccelerator, etc.).
- memcached: optional, but 10-50 MB here might help some.
- MediaWiki: ensure the file cache is on.
- lighttpd/nginx: optional, but these take very little memory and can serve all your static content (images, JS, CSS, etc.) and reduce the load/memory on Apache. You may also be able to replace Apache entirely.
- Squid: optional, and things are getting tight, but this can take a huge load off the Apache server if you can find 10-50 MB to fit it in.
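A sketch of what the Apache item might look like under the prefork MPM. The numbers are guesses to be tuned by testing, and the snippet is written to a local file here for illustration rather than into /etc/apache2:

```shell
# Keep the Apache process count tiny on a 256 MB box: with PHP loaded,
# each prefork child can take 10-50 MB. Written to a local file so this
# sketch is safe to run anywhere.
cat > mpm_prefork_sketch.conf <<'EOF'
<IfModule mpm_prefork_module>
    StartServers          1
    MinSpareServers       1
    MaxSpareServers       2
    MaxClients            4
    MaxRequestsPerChild 500
</IfModule>
EOF
grep MaxClients mpm_prefork_sketch.conf
```

MaxRequestsPerChild is set low so leaky PHP processes get recycled before they bloat.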
As a point of reference, I ran the setup listed above on a low-end 512 MB RAM box that served half a million page views a day, albeit slowly (5-10 second page load times). I think I hit a sweet spot in performance entirely by chance with that setup, though. As you set up and test your site, make sure you never hit swap, as this will kill your performance (check with "top", "free -m", or "vmstat").
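A quick sketch of that swap check, reading straight from /proc on Linux ("free -m" shows the same data):

```shell
# If SwapTotal - SwapFree grows while you load test, you are swapping
# and performance will fall off a cliff.
awk '/^SwapTotal|^SwapFree/ {print $1, $2, "kB"}' /proc/meminfo
# vmstat 1 5   # alternatively: the si/so columns should stay at 0
```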
There's no end of possible improvements or things to try, so I would repeat the mantra: Test, test, test!
In my previous message the ApacheBench command should have been:
ab -kc 10 -t 30 http://mysite.com/wiki/Article
which uses 10 concurrent connections with keep-alive and tests for 30 seconds. This is usually short enough not to impact a live site but long enough to give meaningful results. You can play around with the concurrency and the many other options as you wish.