Sorry, I can't give you numbers, but I can describe how I would approach it.
Try JMeter or a web service like loadimpact.com to model the load on
your website. At the same time, on your server, use a monitoring tool
(anything from Unix top to Zabbix, or whatever is included in your
control panel) to measure the CPU time and memory used.
Then apply some optimizations, such as a PHP accelerator and caching,
and run the tests again.
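As a back-of-the-envelope check before load testing, the numbers in the question below already allow a rough capacity estimate. A minimal sketch (the 64 MB allowance for the OS and database is my assumption, not a measurement):

```python
# Rough capacity estimate from the figures quoted in the question below.
# Illustrative arithmetic only, not a benchmark.

TOTAL_RAM_MB = 256       # VPS memory from the question
OS_OVERHEAD_MB = 64      # assumption: base Debian system + database
MEASURED_MB = 128        # observed usage for three simultaneous page views
PAGES_MEASURED = 3

per_request_mb = MEASURED_MB / PAGES_MEASURED        # ~43 MB per Apache child
available_mb = TOTAL_RAM_MB - OS_OVERHEAD_MB
max_concurrent = int(available_mb // per_request_mb)

print(f"~{per_request_mb:.0f} MB per request, "
      f"roughly {max_concurrent} concurrent requests before swapping")
```

If numbers like these hold up under a real load test, you would also want to cap Apache's worker count (the prefork MPM's MaxClients directive) so it cannot spawn more children than fit in RAM.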
On Fri, Sep 7, 2012 at 1:36 AM, <anotst01(a)fastmail.fm> wrote:
Let's say I have 20 medium-sized wiki pages, medium-sized by Wikipedia
standards: normal articles, nothing too large.
I expect 10 to 100 viewers and at most 5 editors at the same time.
How much RAM am I going to need?
At the moment I have a VPS running Debian and Apache with 256 MB RAM. I am
still testing. When I open three different wiki pages in three different
browser tabs, this takes ~128 MB RAM and the server's RAM is full, so no
other pages can be served.
So how much RAM am I going to use per wiki page view? It doesn't have to
be exact or proven; just tell me your experiences: how many simultaneous
users you have while the server still works. If you have some numbers,
I can upgrade my server plan.
MediaWiki-l mailing list