Has anyone experimented with "per-user categories" as an extension? The commercial wiki Confluence has this feature: any category (called a "label") that begins with "my:" is local to the current user. So people can tag pages with their own categories for their own reference, and these categories are not visible to other users.
So if you added to an article:
[[category:my:important]]
this would create a category "my:important" visible only to the logged-in user who created it, letting people build their own locally organized sets of articles.
DanB
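For context: MediaWiki core has no such feature. A rough sketch of one way an extension could hide private labels at display time follows. The OutputPageMakeCategoryLinks hook is a real MediaWiki hook (check that your release ships it), but the "My:<username>:<label>" naming convention and the function name efHidePrivateCategories are assumptions invented for this example:

    # LocalSettings.php sketch -- the hook name is real, everything else
    # is illustrative; verify signatures against your MediaWiki version
    $wgHooks['OutputPageMakeCategoryLinks'][] = 'efHidePrivateCategories';

    function efHidePrivateCategories( &$out, $categories, &$links ) {
        global $wgUser;
        $sk = $wgUser->getSkin();
        foreach ( $categories as $name => $type ) {
            # Assumed convention: "My:<username>:<label>" marks a
            # private label owned by <username>
            if ( preg_match( '/^My:([^:]+):/i', $name, $m )
                && strcasecmp( $m[1], $wgUser->getName() ) !== 0 ) {
                continue; # viewer is not the owner: hide the label
            }
            $title = Title::makeTitle( NS_CATEGORY, $name );
            $links[$type][] = $sk->makeLinkObj( $title,
                htmlspecialchars( $title->getText() ) );
        }
        return false; # false tells core we built the link list ourselves
    }

Note this only hides the rendered links; the labels would still appear in the raw wikitext and in category listings, which is why a complete extension would more likely store private labels in its own table.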
Hi,
I have successfully moved my wiki site to another location and upgraded from 1.6.5 to 1.11.0. The new site works well, except for the speed of generating pages. Even small pages take 10 seconds to appear; 30 seconds is no exception. To try to improve the speed, I wonder:
- how can I determine what causes the delay (Apache, PHP, memory, MySQL)?
- if it's PHP, what caching software do you suggest I use?
Thanks in advance
Hi,
Most of your pages load in less than 2 seconds for me.
Alain
Thank you, Alain, for testing. But you tested the 1.6.5 version of the site; I decided to publish the 1.11.0 site only once it meets quality standards. Ruud
Hi,
I would recommend eAccelerator ( http://eaccelerator.net/ ). Although it's not the fastest one, the performance boost is big, and it is easy to use. ;) Good luck :)
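For reference, eAccelerator is configured in php.ini; a minimal example follows. The directive names are eAccelerator's own, but the extension path, shared-memory size, and cache directory are placeholders to adjust for your system:

    ; php.ini -- load eAccelerator (example path, adjust to your install)
    zend_extension = "/usr/lib/php/modules/eaccelerator.so"
    eaccelerator.enable      = "1"
    eaccelerator.optimizer   = "1"   ; also run the bytecode optimizer
    eaccelerator.shm_size    = "32"  ; shared memory for the cache, in MB
    eaccelerator.cache_dir   = "/var/cache/eaccelerator"
    eaccelerator.check_mtime = "1"   ; recompile when a source file changes

After editing php.ini, restart Apache and look for the eaccelerator section in phpinfo() output to confirm the cache is active.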
OK, I will give it a try. Ruud
Small pages taking 10 seconds to load seems excessive to me, suggesting there is some underlying problem. Using an accelerator doesn't seem like the ideal solution.
There is a profiler built into MediaWiki that should tell you where things are running slowly; from there it might be possible to work out what the problem is. I've never used it, but there are instructions here: http://www.mediawiki.org/wiki/How_to_debug#Profiling
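For anyone following that link: in MediaWiki releases of this era, profiling is switched on by creating a StartProfiler.php file in the wiki root, and general debug output goes wherever $wgDebugLogFile points. A minimal sketch follows; the available Profiler classes vary by release, so list the includes/Profiler*.php files in your copy and adjust the class name accordingly:

    <?php
    # StartProfiler.php in the wiki root (sketch -- class and file
    # names changed between releases; check your includes/ directory)
    require_once( dirname( __FILE__ ) . '/includes/Profiler.php' );
    $wgProfiler = new Profiler;

    # LocalSettings.php -- $wgDebugLogFile must point somewhere the web
    # server user can write, a common reason a debug file stays empty
    $wgDebugLogFile = '/tmp/wiki-debug.log';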
After struggling for some time with the profiler (my debug file never showed any content), I tried eAccelerator, knowing this might (not) be the answer. But it was: it increased the speed significantly, whatever the cause was. Thanks!
You're probably just hiding the problem, though. eAccelerator will just speed up viewing pages that have already been parsed; it won't speed up viewing pages for the first time (and that's the first time for each logged-in user, I expect, so the difference may be minimal if you test with other accounts).
This is completely incorrect, eAccelerator does nothing of the kind. It is a bytecode cache, and as such works in the same way as the bytecode cache we usually recommend: APC. It doesn't cache pages, it caches compiled source code. This is undeniably a good thing.
You can store the parser cache in eAccelerator instead of in the database, but the performance improvement from that would be small, if anything; Ruud has not done it, and nobody has recommended that he should.
-- Tim Starling
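For anyone who does want to try what Tim describes, the relevant LocalSettings.php settings look like the sketch below. CACHE_ACCEL tells MediaWiki to use the shared memory of whichever accelerator is loaded (APC, eAccelerator, and so on); the bytecode caching itself needs no MediaWiki configuration at all, since it happens below the application:

    # LocalSettings.php -- keep MediaWiki's object cache, and optionally
    # the parser cache, in the accelerator's shared memory
    $wgMainCacheType   = CACHE_ACCEL;
    $wgParserCacheType = CACHE_ACCEL; # optional; default keeps it in the DB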
OK, I need to read things more fully in the future, sorry. My point still stands, though: it shouldn't be taking 10 seconds to parse a small page, surely?
Hi!
My test deployment is serving pages at ~0.03s. :) Most of that is described at http://dammit.lt/2007/01/26/mediawiki-performance-tuning/
And re the 10s: never overestimate the resources provided by various virtual hosts. :-) What is the time reported at the bottom of the page source?
Best regards,
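The time being asked about here is the comment MediaWiki appends to every rendered page; viewing the HTML source of a page should show a line like the following (the figure itself is an example):

    <!-- Served in 0.123 secs. -->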