Options in my current LocalSettings.php for caching/compression:
$wgMainCacheType = CACHE_ACCEL;
$wgMemCachedServers = [];
$wgMessageCacheType = CACHE_ACCEL;
$wgUseFileCache = true; // Enable file cache
$wgCacheDirectory = "$IP/cache";
$wgDisableOutputCompression = true; // already using mod_deflate
$wgUseLocalMessageCache = false;
$wgParserCacheType = CACHE_DB;
$wgEnableSidebarCache = true;
# NO DB HITS!
$wgDisableCounters = true;
$wgMiserMode = true;
$wgRevisionCacheExpiry = 3*24*3600;
$wgParserCacheExpireTime = 14*24*3600;
I also added an .htaccess file in the skin, extension, resource/asset, and
vendor directories to set a long expiry time for browser caching.
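For reference, a minimal sketch of what such an .htaccess could look like, assuming Apache with mod_expires enabled (the 30-day value is an assumption, not from the original mail):

```apache
# Long browser-cache expiry for static assets (skins/, extensions/,
# resources/, vendor/). Requires mod_expires.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresDefault "access plus 30 days"
</IfModule>
```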
On Sat, Jul 15, 2017 at 1:34 AM, Brian Wolff <bawolff(a)gmail.com> wrote:
Yes, it's possible, but it's probably not a very good way to speed up your
wiki, and it probably would not be much faster.
For the use case you describe, I'd recommend setting up Varnish (ideally on
a separate server) in front of your wiki. There should be instructions on
how to do this on mediawiki.org. Anyone not logged in will be served by
Varnish, which is faster than MediaWiki. However, that's not an option on
a free shared host.
If Varnish is not an option, use the file cache, which is basically a crappy
version of Varnish.
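A minimal file-cache sketch for LocalSettings.php, assuming default paths (the `$wgFileCacheDirectory` value below is an illustrative assumption; the directory must be writable by the web server):

```php
<?php
// Enable MediaWiki's built-in file cache for anonymous page views.
$wgUseFileCache = true;
$wgFileCacheDirectory = "$IP/cache/html"; // where rendered pages are stored
$wgUseGzip = true; // keep gzipped copies so compressed responses are cheap
```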
Other notes about your config:
Make sure the cache directory is writable by the webserver
$wgMainCacheType should be set to CACHE_ACCEL. This will make your wiki
significantly faster and is one of the most important things you can do
(assuming APCu is set up correctly).
$wgUseLocalMessageCache should probably be false. You are already using
CACHE_ACCEL for messages; that setting probably just means they are stored
twice.
You should also ensure APCu's shared-memory size is large enough (aim for at
least 64 MB, maybe more; I'm not really sure).
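Putting those notes together, a sketch of the corrected settings (not a drop-in config):

```php
<?php
// Use APCu through CACHE_ACCEL for the main and message caches.
$wgMainCacheType = CACHE_ACCEL;
$wgMessageCacheType = CACHE_ACCEL;
// With CACHE_ACCEL already holding messages, a local file copy is redundant.
$wgUseLocalMessageCache = false;
```

For the shared-memory size, APCu kept the old `apc.` prefix for its ini directive, so in php.ini this would be something like `apc.shm_size=64M`.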
On Saturday, July 15, 2017, Iso Bar <openshift.isobar(a)gmail.com> wrote:
Hi. How can I make two wikis such that
1. wiki A can only read pages,
2. wiki B allows both reading and writing pages,
3. the pages on wiki A and wiki B are the same, and
4. they use the same MySQL database?
Is that possible? I am thinking about aggressively cutting down resource.php
and removing all extensions from the read-only wiki to make it faster.
I installed MediaWiki on a free shared host, which allows 2 cron jobs per
day that don't use too much CPU/memory.
PHP seems slow. I don't have a domain name, so I can't use Cloudflare.
I tried the file cache, enabled PHP 7 and mod_deflate via cPanel, requested
APCu and OPcache, and:
$wgMessageCacheType = CACHE_ACCEL;
$wgUseFileCache = true;
$wgCacheDirectory = "$IP/cache";
$wgUseLocalMessageCache = true;
$wgParserCacheType = CACHE_DB;
$wgEnableSidebarCache = true;
# NO DB HITS!
$wgDisableCounters = true;
$wgMiserMode = true;
$wgRevisionCacheExpiry = 3*24*3600;
$wgParserCacheExpireTime = 14*24*3600;
I didn't try the trick that serves the cached page directly without going
through PHP, because I worry things will slow down as the number of pages
increases.
Thanks in advance
_______________________________________________
MediaWiki-l mailing list
To unsubscribe, go to:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l