On our intranet, our MediaWiki system is spidered each day by our web crawler. Unfortunately, this throws off the hit totals in Special:Statistics. Is there any way to prevent spider hits from being counted by Special:Statistics?
DanB
Hi!
> On our intranet, our MediaWiki system is spidered each day by our web crawler. Unfortunately, this throws off the hit totals in Special:Statistics. Is there any way to prevent spider hits from being counted by Special:Statistics?
I don't want to sound radical, but $wgDisableCounters = true; is what I always recommend to everyone. It also cuts execution time noticeably (per http://dammit.lt/2007/01/26/mediawiki-performance-tuning/ :-)
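For anyone who wants to try it, a minimal sketch of the LocalSettings.php change (this assumes a MediaWiki version that still ships the page view counter):

    # LocalSettings.php -- turn off MediaWiki's page view counters.
    # With this set, page views are no longer recorded, so the hit
    # totals in Special:Statistics stop accumulating at all
    # (spider-inflated or otherwise).
    $wgDisableCounters = true;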
BR,
Daniel Barrett wrote:
> On our intranet, our MediaWiki system is spidered each day by our web crawler. Unfortunately, this throws off the hit totals in Special:Statistics. Is there any way to prevent spider hits from being counted by Special:Statistics?
DanB
Maybe you can get your spider to add ?dontcountme=s to each wiki URL? Alternatively, hack the code to skip hits based on the User-Agent header.
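If patching core is unappealing, one low-risk variant of the User-Agent idea is to flip the counter off per request from LocalSettings.php, which is plain PHP that runs on every page view. A hypothetical sketch; "OurIntranetCrawler" is a placeholder for whatever User-Agent string your spider actually sends:

    # LocalSettings.php -- hypothetical sketch: disable counters only
    # for requests coming from the intranet spider. Substitute your
    # crawler's real User-Agent substring for "OurIntranetCrawler".
    $ua = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';
    if ( stripos( $ua, 'OurIntranetCrawler' ) !== false ) {
        $wgDisableCounters = true;  # spider traffic: skip hit counting
    }

That way normal readers still get counted and only the spider's hits are dropped.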