Tomasz Chmielewski wrote:
> According to
> http://www.mediawiki.org/wiki/Manual:%24wgArticleRobotPolicies, one can
> set robot policies for individual pages with $wgArticleRobotPolicies.
> However, it doesn't work for me, i.e., when I add:
>
>   $wgArticleRobotPolicies = array( 'Special:Categories' => 'index,follow' );
>
> I still get noindex,nofollow on that page.
Special pages are in control of their own robots settings, so that one
probably can't be overridden here.
> It is the same for other, non-special pages - setting anything with
> $wgArticleRobotPolicies for any page doesn't seem to have any effect.
>
> Am I missing anything? I use 1.12.0.
Seems to work fine for me:

  $wgArticleRobotPolicies = array( 'Main Page' => 'testing' );

outputs

  <meta name="robots" content="testing" />

on my main page. Please confirm that you're not looking at a cached
page, and that you're setting a non-default value (MediaWiki won't
bother outputting the tag if it's the default 'index,follow').
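For reference, a minimal LocalSettings.php sketch along those lines (the
page titles here are just illustrative; titles must match exactly, with
spaces rather than underscores):

```php
<?php
// LocalSettings.php (fragment) -- a sketch, assuming MediaWiki 1.12+

// Per-page robot policies, keyed by exact page title.
$wgArticleRobotPolicies = array(
    'Main Page'      => 'noindex,follow',    // non-default, so the <meta> tag is emitted
    'Talk:Main Page' => 'noindex,nofollow',
);

// Note: the default 'index,follow' produces no <meta name="robots"> tag
// at all, so test with a non-default value, and bypass the parser cache
// when checking (e.g. append ?action=purge to the page URL).
```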
-- brion vibber (brion @ wikimedia.org)