According to http://www.mediawiki.org/wiki/Manual:%24wgArticleRobotPolicies, one can set robot policies for individual pages with $wgArticleRobotPolicies.
However, it doesn't work for me, i.e., when I add:
$wgArticleRobotPolicies = array( 'Special:Categories' => 'index,follow' );
I still get noindex,nofollow on that page.
It is the same for other, non-special pages: setting $wgArticleRobotPolicies for any page doesn't seem to have any effect.
Am I missing something? I'm using MediaWiki 1.12.0.
Tomasz Chmielewski wrote:
> According to http://www.mediawiki.org/wiki/Manual:%24wgArticleRobotPolicies, one can set robot policies for individual pages with $wgArticleRobotPolicies.
> However, it doesn't work for me, i.e., when I add:
> $wgArticleRobotPolicies = array( 'Special:Categories' => 'index,follow' );
> I still get noindex,nofollow on that page.
Special pages are in control of their own robots settings, so this is probably not overridable here.
> It is the same for other, non-special pages: setting $wgArticleRobotPolicies for any page doesn't seem to have any effect.
> Am I missing something? I'm using MediaWiki 1.12.0.
Seems to work fine for me:
$wgArticleRobotPolicies = array( 'Main Page' => 'testing' );
outputs
<meta name="robots" content="testing" />
on my main page. Please confirm that you're not looking at a cached page, and that you're setting a non-default value (it won't bother outputting it if it's the default 'index,follow').
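To rule caching out entirely while you test, you can switch the caches off in LocalSettings.php for a moment. A quick sketch, assuming the 1.12-era cache variables:

# Temporarily disable caching while checking the emitted <meta> tag.
$wgEnableParserCache = false; # don't serve parser-cached page HTML
$wgUseFileCache = false;      # don't serve pages from the static file cache

Then shift-reload the page so your browser doesn't hand you its own cached copy either.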
-- brion vibber (brion @ wikimedia.org)
Brion Vibber wrote:
> Tomasz Chmielewski wrote:
>> According to http://www.mediawiki.org/wiki/Manual:%24wgArticleRobotPolicies, one can set robot policies for individual pages with $wgArticleRobotPolicies.
>> However, it doesn't work for me, i.e., when I add:
>> $wgArticleRobotPolicies = array( 'Special:Categories' => 'index,follow' );
>> I still get noindex,nofollow on that page.
> Special pages are in control of their own robots settings, so this is probably not overridable here.
In other words, for special pages I would have to change PHP code, I guess?
>> It is the same for other, non-special pages: setting $wgArticleRobotPolicies for any page doesn't seem to have any effect.
>> Am I missing something? I'm using MediaWiki 1.12.0.
> Seems to work fine for me:
> $wgArticleRobotPolicies = array( 'Main Page' => 'testing' );
> outputs
> <meta name="robots" content="testing" />
> on my main page. Please confirm that you're not looking at a cached page, and that you're setting a non-default value (it won't bother outputting it if it's the default 'index,follow').
I'm not looking at a cached page. I even disabled all extensions...
And here is why it didn't work for me - $wgArticleRobotPolicies has to be placed *before* require_once( "includes/DefaultSettings.php" ):
This one will work:
require_once( "includes/DefaultSettings.php" );
$wgArticleRobotPolicies = array( 'Some_Page' => 'testing' );
This one will NOT work:
$wgArticleRobotPolicies = array( 'Some_Page' => 'testing' );
require_once( "includes/DefaultSettings.php" );
And indeed, it doesn't work for special pages, which is a bit of a pity.
Tomasz Chmielewski wrote:
> Brion Vibber wrote:
>> Special pages are in control of their own robots settings, so this is probably not overridable here.
> In other words, for special pages I would have to change PHP code, I guess?
Looks like.
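If you want to poke at it, the policy gets set by the special page itself, directly on the output page. A rough, hypothetical sketch of the pattern (not the actual Special:Categories code; in 1.12 the method may be spelled setRobotpolicy(), though PHP method names are case-insensitive anyway):

class SpecialExamplePage extends SpecialPage {
    function execute( $par ) {
        global $wgOut;
        $this->setHeaders();
        # The page pins its own policy on the output object here,
        # which is why $wgArticleRobotPolicies can't override it.
        $wgOut->setRobotPolicy( 'noindex,nofollow' );
        $wgOut->addWikiText( 'Page body goes here.' );
    }
}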
> And here is why it didn't work for me - $wgArticleRobotPolicies has to be placed *before* require_once( "includes/DefaultSettings.php" ):
*after*, not *before* :)
> This one will NOT work:
> $wgArticleRobotPolicies = array( 'Some_Page' => 'testing' );
> require_once( "includes/DefaultSettings.php" );
This is the same as with all MediaWiki settings: DefaultSettings.php initializes every configuration variable to its default value, so anything you assign before the require_once simply gets overwritten.
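So the general shape of a 1.12-era LocalSettings.php is (the page title and policy below are placeholders):

<?php
# LocalSettings.php: defaults load first, all overrides come after.
require_once( "includes/DefaultSettings.php" );

# Anything assigned above the require_once would just be reset to the
# default (an empty array, in the case of $wgArticleRobotPolicies).
$wgArticleRobotPolicies = array( 'Some_Page' => 'noindex,nofollow' );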
-- brion vibber (brion @ wikimedia.org)