"FC" == "Faue, Caralynn" <Caralynn.Faue@mts.com> writes:
FC> Hello,
FC> I am so sorry for the intrusion, but I have come across your
FC> comment on Bugzilla
FC> (https://bugzilla.wikimedia.org/show_bug.cgi?id=8473) regarding
FC> allowing search engines to crawl specialallpages.php. I am
FC> somewhat of a MediaWiki/PHP newbie; however, our organization
FC> does have a WIMP installation of MediaWiki. I have been
FC> maintaining code in specialallpages.php that allowed Microsoft
FC> Search Server 2008 to crawl the wiki. The code is similar to this:
FC> $wgOut->setRobotpolicy( 'index,follow' );
FC> This was working, but recently it appears that my customization of
FC> this file is being ignored (when I view source, the meta tag shows
FC> noindex,nofollow again). Do you have any insight into why this
FC> might be happening? I am not sure exactly when it started, but I
FC> did notice it after our upgrade to MediaWiki version 1.13.3. I
FC> have verified that the source code did not get overwritten during
FC> the upgrade; $wgOut->setRobotpolicy( 'index,follow' ); is still
FC> contained in specialallpages.php.
FC> Again, I am sorry to just email you directly. I have been
FC> watching the Bugzilla site for updates, but none have been posted.
I'll Cc the mailing list. I ended up using sitemaps, but I would love to stop generating sitemaps if the aforementioned bug were fixed. (I never stray outside of LocalSettings.php with my changes.)
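One thing worth checking, as a guess: around the 1.12/1.13 releases the canonical method became setRobotPolicy() (capital P), and core code for special pages sets noindex,nofollow itself, which may now win over an edit made directly in the special-page file. A LocalSettings.php-only workaround along these lines might help, since the BeforePageDisplay hook runs just before output, i.e. after core has applied its own policy. This is an untested sketch; the hook name and the Title::isSpecial() helper are what I believe existed in 1.13, but verify against your version:

```php
# In LocalSettings.php -- re-apply the robot policy late, without
# touching core files.
$wgHooks['BeforePageDisplay'][] = 'wfAllowIndexingOfAllpages';

function wfAllowIndexingOfAllpages( &$out ) {
	global $wgTitle;
	// Only loosen the policy for Special:Allpages.
	if ( $wgTitle && $wgTitle->isSpecial( 'Allpages' ) ) {
		// Note the capital P: setRobotpolicy() was renamed to
		// setRobotPolicy(); the old spelling may be deprecated or gone.
		$out->setRobotPolicy( 'index,follow' );
	}
	return true; // let other hooks run
}
```

That would also explain the observed symptom: the call in specialallpages.php still executes, but something later in the request overwrites the policy before the meta tag is emitted.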
FC> Thanks in advance,
FC> Caralynn Faue
FC> Application Developer
FC> MTS Systems Corporation
http://perishablepress.com/press/2008/06/03/taking-advantage-of-the-x-robots... looks like a potentially useful way to override the meta tag MediaWiki emits in the body, by sending the directive in a custom HTTP header instead...
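If going the X-Robots-Tag route, the header could also be sent from LocalSettings.php using plain PHP's header(), again via a late hook. The hook and function name here are illustrative, not from the article; and note one caveat: major crawlers are generally documented to honor the *most restrictive* directive they see, so a noindex in the body's meta tag may still win over an index,follow header:

```php
# In LocalSettings.php -- emit an X-Robots-Tag HTTP header for
# Special:Allpages (sketch, untested).
$wgHooks['BeforePageDisplay'][] = 'wfSendXRobotsTag';

function wfSendXRobotsTag( &$out ) {
	global $wgTitle;
	if ( $wgTitle && $wgTitle->isSpecial( 'Allpages' ) ) {
		// Plain PHP header() is safe here; the hook fires before
		// any page content is sent.
		header( 'X-Robots-Tag: index,follow' );
	}
	return true;
}
```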
wikitech-l@lists.wikimedia.org