On 17.09.2008, 23:37 Jens wrote:
Hi,
The robots.txt file can now be customized for all wikis. There's a central robots.txt file, common to all projects, which is maintained by Wikimedia's server administrators.
Additionally, it is now possible to add custom entries on a per-wiki basis. Add lines to the MediaWiki:Robots.txt page on your project and they will be served as part of robots.txt, too.
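
For example, to keep a single crawler out of one wiki, you could put something like this on that wiki's MediaWiki:Robots.txt page (SomeBot is a made-up name, just to illustrate the mechanism):

    # Keep this one crawler away from the whole wiki
    User-agent: SomeBot
    Disallow: /

These lines are then served together with the globally maintained rules in the robots.txt the server sends out.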
Best regards,
JeLuF
Thanks, it's extremely useful!
However, the two parts are currently concatenated in the wrong order: local first, then global. Since most crawlers obey only the first group that matches their user agent, a "User-agent: *" group in the local list (needed to set rules for every spider) causes them to ignore all the rules in the global list. That should be easy to fix: just swap the two parts.
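
To illustrate with made-up rules, the served file currently looks roughly like this:

    # local part, from MediaWiki:Robots.txt
    User-agent: *
    Disallow: /wiki/Special:

    # global part, maintained by the server admins
    User-agent: *
    Disallow: /w/

A crawler that matches the first "User-agent: *" group stops there and never applies the global "Disallow: /w/" rule. With the global part first, every spider would at least see the centrally maintained rules.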