Agon S. Buchholz wrote:
Emufarmers Sangly wrote:
> I've tried so far (a) to mimic the setup of the Wikimedia wikis with a
> similar robots.txt,
>
Oh? Have you blocked /w/? How about special
pages?
I hope so: "User-agent: * Disallow: /w/"
The special pages sections are taken directly from the Wikimedia directives.
Have you tried "User-agent: * Disallow: /wiki/Special:"? When I wanted
to stop Google from indexing specific namespaces, that worked on my wiki
(although it took about a year before all my disallowed namespaces got
cleaned out of the Google index completely).
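For reference, putting the two suggestions together, a minimal robots.txt
along these lines might look like the following (the paths assume the
Wikimedia-style layout, with the script directory at /w/ and articles
served from /wiki/ — adjust them to your own wiki's setup). Note that each
directive goes on its own line:

```text
# Keep crawlers out of the script directory (index.php, api.php, etc.)
User-agent: *
Disallow: /w/

# Keep special pages out of search indexes
# (this matches only the English canonical namespace name; localized or
# aliased names, e.g. /wiki/Spezial:, would need their own lines)
Disallow: /wiki/Special:
```

As noted above, expect deindexing to lag well behind the robots.txt change;
already-indexed pages can take months to drop out.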
MinuteElectron.