[Mediawiki-l] Accessing Robots.txt with the Wiki in the Domain Root
Robert Leverington
lcarsdata at googlemail.com
Sun Apr 8 10:39:41 UTC 2007
robots.txt and sitemap.txt should be accessible anyway. Check that you
don't have any server trickery (rewrite rules, aliases) that could be
causing this.
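If the server's rewrite rules are routing every request to MediaWiki, the
usual fix is to exempt those two files before the wiki's catch-all rule. A
minimal sketch for an Apache .htaccess, assuming mod_rewrite and a typical
root-install catch-all (your existing rules may differ):

```apache
RewriteEngine On

# Serve robots.txt and sitemap.txt straight from disk; never hand them to the wiki
RewriteRule ^(robots\.txt|sitemap\.txt)$ - [L]

# Example catch-all: route everything else that isn't a real file or
# directory to MediaWiki's index.php (adjust to match your actual setup)
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ index.php [L]
```

The `-` target with `[L]` stops rewriting for those two paths, so Apache
serves the plain files. If they still 404, check that they actually exist in
the document root and aren't shadowed by an Alias or a rewrite in the main
server config.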
On 08/04/07, Emufarmers Sangly <emufarmers at gmail.com> wrote:
> I'm running a wiki (http://howdypedia.com/) with the setup on the root of
> the domain; I know that you're not "supposed" to do this, but here I am.
> (It's loosely based on the structure of another wiki, and changing it would
> be a last resort.)
>
> Anyway, things worked fine with this setup for a while, until recently I
> noticed that the robots.txt file for my domain wasn't accessible: it had
> been accessible the last time I checked, so I have to assume that a software upgrade changed
> something here. Does anyone know a way I could kludge (through .htaccess,
> or whatever) robots.txt (and sitemap.txt) into being accessible, without
> changing the root structure?
> _______________________________________________
> MediaWiki-l mailing list
> MediaWiki-l at lists.wikimedia.org
> http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
>
--
-- Robert.
http://roberthl.allhyper.com