[Mediawiki-l] Optimization of Rewriting Rules?

mediawiki-l at Wikimedia.org
Wed Feb 16 17:41:42 UTC 2005


"Martin" == Martin Steiger <southernapproachwiki at gmail.com> writes:

> Right now, I'm using the following working rewriting rules for my
> SouthernApproachWiki (http://www.southernapproach.ch/wiki/); however,
> they are sometimes slow and might contain errors. Google, for
> example, hasn't indexed my wiki so far, most likely because of the
> rewriting rules.
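
 For reference, the short-URL setup most MediaWiki installs use boils
down to a couple of mod_rewrite lines. Since your actual rules aren't
quoted above, this is only a sketch; the /wiki/ and /w/ paths are
assumptions, so adjust them to your layout:

    # .htaccess at the web root (or the equivalent vhost config)
    RewriteEngine On
    # Leave real files and directories (images, skins) alone
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    # Map /wiki/Article_name onto the MediaWiki entry point
    RewriteRule ^wiki/(.*)$ w/index.php?title=$1 [L,QSA]

 Rules of this shape are cheap for Apache to evaluate and won't keep a
crawler out by themselves.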

 More likely because of a missing robots.txt:

$ wget http://www.southernapproach.ch/wiki/robots.txt
--18:33:18--  http://www.southernapproach.ch/wiki/robots.txt
           => `robots.txt'
Resolving www.southernapproach.ch... 217.26.52.26
Connecting to www.southernapproach.ch[217.26.52.26]:80... connected.
HTTP request sent, awaiting response... 301 
Location: http://www.southernapproach.ch/wiki/Robots.txt [following]
--18:33:19--  http://www.southernapproach.ch/wiki/Robots.txt
           => `Robots.txt'

 As you can see, there's no valid robots.txt; what gets served instead
(a redirect to the wiki page Robots.txt) may well be enough to confuse
the bot.
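
 If a catch-all rewrite rule is what's turning robots.txt into a wiki
page, the fix is twofold. First, serve a real robots.txt from the
document root; something minimal like this (the /w/ script path is an
assumption) already tells bots what to crawl:

    # robots.txt - keep bots out of the script path,
    # let them crawl the article URLs
    User-agent: *
    Disallow: /w/

 Second, exempt it from rewriting by putting a pass-through rule
before the catch-all:

    # Serve robots.txt as-is instead of rewriting it
    RewriteRule ^robots\.txt$ - [L]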

 If, on the other hand, this is not the problem, and you have in fact
made Google aware of your wiki's existence, the next potential problem
is that your oldest article is from 3 January 2005. It is not unusual
for Google to take a few months before listing a new site.

 You may also want to link to your wiki from a few places, just to
speed things up a bit. A logical place would be
[[meta:Sites using MediaWiki]].

-- 
/Wegge <http://wiki.wegge.dk/>
<http://wiki.wegge.dk/Folk_jeg_ignorerer_p%C3%A5_usenet>
mailto:awegge at gmail.com - Invitations on an FCFS basis


