wojtek pobratyn wrote:
a useful thing. An issue worth considering is banning robots from special pages — Recent Changes, Talk, User, etc. — via an appropriate META tag. IMHO, robots and spiders should only be allowed to go over the articles.
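For reference, the standard robots exclusion META tag for that would look something like this (a sketch; the exact set of pages to exclude is up to us):

```html
<!-- In the <head> of each special page (RecentChanges, Talk:, User:, etc.):
     noindex = don't add this page to the index,
     nofollow = don't follow links from it -->
<meta name="robots" content="noindex,nofollow">
```

Well-behaved spiders honor this per-page tag; a site-wide robots.txt with Disallow lines for the special-page URLs would accomplish the same thing in one place.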
Google's algorithm these days takes notice of pages that change rapidly. I think they spider RecentChanges a lot, to get current pages. I think we should allow for that.
Also, in my experience, "professional" robots (google, altavista, inktomi, etc.) are *overly* polite, hitting the site only once per minute, etc. Every now and then I'll find (at Bomis) some jerk who wrote a homemade bot that's hammering us.
But really, we should be able to handle the occasional spider with no difficulty.