A discussion just came up on the tech list that deserves input from the list at large: how, if at all, do we want to restrict robot access to Wikipedia's special pages, edit pages, and the like?
There are two issues to decide for each class of page: do we want robots to index the page (i.e., list it among those to be searched), and do we want robots to follow its links in search of other pages? (These correspond to the "noindex" and "nofollow" directives in the robots meta tag.)
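For reference, these directives live in a meta tag in the page's HTML head. A page we wanted excluded from search indexes but still crawled for links would carry something like:

    <meta name="robots" content="noindex,follow">

The "index"/"noindex" and "follow"/"nofollow" values can be combined as needed, and a page with no robots tag at all is treated by crawlers as "index,follow".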
There is some concern that allowing robots to index everything might be a performance issue, but there is no evidence of this, and robots clearly provide invaluable traffic boosts. Also, I think it's important that we maintain a reputation for having _quality_ links, not merely popular ones.
My own opinion leans toward not indexing pages that are input forms, namely edit forms, login forms, the user settings form, etc., because these pages aren't "information" and don't contain anything that isn't already on a higher-quality page. I see no compelling reason to disallow following links on any page, though.
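Concretely (and this is just a sketch of one way to tag things, not a tested configuration), under that policy each of the form pages would emit

    <meta name="robots" content="noindex,follow">

while ordinary article pages would carry no robots tag at all, which crawlers treat the same as "index,follow".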
I'm of two minds about "recent changes". Theoretically, it's a dynamic page that makes all indexing useless, because its content changes every time it is accessed. On the other hand, it's a popular page and a good source of timely links.
What do you think?