On Jan 26, 2008 2:01 PM, Thomas Dalton thomas.dalton@gmail.com wrote:
On 26/01/2008, Thomas Dalton thomas.dalton@gmail.com wrote:
Using referers isn't necessary. http://en.wikipedia.org/w/index.php?title=Gil_Prescott&action=edit is different from http://en.wikipedia.org/wiki/Gil_Prescot. Red links point to the former, which is clearly a 200 OK. A "link from external site or typing in the url" would presumably go to the latter.
But the "Edit this page" link points to the &action=edit too, but that certainly shouldn't return any kind of error code, since it does exactly what it says on the tin.
Hang on, I've missed your point slightly there. A red link should not be a 200 OK, since it's a link to a page that doesn't exist (yes, it actually points to an edit page which does exist, but conceptually it's a broken link). A web crawler should see it as a broken link; it's only not broken if you're intending to contribute rather than just read.
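For what it's worth, it's easy to check what those two URL forms actually return. A minimal sketch, using only the Python standard library (the User-Agent string and the status_of helper are just for illustration):

import urllib.error
import urllib.request

def status_of(url: str) -> int:
    """Return the HTTP status code that a plain GET of url comes back with."""
    req = urllib.request.Request(url, headers={"User-Agent": "StatusCheckSketch/0.1"})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # urlopen raises on 4xx/5xx responses, but the code is still on the exception
        return err.code

# The two URL forms from the earlier message
edit_url = "http://en.wikipedia.org/w/index.php?title=Gil_Prescott&action=edit"
view_url = "http://en.wikipedia.org/wiki/Gil_Prescot"

print(edit_url, "->", status_of(edit_url))
print(view_url, "->", status_of(view_url))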
Web crawlers should note that it is a page which is excluded by robots.txt. It's not a broken link; it's a valid link, just one that is not meant for robots.
User-agent: *
Disallow: /w/
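A minimal sketch of how a well-behaved crawler applies that rule, using Python's standard urllib.robotparser and feeding it exactly the lines above (the "ExampleBot" user agent name is made up):

import urllib.robotparser

# Parse just the rule quoted above, rather than fetching the live robots.txt,
# so the result depends only on that rule.
rp = urllib.robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /w/"])

edit_url = "http://en.wikipedia.org/w/index.php?title=Gil_Prescott&action=edit"
view_url = "http://en.wikipedia.org/wiki/Gil_Prescot"

print(rp.can_fetch("ExampleBot", edit_url))  # False: the path starts with /w/
print(rp.can_fetch("ExampleBot", view_url))  # True: /wiki/ isn't covered by that rule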