Hi,
whenever a request is sent to Wikipedia with the following User-Agent
header, the result is a 403 error:
User-Agent: W3C-checklink/4.1 [4.14] libwww-perl/5.803
The resulting error document declares an XHTML 1.0 Strict doctype, but it
is not valid XHTML 1.0 Strict - see
http://topjaklont.student.utwente.nl/invalid.html and try to validate
it.
Why is a 403 error served at all? As www.sp.nl proves, for example,
W3C-checklink can obey robots.txt. It also sleeps one second between
requests, so it does a fair job of throttling. What's the problem
then?
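For anyone who wants to reproduce this, a minimal sketch in Python (the
URL is a placeholder, and the server's behaviour may of course change)
that builds a request carrying the same User-Agent string:

```python
import urllib.request

# The User-Agent string that Wikipedia appears to reject.
UA = "W3C-checklink/4.1 [4.14] libwww-perl/5.803"

def make_request(url, user_agent=UA):
    """Build a GET request carrying the given User-Agent header."""
    return urllib.request.Request(url, headers={"User-Agent": user_agent})

# Example (placeholder URL); calling urllib.request.urlopen(req)
# against Wikipedia should then show the 403 described above.
req = make_request("http://en.wikipedia.org/wiki/Main_Page")
print(req.get_header("User-agent"))
```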
regards,
Gerrit Holl.
--
Weather in Twenthe, Netherlands 27/02 16:25:
-3.0°C wind 6.3 m/s NNE (57 m above NAP)
--
In the councils of government, we must guard against the acquisition of
unwarranted influence, whether sought or unsought, by the
military-industrial complex. The potential for the disastrous rise of
misplaced power exists and will persist.
-Dwight David Eisenhower, January 17, 1961