OK, can you please stop giving 403 Forbidden for HEAD on both pages that do and don't exist. It makes testing difficult.
jidanni@jidanni.org:
> OK, can you please stop giving 403 Forbidden for HEAD on both pages that do and don't exist.
No. The entire point is to avoid sending those requests to the backend; the error comes from Squid.
- river.
On 22.02.2009 03:57:15, jidanni@jidanni.org wrote:
> OK, can you please stop giving 403 Forbidden for HEAD on both pages that do and don't exist. It makes testing difficult.
% HEAD -PS -H 'User-agent: leon' http://en.wikipedia.org/
HEAD http://en.wikipedia.org/ --> 301 Moved Permanently
Where does that make testing too hard?
Leon
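A rough curl equivalent of the test above, assuming curl is installed (-I issues a HEAD request, -A sets the User-Agent; the agent string here is only an example). With a descriptive agent set, the status line should match the 301 shown above:

  % curl -sI -A 'my-test-client/1.0 (me@example.org)' http://en.wikipedia.org/ | head -n 1
  HTTP/1.0 301 Moved Permanently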
On Sat, Feb 21, 2009 at 9:00 PM, Leon Weber <leon@leonweber.de> wrote:
> On 22.02.2009 03:57:15, jidanni@jidanni.org wrote:
> > OK, can you please stop giving 403 Forbidden for HEAD on both pages that do and don't exist. It makes testing difficult.
> % HEAD -PS -H 'User-agent: leon' http://en.wikipedia.org/
> HEAD http://en.wikipedia.org/ --> 301 Moved Permanently
> Where does that make testing too hard?
You first have to find some dude who tells you "oh, 403 probably means a wrong user agent". IMO the HTML content of the 403 page should state WHY the request failed.
Marco
"403 Forbidden http://en.wikipedia.org/wiki/HTTP_403" "X-Squid-Error: ERR_ACCESS_DENIED 0" and do a quick Google search using relevant terms and you should be able to find out the issue pretty fast.
403 might not directly tell you that you used a bad user agent, but it should be a pretty big slap in the face that you weren't doing something right. This has been discussed on the mailing list already, so a good Google search using info from your error should spit out one or two sites or archives which mashup mailing lists.
As for "HTML Content"? Head requests don't have HTML bodies. In this case there was absolutely no attempt to find more information on one's own, so you can't say it's impossible to find the information without coming to the list.
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://nadir-seen-fire.com]
-Nadir-Point & Wiki-Tools (http://nadir-point.com) (http://wiki-tools.com)
-MonkeyScript (http://monkeyscript.org)
-Animepedia (http://anime.wikia.com)
-Narutopedia (http://naruto.wikia.com)
-Soul Eater Wiki (http://souleater.wikia.com)
Marco Schuster wrote:
> On Sat, Feb 21, 2009 at 9:00 PM, Leon Weber <leon@leonweber.de> wrote:
> > On 22.02.2009 03:57:15, jidanni@jidanni.org wrote:
> > > OK, can you please stop giving 403 Forbidden for HEAD on both pages that do and don't exist. It makes testing difficult.
> > % HEAD -PS -H 'User-agent: leon' http://en.wikipedia.org/
> > HEAD http://en.wikipedia.org/ --> 301 Moved Permanently
> > Where does that make testing too hard?
> You first have to find some dude who tells you "oh, 403 probably means a wrong user agent". IMO the HTML content of the 403 page should state WHY the request failed.
> Marco
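To see the headers Daniel mentions above, it's enough to strip the User-Agent and dump the response headers of the refused request. This is only a sketch and the exact header set depends on the Squid configuration, but the status line and the X-Squid-Error value quoted above are the parts worth feeding to Google:

  % curl -sI -H 'User-Agent:' http://en.wikipedia.org/wiki/Main_Page
  HTTP/1.0 403 Forbidden
  X-Squid-Error: ERR_ACCESS_DENIED 0
  ...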
On Sun, Feb 22, 2009 at 6:49 AM, Daniel Friesen <dan_the_man@telus.net> wrote:
> As for "HTML content"? HEAD requests don't have HTML bodies. In this case there was absolutely no attempt to find more information on one's own, so you can't say it's impossible to find the information without coming to the list.
Overlooked that. But GET / POST requests do have HTML bodies. IMO the best would be if information about the UA part that triggered the filter were present in the HTML body (or in an HTTP header).
Marco
Marco Schuster wrote:
> On Sun, Feb 22, 2009 at 6:49 AM, Daniel Friesen <dan_the_man@telus.net> wrote:
> > As for "HTML content"? HEAD requests don't have HTML bodies. In this case there was absolutely no attempt to find more information on one's own, so you can't say it's impossible to find the information without coming to the list.
> Overlooked that. But GET / POST requests do have HTML bodies. IMO the best would be if information about the UA part that triggered the filter were present in the HTML body (or in an HTTP header).
Well, that's impossible without patching Squid, but we can certainly have a separate deny_info file for user agent and IP blocks. Currently we only have two deny_info files:
http://svn.wikimedia.org/viewvc/mediawiki/trunk/debs/squid/debian/errors/
-- Tim Starling
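The mapping Tim describes is a few lines of squid.conf. The following is only a sketch with invented ACL and error-page names, not Wikimedia's actual configuration (which lives in the repository linked above); a "browser" ACL matches against the User-Agent header, and deny_info points a denied ACL at a specific error page from the errors/ directory:

  # illustration only - ACL name, pattern and error page are made up
  acl blocked_ua browser -i ^(badbot|scraper)
  http_access deny blocked_ua
  deny_info ERR_BLOCKED_UA blocked_ua

With that in place, requests denied by blocked_ua would get ERR_BLOCKED_UA instead of the generic ERR_ACCESS_DENIED page.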
Maybe 405 would be more accurate if it's just the User-Agent you don't like:
[RFC2616]
405 Method Not Allowed
The method specified in the Request-Line is not allowed for the resource identified by the Request-URI. The response MUST include an Allow header containing a list of valid methods for the requested resource.
But Request-Line doesn't include User-Agent, so never mind.
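For concreteness, the Allow requirement means a compliant 405 would have to look roughly like this, with the method list naming what would actually work for the resource (the list here is illustrative only):

  HTTP/1.1 405 Method Not Allowed
  Allow: GET, HEAD, POST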
On Fri, Feb 27, 2009 at 10:23 PM, jidanni@jidanni.org wrote:
> Maybe 405 would be more accurate if it's just the User-Agent you don't like:
> [RFC2616]
> 405 Method Not Allowed
> The method specified in the Request-Line is not allowed for the resource identified by the Request-URI. The response MUST include an Allow header containing a list of valid methods for the requested resource.
> But Request-Line doesn't include User-Agent, so never mind.
And the response requirement can't be met - if the UA fails, then *no* method will work and using 405 would be inaccurate.
Marco