Hello!
I just installed MediaWiki on a new site. I've used it on other servers before and never encountered this problem. For some reason, I get "Error 330 (net::ERR_CONTENT_DECODING_FAILED): Unknown error." when I try to enter a new URL (/index.php?title=asd). This seems to be similar to the problem described here: http://www.mediawiki.org/wiki/Help_talk:Starting_a_new_page#Page_Not_Found_e...
I cannot for the life of me figure out what the problem is. I reinstalled from scratch, without any customizations, so this has to be a server config problem. It's not my server, but my guess is that it's probably something I can fix by changing some PHP option in my .htaccess.
Anyone have a good idea?
My site is at: http://www.incrediblemusicmachine.se/
Thank you,
/ Gustaf
Gustaf Josefsson
Entrepreneur, inspirer and expert at everything.
guff@guff.se | 0709-854979
Entreprenörsjakten - Sweden's most creative problem solvers - www.entreprenorsjakten.se
Works for me: http://www.incrediblemusicmachine.se/index.php?title=Abc
DanB
On 03/02/11 03:11, Gustaf Josefsson wrote:
I just installed MediaWiki on a new site. For some reason, I get "Error 330 (net::ERR_CONTENT_DECODING_FAILED): Unknown error." when I try to enter a new URL (/index.php?title=asd).
http://www.google.com/support/forum/p/Chrome/thread?tid=3388bd392b939f43&hl=en
There are lots of possible causes given in that thread. The most likely of them seems to be display_errors; you should try switching it off in php.ini.
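If php.ini itself isn't editable, the same directive can often be overridden at runtime from the wiki's own configuration. A minimal sketch for LocalSettings.php, assuming the host allows runtime overrides of these directives:

// Near the top of LocalSettings.php: keep PHP notices and warnings out
// of the response body, where they could corrupt compressed output.
ini_set( 'display_errors', 0 );
error_reporting( 0 );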
-- Tim Starling
On 2 feb 2011, at 22.59, Tim Starling wrote:
There are lots of possible causes given in that thread. The most likely of them seems to be display_errors; you should try switching it off in php.ini.
I don't have the option of changing php.ini, but I'm able to override it through .htaccess. I added this to .htaccess:
php_flag display_errors off
With no change :(
/ Gustaf
On 02/02/11 17:11, Gustaf Josefsson wrote:
I just installed MediaWiki on a new site. For some reason, I get "Error 330 (net::ERR_CONTENT_DECODING_FAILED): Unknown error." when I try to enter a new URL (/index.php?title=asd).
My site is at: http://www.incrediblemusicmachine.se/
For some reason, the edit link returns pure HTML but marks it as encoded with gzip. This confuses the browser, which is expecting gzipped content. I can reproduce it in both Safari and Firefox; IE might be able to guess its way around it.
$ curl -I -H 'Accept-Encoding: gzip' 'http://www.incrediblemusicmachine.se/index.php?title=FOOBAR'
HTTP/1.1 404 Not Found
Date: Wed, 02 Feb 2011 22:32:55 GMT
Server: Apache/1.3.37 (Unix) mod_gzip/1.3.26.1a mod_fastcgi/FSDATA-1.1 mod_jk/1.1.0 Embperl/2.0b8 mod_perl/1.29 PHP/4.4.3 mod_ssl/2.8.28 OpenSSL/0.9.8b
X-Powered-By: PHP/5.2.14
Content-language: en
Vary: Accept-Encoding,Cookie
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Cache-Control: private, must-revalidate, max-age=0
Content-Encoding: gzip
X-Powered-By: PHP/4.4.3
Content-Type: text/html
I have noticed a strange character at the top of the output:
$ curl -H 'Accept-Encoding: gzip' 'http://www.incrediblemusicmachine.se/index.php?title=FOOBAR' | hexdump -C
00000000  0a 3c 21 44 4f 43 54 59 50 45 20 68 74 6d  |.<!DOCTYPE htm|
          ^^
Not really helpful, but at least it gives the cause.
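For anyone who wants to confirm this kind of mismatch without a browser or curl, here is a rough PHP sketch that requests the page with Accept-Encoding: gzip and checks whether a body advertised as gzip actually starts with the gzip magic bytes (1f 8b). It only assumes the URL from this thread:

<?php
// gzip-check.php: fetch a URL with Accept-Encoding: gzip and verify that
// a body advertised as gzip really is gzip-compressed.
$url = 'http://www.incrediblemusicmachine.se/index.php?title=FOOBAR';

$context = stream_context_create( array( 'http' => array(
    'header'        => "Accept-Encoding: gzip\r\n",
    'ignore_errors' => true,  // keep the body even on a 404
) ) );

$body = file_get_contents( $url, false, $context );

// $http_response_header is filled in by the http:// stream wrapper.
$claimsGzip = (bool) preg_grep( '/^Content-Encoding:\s*gzip/i', $http_response_header );
$isGzip     = substr( $body, 0, 2 ) === "\x1f\x8b";  // gzip magic bytes

echo ( $claimsGzip && !$isGzip )
    ? "Broken: the header says gzip but the body is plain text.\n"
    : "Header and body agree.\n";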
Ashar Voultoiz wrote:
For some reason, the edit link returns pure HTML but marks it as encoded with gzip. I have noticed a strange character at the top of the output: a 0a byte before the DOCTYPE.
However, the Main Page is delivered compressed without errors. Check the files for that newline before the <?php tag. Have you tried to manually edit any of them? Although I find it strange that a stray newline breaks the gzipping process but doesn't also make the header() call fail.
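If you want to hunt for such a stray newline mechanically, a rough sketch that lists any PHP file not starting with the opening tag (the web-root path is just a placeholder):

<?php
// find-leading-output.php: list .php files that do not begin with "<?php",
// i.e. files that would emit whitespace before MediaWiki's header() calls.
$root = '/path/to/wiki';  // placeholder; point this at the actual install

$files = new RecursiveIteratorIterator( new RecursiveDirectoryIterator( $root ) );
foreach ( $files as $file ) {
    if ( $file->isFile() && substr( $file->getFilename(), -4 ) === '.php' ) {
        $head = file_get_contents( $file->getPathname(), false, null, 0, 5 );
        if ( $head !== '<?php' ) {
            echo $file->getPathname(), "\n";
        }
    }
}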
I'm still getting the occasional spam. I've looked over the extensions on the MediaWiki website. Are there any better captchas than the math question one? The spam bots seem to understand both versions of that one.
On 2/2/2011 7:24 PM, 2007@gmaskfx.com wrote:
I'm still getting the occasional spam. I've looked over the extensions on the MediaWiki website. Are there any better captchas than the math question one? The spam bots seem to understand both versions of that one.
http://www.mediawiki.org/wiki/Extension:ConfirmEdit lists the available captcha types.
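As one concrete example from that page, QuestyCaptcha asks site-specific questions, which generic bots tend to handle much worse than arithmetic. A rough LocalSettings.php sketch for the ConfirmEdit layout of that era; file and variable names may differ between versions, so treat this as a starting point rather than the definitive setup:

// Enable ConfirmEdit and switch the captcha type to QuestyCaptcha.
require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );
require_once( "$IP/extensions/ConfirmEdit/QuestyCaptcha.php" );
$wgCaptchaClass = 'QuestyCaptcha';

// One or more site-specific questions with their accepted answers.
$wgCaptchaQuestions[] = array(
    'question' => 'What colour is the sky on a clear day?',
    'answer'   => 'blue',
);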
Okay, cool. I want to use ReCaptcha, but after downloading the most recent snapshot I don't see the reCaptcha.php file in there?
--- On Wed, 2/2/11, Alex mrzmanwiki@gmail.com wrote:
http://www.mediawiki.org/wiki/Extension:ConfirmEdit lists the available captcha types.
-- Alex (wikipedia:en:User:Mr.Z-man)
Ah, never mind, I wasn't downloading the trunk version.
--- On Wed, 2/2/11, Alex mrzmanwiki@gmail.com wrote:
http://www.mediawiki.org/wiki/Extension:ConfirmEdit lists the available captcha types.
-- Alex (wikipedia:en:User:Mr.Z-man)
I want to configure ReCaptcha so that anonymous users trigger it on any edit and registered users trigger it only when they add URLs, but right now it appears I can't configure those two differently?
On Wed, Feb 2, 2011 at 10:33 PM, 2007@gmaskfx.com 2007@gmaskfx.com wrote:
I want to configure ReCaptcha so that anonymous users trigger it on any edit and registered users trigger it only when they add URLs, but right now it appears I can't configure those two differently?
Yes, I believe so. At some point someone should probably rewrite the permissions system for ConfirmEdit to be more granular (at the very least, the skipcaptcha right ought to work on a per-trigger basis).
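Purely to illustrate the kind of granularity being asked for, a per-trigger skip right might look something like the lines below. The syntax is hypothetical and not supported by ConfirmEdit; it only knows the single skipcaptcha right.

// Hypothetical, not supported by ConfirmEdit: skip the captcha for
// registered users on plain edits, but still challenge them on addurl.
$wgGroupPermissions['user']['skipcaptcha-edit']   = true;   // hypothetical right
$wgGroupPermissions['user']['skipcaptcha-addurl'] = false;  // hypothetical right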
Here are the settings I'm using. As far as I can tell there is no way to assign different triggers to different types of users; it's either all or nothing.
$wgCaptchaTriggers['edit']          = false;
$wgCaptchaTriggers['create']        = false;
$wgCaptchaTriggers['addurl']        = true;
$wgCaptchaTriggers['createaccount'] = true;
$wgCaptchaTriggers['badlogin']      = true;

$wgGroupPermissions['*']['skipcaptcha']             = false;
$wgGroupPermissions['user']['skipcaptcha']          = false;
$wgGroupPermissions['autoconfirmed']['skipcaptcha'] = false;
$wgGroupPermissions['bot']['skipcaptcha']           = true;  // registered bots
$wgGroupPermissions['sysop']['skipcaptcha']         = true;
--- On Wed, 2/2/11, Benjamin Lees emufarmers@gmail.com wrote:
Yes, I believe so. At some point someone should probably rewrite the permissions system for ConfirmEdit to be more granular (at the very least, the skipcaptcha right ought to work on a per-trigger basis).
Argh, it seems ReCaptcha is worse than just having the math question.
--- On Fri, 2/4/11, 2007@gmaskfx.com wrote:
Here are the settings I'm using. As far as I can tell there is no way to assign different triggers to different types of users; it's either all or nothing.
On 3 feb 2011, at 00.50, Platonides wrote:
However, the Main Page is delivered compressed without errors. Check the files for that newline before the <?php tag. Have you tried to manually edit any of them?
It does seem strange that no one else has this problem. This is an account with one of Sweden's largest hosting providers, so I can't imagine there being anything special about their setup.
Can anyone else reproduce this on a fresh install of the latest version (mediawiki-1.16.2)?
/ g
On 03/02/11 09:39, Ashar Voultoiz wrote:
I have noticed a strange character at the top of the output:
00000000  0a 3c 21 44 4f 43 54 59 50 45 20 68 74 6d  |.<!DOCTYPE htm|
That's a line break. It's not really that strange; you're allowed to have line breaks at the top of HTML documents.
The strange thing is the rest of the body. The body of the MediaWiki response has been removed and replaced with an advertisement for fsdata.se, masquerading as an error page. Instead of removing the headers from MediaWiki, it has appended to them, so for instance there are two X-Powered-By headers. The replacement was presumably triggered by the 404 error code which MediaWiki gives in this case.
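The duplicated headers are also easy to see from PHP: get_headers() with its second argument set to 1 keys the result by header name and should return repeated names, such as X-Powered-By here, as arrays. A small sketch using the URL from the thread:

<?php
// show-headers.php: dump the response headers for the failing URL.
// Headers that occur more than once come back as arrays of values.
$headers = get_headers( 'http://www.incrediblemusicmachine.se/index.php?title=FOOBAR', 1 );
var_export( $headers );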
The solution is to switch to a hosting provider that doesn't mangle your output. Or failing that, comment out the 404 header in MediaWiki. It's somewhere around line 1281 in includes/Article.php. You want to change this:
$wgRequest->response()->header( "HTTP/1.x 404 Not Found" );
To this:
// LOCAL PATCH
// Removed because it breaks on fsdata.se
// $wgRequest->response()->header( "HTTP/1.x 404 Not Found" );
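If you'd rather keep that local patch switchable from configuration instead of hard-deleted, a variation along these lines should work. $wgLocalSend404 is a made-up variable invented for this patch, not a core setting (if memory serves, later MediaWiki releases added a core option for exactly this, but 1.16 has none):

// LOCAL PATCH: send the 404 status only when it has not been disabled
// from LocalSettings.php, so the change lives in one obvious place.
// $wgLocalSend404 is a made-up variable for this patch, not core.
global $wgLocalSend404;
if ( !isset( $wgLocalSend404 ) || $wgLocalSend404 ) {
    $wgRequest->response()->header( "HTTP/1.x 404 Not Found" );
}

and then, in LocalSettings.php: $wgLocalSend404 = false;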
Gustaf Josefsson wrote:
I don't have the option of changing php.ini, but I'm able to override it through .htaccess. I added this to .htaccess: php_flag display_errors off
With no change :(
Yeah, sorry for not doing a full analysis the first time around.
-- Tim Starling
On 3 feb 2011, at 04.17, Tim Starling wrote:
The solution is to switch to a hosting provider that doesn't mangle your output. Or failing that, comment out the 404 header in MediaWiki. It's somewhere around line 1281 in includes/Article.php.
That works.
But it does seem like an ugly hack? Isn't there any way to do this by changing .htaccess? ErrorDocument 404 Default doesn't work. Any other ideas?
FSDATA can't be the only hosting provider out there that overrides its users' 404 pages. I would like to figure this out and document it properly (since I spent hours on mediawiki.org before mailing here).
-g
On 03/02/11 20:08, Gustaf Josefsson wrote:
But it does seem like an ugly hack?
It's not that ugly. The 404 header was only introduced in MediaWiki 1.14; you're just reverting to the 1.13 behaviour, which worked well enough for many years. See bug 2585 for a discussion.
Isn't there any way to do this by changing .htaccess? ErrorDocument 404 Default doesn't work. Any other ideas?
Without any source code or documentation, we can only speculate. The fact that the implementation is faulty tells you that this is a hack applied by fsdata.se; it's not part of Apache. I doubt they would have bothered adding configuration code.
FSDATA can't be the only hosting provider out there that overrides its users' 404 pages. I would like to figure this out and document it properly (since I spent hours on mediawiki.org before mailing here).
It's the first time I've heard of it. I would expect most providers to just use ErrorDocument, not to write their own Apache module.
-- Tim Starling