Hello,
I recently upgraded to 1.14.0rc1. But now, every non-existing page (its link is red) is no longer run through the PHP interpreter; instead, my browser offers me a "file download".
The content of such a "downloaded" file (index.php) looks like this:
~~~~~~~~~~~~~~~~~~~~
[Process]
Type=Edit text
Engine=MediaWiki
Script=http://www.mydomain.example/wiki/index.php
Server=http://www.mydomain.example
Path=/wiki
Special namespace=Spezial

[File]
Extension=wiki
URL=http://www.www.mydomain.example/wiki/index.php?title=Benutzer:prename.name&action=edit&internaledit=true
~~~~~~~~~~~~~~~~~~~~
TIA for any hint, Peter
Peter Velan wrote:
I recently upgraded to 1.14.0rc1. But now, every non-existing page (its link is red) is no longer run through the PHP interpreter; instead, my browser offers me a "file download". [...]
Disable the option 'Use external editor by default' (Externen Editor als Standard benutzen) in your preferences.
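If you administer the wiki, the feature can also be switched off site-wide instead of per-user; a minimal sketch for LocalSettings.php, assuming $wgUseExternalEditor still behaves as it does in 1.14:

~~~~~~~~~~~~~~~~~~~~
# LocalSettings.php
# Disable the external editor interface for all users, so edit links
# always open the normal in-browser editor instead of serving an
# external-editor control file like the one shown above.
$wgUseExternalEditor = false;
~~~~~~~~~~~~~~~~~~~~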
On 09.02.2009 23:33, Platonides wrote:
Disable the option 'Use external editor by default' (Externen Editor als Standard benutzen) in your preferences.
Wow! My problem is solved - thanks a lot.
Peter
Hi, I am clueless when it comes to MediaWiki customization. Does anyone think that making red links "nofollow" would be a good idea, to prevent robots from crawling non-existing pages?
PM Poon
On Tue, Feb 10, 2009 at 2:34 AM, Peter Velan pv0001@dynapic.net wrote:
I recently upgraded to 1.14.0rc1. But now, every non-existing page (its link is red) is no longer run through the PHP interpreter; instead, my browser offers me a "file download". [...]
ekompute wrote:
Does anyone think that making red links "nofollow" would be a good idea, to prevent robots from crawling non-existing pages? [...]
Well, preventing robots from crawling non-existent pages seems like a good idea. Red-link URLs go through $wgScript, so they are usually already blocked by robots.txt.
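For reference, a minimal sketch of what "blocked by robots.txt" usually looks like, assuming the common Wikipedia-style layout where articles are rewritten to /wiki/Article and $wgScript is /w/index.php (paths will differ on other setups, including the original poster's /wiki/index.php):

~~~~~~~~~~~~~~~~~~~~
# robots.txt at the web root
User-agent: *
# Block everything served through the script path (the edit URLs behind
# red links, action=history, etc.) while /wiki/Article stays crawlable.
Disallow: /w/
~~~~~~~~~~~~~~~~~~~~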
On Tue, Feb 10, 2009 at 8:37 AM, ekompute ekompute@gmail.com wrote:
Does anyone think that making red links "nofollow" would be a good idea, to prevent robots from crawling non-existing pages? [...]
They shouldn't be following them anyway, since those pages now return a 404 status code to anyone who tries to access a red link.
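You can verify that from a shell; a quick sketch with curl (the host is the poster's placeholder, the page title is made up). If the 404 behaviour described above is in place, the status line should read 404 Not Found:

~~~~~~~~~~~~~~~~~~~~
# Request only the response headers and show the status line.
$ curl -sI "http://www.mydomain.example/wiki/index.php?title=Some_missing_page" | head -n 1
HTTP/1.1 404 Not Found
~~~~~~~~~~~~~~~~~~~~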