[Mediawiki-l] Web page source - "strange" characters
Léa Massiot
lea.massiot at ign.fr
Tue Mar 23 14:50:25 UTC 2010
I am sorry I bothered you with the previous two posts, but problems
occurred: proxy blocking, an email error... Sniff!
lmhelp2 is the same as lmhelp...
Here is the post I wanted to submit:
-----------------------------------------------------------------------------------------------
Hi nakohdo,
Thank you for your valuable help :) .
I am using "Firefox":
in "View -> Character Encoding", "Unicode (UTF-8)" was and is checked.
> Try opening the robots.txt in your browser and change the encoding to
UTF-8.
With both "Firefox" and "Internet Explorer", the Chinese characters
aren't displayed properly (the character encoding being Unicode (UTF-8)
in both cases).
> right click, "Save target as..."
I have "Save Page As..."
So I saved the page as "robots.txt" and opened it with "MS Word".
It prompted me to choose a specific encoding:
I chose Unicode (UTF-8).
And it worked! I could see the Chinese characters and other foreign
characters too!
> The robots.txt file you mentioned in your first posting doesn't
provide a mechanism
> for telling its encoding, so the browser has to guess or take the
default settings.
So, at the beginning of the "robots.txt" file, I added the following
code (without the leading dashes):
--<html>
--<head>
--<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
--</head>
--<body>
and at the end:
--</body>
--</html>
And I changed the file extension:
robots.txt -> robots.html
Then I opened it with Firefox and IE, and the Chinese characters and
the other characters were properly rendered too!
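As a side note, before wrapping a file like this, one can check that its bytes really are valid UTF-8. A minimal sketch in Python (the helper name `is_utf8` and the use of Python are my own, not anything from this thread):

```python
def is_utf8(path):
    """Return True if the file at 'path' decodes cleanly as UTF-8."""
    with open(path, "rb") as f:
        data = f.read()
    try:
        data.decode("utf-8")
        return True
    except UnicodeDecodeError:
        return False

# Example (assuming a local robots.txt exists):
#   is_utf8("robots.txt")
```

If this returns False, the display problem is in the file itself, not in the browser's encoding guess.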
Thanks for your brains :) .
Sincerely,
--
Lmhelp