Thanks - A list of pages that need fixing is not a problem - it's pretty much a one-man wiki at the moment, so most of the content will need to be converted anyway.
To add a bit of confusion to the issue, however, I've noticed that the system messages are also encoded as ISO-8859-1 and thus display badly in UTF-8. They haven't even been customized through the wiki, and I've tried clearing the l10n_cache table. I'm not sure where it's getting non-UTF-8 versions from. Any ideas on how to go about fixing that? When I switch the page encoding to ISO-8859-1, the text displays correctly...
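For anyone following along, the conversion being discussed is mechanical at the byte level. A minimal Python sketch (Python used purely for illustration; this is not a MediaWiki script) of turning ISO-8859-1 text into UTF-8:

```python
# Text stored as ISO-8859-1 bytes: 0xFC is "ü" in that encoding.
raw = b"M\xfcnchen"

# Decoding those bytes as UTF-8 fails, which is why the wiki shows garbage:
try:
    raw.decode("utf-8")
except UnicodeDecodeError:
    pass  # 0xFC is not a valid UTF-8 sequence on its own

# The fix: decode with the real source encoding, then re-encode as UTF-8.
text = raw.decode("iso-8859-1")
utf8 = text.encode("utf-8")
print(text)  # München
print(utf8)  # b'M\xc3\xbcnchen'
```

This is the same transformation $wgLegacyEncoding is meant to apply on the fly for rows that are flagged as legacy-encoded.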
On 12/11/2013, at 13:00, mediawiki-l-request(a)lists.wikimedia.org wrote:
> From: Andru Vallance <andru(a)tinymighty.com>
> Subject: [MediaWiki-l] Character set problem
> Date: 11 November 2013 17:17:07 GMT+01:00
> To: "mediawiki-l(a)lists.wikimedia.org" <mediawiki-l(a)lists.wikimedia.org>
> Reply-To: MediaWiki announcements and site admin list <mediawiki-l(a)lists.wikimedia.org>
> I'm setting up a new wiki installation and running into some problems with garbage characters showing up due to mismatched character sets. The wiki in question is here: http://wikiausland.de/bookshop/Hauptseite
> New articles are fine and display in UTF-8 as expected, but the owner has copied over some content, presumably from an old wiki or MS Word, and it seems like it's in ISO-8859-1 and thus showing a heap of question marks for all the umlauts etc… does anyone know how I can go about converting a page from ISO-8859-1 to UTF-8 easily enough?
> I've tried setting $wgLegacyEncoding to 'ISO-8859-1' in the hope it might do the conversion for me on article save, but no joy. Are there any other options?
> Any tips would be greatly appreciated!
>  https://www.mediawiki.org/wiki/Manual:$wgLegacyEncoding
> From: Jeremy Baron <jeremy(a)tuxmachine.com>
> Subject: Re: [MediaWiki-l] Character set problem
> Date: 11 November 2013 17:38:33 GMT+01:00
> To: MediaWiki announcements and site admin list <mediawiki-l(a)lists.wikimedia.org>
> Reply-To: MediaWiki announcements and site admin list <mediawiki-l(a)lists.wikimedia.org>
> On Mon, Nov 11, 2013 at 4:17 PM, Andru Vallance <andru(a)tinymighty.com> wrote:
>> I'm setting up a new wiki installation and running into some problems with garbage characters showing up due to mismatched character sets. The wiki in question is here: http://wikiausland.de/bookshop/Hauptseite
>> New articles are fine and display in UTF-8 as expected, but the owner has copied over some content, presumably from an old wiki or MS Word, and it seems like it's in ISO-8859-1 and thus showing a heap of question marks for all the umlauts etc… does anyone know how I can go about converting a page from ISO-8859-1 to UTF-8 easily enough?
>> I've tried setting $wgLegacyEncoding to 'ISO-8859-1' in the hope it might do the conversion for me on article save, but no joy. Are there any other options?
> I guess he copied over into a wiki that was already utf8 and so the
> row was marked as being utf8 already when saved.
> $wgLegacyEncoding should do nothing if the row is already utf8. You
> could fix this with a bot or possibly by changing the flag in the DB
> (idk how safe that is...).
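A bot-based fix first needs a way to flag affected text. The heuristic below is my own sketch (not anything built into MediaWiki): it catches the two common symptoms, U+FFFD replacement characters and UTF-8 bytes that were misread as ISO-8859-1:

```python
# Telltale substrings of encoding damage: the Unicode replacement
# character, plus German umlauts whose UTF-8 bytes were misread as
# ISO-8859-1 ("ü" -> "Ã¼", "ö" -> "Ã¶", "ä" -> "Ã¤").
MOJIBAKE_MARKERS = ("\ufffd", "Ã¼", "Ã¶", "Ã¤")

def needs_fixing(text: str) -> bool:
    """Rough check for text damaged by an encoding mismatch."""
    return any(marker in text for marker in MOJIBAKE_MARKERS)

print(needs_fixing("M\ufffdnchen"))  # True  - replacement character
print(needs_fixing("MÃ¼nchen"))      # True  - misdecoded "ü"
print(needs_fixing("München"))       # False - clean UTF-8
```

A bot could run this over each page's wikitext (fetched via the API) to build the list of pages that need attention before touching anything in the database.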
> But the very first thing you need is a list of pages that need fixing.
> Maybe that's just as simple as listing that particular user's contributions.
Hi all, bug 42594 proposes changing the default value of $wgNoFollowLinks
<https://www.mediawiki.org/wiki/Manual:$wgNoFollowLinks> from true to
false. The status quo is that, by default, external URL links
in wiki text will be given the rel="nofollow" attribute as a hint to search
engines that they should not be followed for ranking purposes as they are
user-supplied and thus subject to spamming. If the change is implemented,
you will need to change your LocalSettings.php to switch $wgNoFollowLinks
to true if you want to keep the status quo on your wiki.
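Concretely, for anyone who wants to preserve today's behaviour if the default flips, the override is a one-liner in your site configuration:

```php
# LocalSettings.php
# Keep adding rel="nofollow" to external links in wikitext,
# regardless of what the software's default becomes.
$wgNoFollowLinks = true;
```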
The argument for the status quo is that nofollow deters spammers. The
argument for the proposed change is that it's better for the Internet as
a whole, and arguably for the individual wikis, to have the links followed
for ranking purposes. I'll focus on the arguments in favor of the change
and let others rebut them.
Suppose you run a wiki, wiki.foowidget.com, devoted to documenting your
software application, FooWidget. If you link to, say, the main
foowidget.com site or to a vendor that stocks your software, would you
not want to
improve their pagerank, since this benefits you?
The same goes for, e.g., nonprofits that are promoting a cause. If you run
CancerWiki and there are a bunch of links on your site to the American
Cancer Society and other allied causes, would you not want to increase
their pagerank? I think that in the wikisphere, what we commonly see is
wikis devoted to niche interests they are trying to promote or share
information about. The reason they link to certain websites is that a
community consensus has decided that those sites are useful for effectively
promoting, or informing people about, those topics.
If the links are spammy, then the editing community at that wiki should
revert those spam edits. If they do so promptly, then if they have any
effect on pagerank at all, it won't be for long. A well-maintained wiki
will mostly have links to good sites, and the effect of the pagerank boost
those provide will drown out the pagerank boost that goes to the
short-lived spam links.
Also, we have other antispam tools that are way more effective than
nofollow at deterring spam. Sites that mirror a wiki may not apply nofollow
anyway, in which case those links might still increase the spammers'
pagerank, regardless of your nofollow setting. It's hard to reduce the
benefits that accrue to the spammers, except by vigilantly reverting their
edits; it's easier to increase the costs that the spammers incur, by using
CAPTCHAs and the like.
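As a concrete example of raising the spammers' costs, the ConfirmEdit extension can be enabled from LocalSettings.php. The snippet below uses its QuestyCaptcha backend; the question and answer are illustrative, and you'd pick ones only humans in your niche can answer:

```php
# LocalSettings.php — ConfirmEdit with QuestyCaptcha (illustrative setup)
require_once "$IP/extensions/ConfirmEdit/ConfirmEdit.php";
require_once "$IP/extensions/ConfirmEdit/QuestyCaptcha.php";
$wgCaptchaClass = 'QuestyCaptcha';
$wgCaptchaQuestions[] = array(
    'question' => 'What software does this wiki run?',  # illustrative
    'answer'   => 'MediaWiki',
);
# Only challenge the risky actions: edits adding URLs and account creation.
$wgCaptchaTriggers['addurl']        = true;
$wgCaptchaTriggers['createaccount'] = true;
```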
$wgNoFollowLinks was introduced in MediaWiki 1.4.0 as a setting that
defaults to true, so I'm not sure that we really gave the other option much
of a chance. Also, well-designed search engines should have other measures
too for sorting out what's spammy. There should be some sort of algorithm
for identifying wikis that have been overrun by spam, much as the search
engines have ways of figuring out which sites have a bunch of links just
for SEO purposes.
Nathan Larson <https://mediawiki.org/wiki/User:Leucosticte>
Distribution of my contributions to this email is hereby authorized
pursuant to the CC0 license <http://creativecommons.org/publicdomain/zero/1.0/>
I've been doing some searching and I cannot find any basic extensions for
checklists. The closest I have seen is Semantic Forms, which is a monolithic
extension requiring other extensions. How hard would it be to create an
extension that lets a user add/remove items, reorder them, and
At a wiki where I have long participated, we are looking at an issue of
bland 404 pages (following mislinks or maybe deleted redirects). The
interest is whether we could look to give the user an enhanced 404 which
tells them that the page doesn't exist, AND give the user a shortened
search result based on their landing url/page title. I am presuming that
there is some capability to construct such a page, though wondered whether
there is something in the greater wikiverse that someone has already done.
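One low-tech approach, offered as a sketch rather than a tested recipe: for titles that reach MediaWiki itself, the "no such page" text comes from the MediaWiki:Noarticletext system message, which admins can edit on-wiki to point the reader at a prefilled search, e.g.:

```
This page does not exist. You can
[{{fullurl:Special:Search|search={{PAGENAMEE}}&fulltext=1}} search for pages with similar titles]
or [{{fullurl:{{FULLPAGENAMEE}}|action=edit}} create it].
```

Genuine HTTP 404s for URLs outside the article path would still need handling at the web-server level (e.g. an ErrorDocument); that part is outside MediaWiki.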
I've been working on a wiki for a company. We are putting all their
user manuals in wiki format. Company personnel have been involved with
the content and presentation but not with the nuts-and-bolts of the editing.
I've been asked to put together a class / training session for the
company on using the wiki. These are technical people (engineers,
designers, mechanics) and their tech knowledge is high, but they really
don't have any experience with editing and maintaining wiki articles.
I'm sure I'm not the first to do this, so in the best tradition of the
Internet, I am looking for any tutorials or lesson plans I could use as
a starting point.
I've already compiled a list of the MediaWiki Help pages and written up
the documentation of my own tweaks; what I need is the basics,
introduction, examples, and so on to help me structure the class.
I'm just starting out building a Wiki and I'm a little confused. I would
like to be able to access my Wiki both within my home network and
externally over the internet. If I set $wgServer in LocalSettings.php to "
http://192.168.1.100" my intranet works as expected. Entering
"192.168.1.100" resolves to http://192.168.1.100/wiki/Main_Page and
everything works as expected, complete with the Vector theme. Obviously this
poses a problem if I try to access my wiki over the internet, as external
clients can't connect to 192.168.1.100.
If I set $wgServer to my registered domain name and try to access the wiki
externally, everything works as expected, but if I access the wiki internally
I have two different issues. If I try to use my domain name, it ends up
pointing to my router admin page, so the resolved name (i.e. with
"/wiki/Main_Page" appended) does not exist on my router. If I use the IP
address directly, i.e. http://192.168.1.100/wiki/Main_Page, the web page is
displayed but I lose the Vector theme, i.e. it just has very basic
formatting. Navigating seems to work fine and images are displayed; I just
don't get any nice formatting.
Sorry if this has rambled on, but I didn't quite know how to put it
eloquently. What configuration do I need to get my wiki to display properly
when accessed both internally and externally via the internet? I presume if
I could move my routers admin page away from port 80 this might solve the
issue but this isn't an option on my router.
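The skin breaks on the IP address because stylesheet and script URLs are built from $wgServer, so internal clients end up asking the router for them. A common workaround (a sketch; wiki.example.org stands in for the real domain, which isn't named here) is to keep the public name in $wgServer:

```php
# LocalSettings.php — always use the public name
$wgServer = "http://wiki.example.org";  # hypothetical placeholder domain
```

and then make that same name resolve to the LAN address internally, either in the router's local DNS if it allows it, or with an /etc/hosts entry on each internal machine, e.g. `192.168.1.100  wiki.example.org`. NAT loopback on the router, or moving its admin page off port 80, would also solve it, but the hosts-file route avoids touching the router at all.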