I was careless enough to run a MediaWiki installation allowing people to
sign up without a moderator's approval. Hence a few hundred of them did
sign up and started to use the site to swap images.
How can I delete their accounts in the most expeditious way?
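MediaWiki has no supported "delete account" feature. The least painful route is usually the bundled maintenance script removeUnusedAccounts.php (run with --delete; check --help for your version), which removes accounts that have never edited. For accounts that did post, Extension:Nuke can mass-delete a user's pages first. As a last resort, a direct SQL sketch (assumptions: default table prefix, a placeholder cutoff timestamp; back up the database before trying anything like this):

```sql
-- Remove accounts registered after the signups started that never edited.
-- '20120101000000' is a placeholder cutoff in MediaWiki timestamp format.
DELETE FROM user
WHERE user_registration > '20120101000000'
  AND user_editcount = 0;
```

Accounts that uploaded images have edits and will not match this query; those need case-by-case cleanup (e.g. Nuke, then delete the account the same way once its edit count no longer matters to you).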
We've created one page called MediaWiki:CentralCSS.css. This page is supposed to "call" CSS subpages.
Then we created five CSS subpages, called:
Now we'd like the five CSS subpages to be "called" into the central one.
What should we write in MediaWiki:CentralCSS.css so that the CSS rules written in the subpages work through the central CSS page?
Something like [[MediaWiki:Reset.css]]? That is, putting the page names in brackets, [[ xxx ]]?
Thanks a lot for your help
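For reference: [[...]] link syntax does nothing inside a CSS page, but CSS's own @import does work, using the wiki's raw-page output. A sketch (the /index.php path is an assumption for a default install, and only Reset.css comes from the question; the second page name is a placeholder for your other subpages):

```css
/* MediaWiki:CentralCSS.css — one @import per subpage.
   @import rules must come before any other rules in the file. */
@import url('/index.php?title=MediaWiki:Reset.css&action=raw&ctype=text/css');
@import url('/index.php?title=MediaWiki:Layout.css&action=raw&ctype=text/css');
```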
I'm in the middle of installing a MediaWiki site.
I'd like it to work primarily for French- and English-speaking users.
For every published article, I'd like it to be possible to translate the article into the other language, and to show on the article's page a link to its translation.
I've understood that creating a page by appending a slash and the language code is how you hold an article's translation.
Example: if the default language is English, then Page1/fr holds the French translation of Page1, which is written in English.
I've also understood that manually placing a link to the translated page lets readers click through to the translation.
But how can this be automated? Is it possible to reuse what the Meta-Wiki or mediawiki.org sites do for multilingualism? And how?
The articles I've read on the subject haven't enlightened me; I don't understand them at all.
Installing "Translate" failed, and I can't work out what to do next.
I don't understand the "wiki farm" setups at all, which apparently can help with multilingualism.
In short, I'm lost!
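One low-tech way to automate the link, assuming the ParserFunctions extension is installed: transclude a template like this on every English article (a sketch; the label text is yours to choose, and the /fr pages would need the inverse version pointing back at the base page):

```wikitext
{{#ifexist: {{FULLPAGENAME}}/fr
| [[{{FULLPAGENAME}}/fr|Français]]
| ''Pas encore de traduction française''
}}
```

This shows a link only when the /fr subpage actually exists, so creating the translation is all it takes to make the link appear.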
Spam bots are creating accounts on my music wiki (http://www.blazemonger.com/GG/Special:RecentChanges). What is the best extension to prevent this? I installed ConfirmEdit and tried the default CAPTCHA (SimpleCAPTCHA) but it didn't stop the bots. Before I experiment with the many other options, I thought I'd ask what people recommend.
Fortunately the bots can't edit articles, just create useless accounts.
Apologies if you have received multiple copies.
I would appreciate any help regarding inserting custom text into articles.
The problem setting is as follows:
I am currently trying to create a wiki where each article is about a
specific entity (say, for example, a book). Above each article's body
content, I want to insert custom text (such as the author of the book, the
retail price of the book, the publisher of the book, etc) in the form of
HTML. I do not want users to be able to edit this custom text, and
thus I want to "inject" it as the content of the page gets loaded.
What would be the cleanest approach to solve this problem?
I have looked at the list of hooks available at
but was unable to find any hook that satisfied my need. I have also tried
modifying the core code (the outputPage() method of the Skin class, for
example), but things got ugly pretty quickly, and it was hard to maintain
the code. I would appreciate any help.
Thanks in advance,
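One hook that fits this is ArticleViewHeader, which fires just before the article body is rendered, so you never have to touch core code. A sketch (getBookMetadataHtml is a hypothetical helper you would write against your own data source; the namespace check is just an example condition):

```php
// LocalSettings.php (or a tiny extension file).
$wgHooks['ArticleViewHeader'][] = 'injectEntityHeader';

function injectEntityHeader( &$article, &$outputDone, &$pcache ) {
    global $wgOut;
    $title = $article->getTitle();
    if ( $title->getNamespace() === NS_MAIN ) {
        // Emitted above the article body; users can't edit it because
        // it never lives in the page's wikitext.
        $wgOut->addHTML( getBookMetadataHtml( $title->getText() ) );
    }
    return true; // allow other ArticleViewHeader handlers to run
}
```

Because the HTML never enters the editable page text, the "do not let users edit it" requirement comes for free, and upgrades won't clobber it the way core edits did.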
I tried to update our (intranet) MediaWiki to 1.18. I can browse and
read the new wiki, but I can't edit any page. After clicking the edit
tab, I get the following message:
| You do not have permission to edit this page, for the
| following reason:
| Your username or IP address has been blocked.
| The block was made by 184.108.40.206. The reason given is "ist nicht
| mehr bei OURCOPMANY".
| Start of block: 19:02, 13 November 2007
| Expiry of block: infinite
| Intended blockee: My.username
| You can contact 220.127.116.11 or another administrator to discuss
| the block. You cannot use the 'e-mail this user' feature
| unless a valid e-mail address is specified in your account
| preferences and you have not been blocked from using it. Your
| current IP address is 18.104.22.168, and the block ID is #2. Please
| include all above details in any queries you make.
| You can view and copy the source of this page: ...
The IP address "22.214.171.124" is always the same (that is the static
address I'm always coming from), and "My.username" is my wiki username
(with administrative rights). The block reason is German for "is no
longer at OURCOMPANY".
Where could I start to debug this problem?
Thanks in advance for any helpful hint,
PS: I copied our stable wiki (1.17) to a new location, then
proceeded as per http://www.mediawiki.org/wiki/Manual:Upgrading
("update.php" was successfully run via bash shell)
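A place to start: the message names "block ID #2", so the block row that was copied over from the old database should be directly inspectable. A sketch against the upgraded wiki's database (default table prefix assumed):

```sql
-- The block in the error message says "block ID #2":
SELECT ipb_id, ipb_address, ipb_by_text, ipb_reason, ipb_timestamp, ipb_expiry
FROM ipblocks;

-- If it is a stale block carried over from the old wiki, another admin
-- can lift it via Special:Unblock, or it can be removed directly
-- (back up first):
-- DELETE FROM ipblocks WHERE ipb_id = 2;
```

The 2007 start date in the message suggests an old autoblock or range block that now happens to match your static IP after the copy.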
I would like to add content to my mediawiki from a script.
For example, after a successful build, I would like to add a line such
as "2/25/12 Build completed successfully" to a MediaWiki page.
Any easy ways to do this via some extension, or otherwise?
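No extension needed: the built-in web API (api.php) supports action=edit, and its appendtext parameter is a good fit for appending build results. A Python sketch of the request-building half (the URL and token are placeholders; in real use you would first log in and fetch an edit token from the same api.php, then POST the returned body):

```python
from urllib.parse import urlencode

def build_edit_request(api_url, title, text, token):
    """Assemble the POST data for a MediaWiki API 'edit' call."""
    params = {
        "action": "edit",
        "title": title,
        "appendtext": text,  # append to the page instead of replacing it
        "token": token,
        "format": "json",
    }
    return api_url, urlencode(params)

url, body = build_edit_request(
    "http://example.org/w/api.php",  # placeholder wiki URL
    "Build status",
    "\n2/25/12 Build completed successfully",
    "dummy-token",
)
```

If the build runs on the same host as the wiki, MediaWiki also ships maintenance/edit.php, which reads new page text from stdin and skips the login/token dance entirely.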
On 02/27/2012 02:17 PM, mediawiki-l-request(a)lists.wikimedia.org wrote:
> Date: Mon, 27 Feb 2012 17:57:14 +0000
> From: Daniel Barrett <danb(a)VistaPrint.com>
> To: MediaWiki announcements and site admin list
> Subject: [Mediawiki-l] CAPTCHA recommendation for account-creation
> Spam bots are creating accounts on my music wiki (http://www.blazemonger.com/GG/Special:RecentChanges). What is the best extension to prevent this? I installed ConfirmEdit and tried the default CAPTCHA (SimpleCAPTCHA) but it didn't stop the bots. Before I experiment with the many other options, I thought I'd ask what people recommend.
> Fortunately the bots can't edit articles, just create useless accounts.
I'll take this opportunity to point out that Dan's not the only one.
People who run MediaWiki instances come into IRC (#mediawiki) every day
asking for help fighting spam and vandalism. Some emails that have come
my way (reprinted with permission):
Jason Vertrees wrote:
> HI Sumana,
> After opening up my wiki for free registration, I immediately got
> three spam accounts. One turned into real spam being injected into and
> then cleaned from the wiki. None of the methods aside from preventing
> self-registration seem to work against spam. Unfortunately, this also
> prevents growth.
> I read the links you provided--do you have anything else more proactive?
> -- Jason
Sergey Chernyshev wrote:
"Do you by any chance know a reliable way to combat spam? I'm tired of
the hordes of spammers attacking my MW projects.... Unfortunately, I'm
well aware of current anti-spam features and none of them simplify the
spam combat to the level similar to WordPress, for example."
Anne Gray wrote of sfeditorwatch.com and sfartistwatch.com, which have
Google Analytics, ReCaptcha and Bad Behavior installed:
> Thank you for offering to help try to figure out what's gone wrong
> with sfeditorwatch.com and sfartistwatch.com. Cheryl's email reminds
> me that activity on the sf editors wiki was really picking up for a
> while, the year before it started getting attention from spambots.
> Once I started having to protect pages to keep from losing content and
> the wiki started being full of spam, participation declined sharply.
> It rapidly hit the point where I as an administrator couldn't possibly
> hope to keep up. Then reCaptcha broke and people *couldn't* edit
> their own pages. At this point, I'm pretty sure bots are almost the
> only thing active there....
> Sfartistwatch never really took off partly, I think, because it's
> really non-obvious how to post images (I still have never learned).
> And I'm worried that if we turn it on, it will get hit by the kind of
> spammers that are posting hundreds of spam images to the Carl Brandon
> society wiki...
> I don't know enough to set up bots to patrol, to set up notifications
> so that I as the administrator can "watch" all the pages, or to use
> scripts or whatever to nuke the massive amounts of spam pages that are
> created regularly. Bots are clearly getting past reCaptcha to
> register users (they're doing that on the Carl Brandon wiki, too), and
> the "Random page" button is totally useless for a visitor who is
> actually interested in the topic of the wiki. (Is there a way to
> reprogram that, so it only selects a random page that has a category,
> at least?) It would be sooo nice, when you block a user for spam or
> vandalism, for the following page to list pages that user created,
> with checkboxes beside them, and provide the option to select all of
> them at once and delete them. Even better if it gave the option of
> automatically protecting against future re-creation of those spam
> pages.
Most of these administrators are smart people who just need a little
help fighting the bots. It sounds like the recommendations from this
group so far are:
* https://www.mediawiki.org/wiki/Extension:QuestyCaptcha , a plugin for
the ConfirmEdit extension
so I've added those to these help pages:
and welcome additional improvements to those docs, especially if
recommendations that are in there right now are no longer worthwhile.
Also, anyone want to take a crack at cleaning out what's no longer
applicable in https://www.mediawiki.org/wiki/Spam_Filter so I can
suggest it as a project for contributors?
Volunteer Development Coordinator
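For admins who want to try the QuestyCaptcha recommendation above, a minimal LocalSettings.php sketch in the ConfirmEdit style used around MW 1.18 (the sample question is a placeholder; check the extension page for your version's exact syntax):

```php
// QuestyCaptcha is a question-based plugin for ConfirmEdit.
require_once "$IP/extensions/ConfirmEdit/ConfirmEdit.php";
require_once "$IP/extensions/ConfirmEdit/QuestyCaptcha.php";
$wgCaptchaClass = 'QuestyCaptcha';

// Wiki-specific questions: bots that beat image CAPTCHAs rarely beat
// questions only your intended audience can answer.
$wgCaptchaQuestions[] = array(
    'question' => 'What instrument has six strings and frets?',
    'answer'   => 'guitar',
);

// Challenge account creation, which is where the spam bots get in.
$wgCaptchaTriggers['createaccount'] = true;
```

The strength of this approach is that each wiki's questions are unique, so there is no economy of scale for the spammers to exploit.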
I'm just upgrading from 1.16.2 to 1.18.1. Everything seems to be fine
apart from a custom skin (the built-in skins work ok). The skin
('Resources') is a small variation on Vector. When I switch
$wgDefaultSkin to my skin, the skin code gets called but no css is
loaded. I've tried starting again by copying Vector.php to
Resources.php, copying vector/* to resources/*, and changing all
references to '[Vv]ector' to '[Rr]esources' in Resources.php, but the css
is still ignored.
As you can tell from this description I don't know how the skin system
works in any depth and am hacking away blindly. Can anyone suggest where
I might be going wrong?
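A likely cause: in 1.18 the built-in skins' CSS is registered as ResourceLoader modules in core (resources/Resources.php), not in Vector.php itself, so a copied skin file calls addModuleStyles('skins.resources') for a module that was never defined, and no CSS loads. A sketch of the missing registration (file names and paths are guesses based on the description above; adjust to your layout):

```php
// Near the top of skins/Resources.php, or in LocalSettings.php:
// define the style module the copied skin asks for.
global $wgResourceModules;
$wgResourceModules['skins.resources'] = array(
    'styles' => array(
        'resources/screen.css' => array( 'media' => 'screen' ),
    ),
    // Resolve paths relative to the skins/style directories.
    'remoteBasePath' => $GLOBALS['wgStylePath'],
    'localBasePath'  => $GLOBALS['wgStyleDirectory'],
);
```

Then make sure the skin's setupSkinUserCss() calls $out->addModuleStyles( 'skins.resources' ), which the search-and-replace from Vector should already have produced.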