SELECT 'all', count(0) FROM user
UNION
SELECT '0 edit', count(0) FROM user WHERE user_editcount = 0;
all count(0)
all 22749
0 edit 22288
All but ~500 out of ~22,000 users on my wiki have never made a single
edit. Is there a good way to clean out these junk records? Is there an
extension that puts e.g. a captcha on the user register page (not just
edit)?
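The closest thing I've found so far is a query like this (an untested sketch, assuming the default MySQL schema and no table prefix; run against a backup first, since user_editcount can be NULL for accounts created before that column existed):

```sql
-- Sketch: list accounts with zero edits and no watchlist rows,
-- i.e. candidates for deletion. Back up before acting on the result.
SELECT user_id, user_name
FROM user
LEFT JOIN watchlist ON wl_user = user_id
WHERE user_editcount = 0
  AND wl_user IS NULL;
```

There is also maintenance/removeUnusedAccounts.php in core, which lists accounts with no edits and deletes them when given --delete. For the registration captcha, the ConfirmEdit extension appears to cover this via $wgCaptchaTriggers['createaccount'] = true;.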
I see a pile of bugs on FCK Editor have been resolved "invalid"
because upstream have declared MediaWiki unsupported.
So what are the current options for WYSIWYG editing, even non-ideal
ones? I believe this leaves none ...
- d.
I just tried to upgrade our version of MediaWiki (1.13?) to MediaWiki
1.17. Everything went fine until I ran the update.php script and
got this error: ld.so.1: php: fatal: relocation error: file
/usr/local/depot/php-5.3.6/php-5.3.6/bin/php: symbol
xmlTextReaderGetAttribute: referenced symbol not found
Killed
I googled for answers and everything keeps pointing to libxml2, but the
version is 2.7.3, the same as on another wiki I have running on
another server with MediaWiki 1.17. So far the only difference I can
find between the two installs is that the working one runs Solaris
5.8 and the one giving the error runs 5.9. Is there a problem with
MediaWiki 1.17/PHP and Solaris 5.9?
Any help would be appreciated.
On 07/27/11 08:00, mediawiki-l-request(a)lists.wikimedia.org wrote:
> On 26/07/11 06:58, Anthony Lieuallen wrote:
>> SELECT 'all', count(0) FROM user
>> UNION
>> SELECT '0 edit', count(0) FROM user WHERE user_editcount = 0;
>>
>> all count(0)
>> all 22749
>> 0 edit 22288
>>
>> All but ~500 out of ~22,000 users on my wiki have never made a single
>> edit. Is there a good way to clean out these junk records? Is there an
>> extension that puts e.g. a captcha on the user register page (not just
>> edit)?
>
> Maybe some users only use the watchlist functionality?
Nope, they're spammers' failed accounts.
SELECT count( DISTINCT ( wl_user ) ) FROM watchlist;
count(distinct(wl_user))
3580
(And lots of those are for pages that don't exist on this wiki, and
never did, including e.g. full URLs to facebook photos,
"Www.farmville.com(bonus)", and in at least one case, many hundreds of
lines of junk intended as page contents, each submitted individually as
a watchlist page title, and two accounts with many thousands of invalid
watched pages -- the wiki has only around 300 real pages.)
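Those junk rows can be counted directly, since watchlist stores namespace/title pairs rather than page IDs (a sketch, assuming the default MySQL schema):

```sql
-- Sketch: count watchlist entries that point at pages which
-- do not exist (page_id comes back NULL from the join).
SELECT count(*)
FROM watchlist
LEFT JOIN page
       ON page_namespace = wl_namespace
      AND page_title     = wl_title
WHERE page_id IS NULL;
```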
Hello,
Wikimania is just next week, but two months from now will be another
conference that may be of interest to MediaWiki users and developers:
SMWCon, or the Semantic MediaWiki Conference, which will be happening in
Berlin on September 21-23:
http://semantic-mediawiki.org/wiki/SMWCon_Fall_2011
Semantic MediaWiki, if you don't know about it, is a MediaWiki extension
that has somewhat taken on a life of its own in the six years since it was
first created:
http://semantic-mediawiki.org/wiki/Semantic_MediaWiki
Though of course the main focus will be Semantic MediaWiki, we plan to have
talks on other related topics, like DBpedia, ontologies, and the creation of
a semantic data repository (not necessarily SMW-based) for Wikipedia itself.
And there's definitely room in the schedule for talking about issues related
to core MediaWiki.
This is a call for presentations, so if you want to attend, please consider
giving a talk. To propose a talk, just add a line to the wiki page here:
http://semantic-mediawiki.org/wiki/SMWCon_Fall_2011#Conference_days
And while we're at it, if you plan to attend SMWCon, please add your name to
the attendees list, here:
http://semantic-mediawiki.org/wiki/SMWCon_Fall_2011#Registration
-Yaron
As per
http://www.mediawiki.org/wiki/Manual:Tag_extensions#Publishing_your_extensi…,
I am notifying this list of a newly added MediaWiki extension called
Favorites:
http://www.mediawiki.org/wiki/Extension:Favorites
This extension is not 1.17-ready, but works well in 1.16. Please feel free
to review, alter, and update it as needed. This is my first stab at writing an
extension, and I'm sure I didn't follow all the accepted conventions, but I
did base my code on MediaWiki's own PHP, so it should at least be safe.
Thanks!
Hello,
I've just tried to edit one of my wiki pages and have got the message
Detected bug in an extension! Hook ConfirmEditHooks::confirmEditMerged
has invalid call signature; Parameter 1 to
ConfirmEditHooks::confirmEditMerged() expected to be a reference, value
given
Backtrace:
#0 /path/to/mediawiki-1.17.0/includes/EditPage.php(999): wfRunHooks('EditFilterMerge...', Array)
#1 /path/to/mediawiki-1.17.0/includes/EditPage.php(2656): EditPage->internalAttemptSave(false, false)
#2 /path/to/mediawiki-1.17.0/includes/EditPage.php(415): EditPage->attemptSave()
#3 /path/to/mediawiki-1.17.0/includes/EditPage.php(296): EditPage->edit()
#4 /path/to/mediawiki-1.17.0/includes/Wiki.php(522): EditPage->submit()
#5 /path/to/mediawiki-1.17.0/includes/Wiki.php(69): MediaWiki->performAction(Object(OutputPage), Object(Article), Object(Title), Object(User), Object(WebRequest))
#6 /path/to/mediawiki-1.17.0/index.php(114): MediaWiki->performRequestForTitle(Object(Title), Object(Article), Object(OutputPage), Object(User), Object(WebRequest))
#7 {main}
(Before that, the page asked me to edit LocalSettings.php: "Set
$wgShowExceptionDetails = true;
at the bottom of LocalSettings.php to show detailed debugging
information.")
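For context, a minimal sketch of the underlying PHP behavior (not MediaWiki code, just the general mechanism): MediaWiki dispatches hooks through call_user_func_array(), and as of PHP 5.3 that can no longer pass arguments by reference, so any hook handler that still declares a parameter as &$... fails with exactly this "expected to be a reference, value given" message:

```php
<?php
// Standalone reproduction of the failure mode.
// A callback declaring a by-reference parameter...
function handler( &$x ) {
    $x .= ' modified';
}
// ...invoked dynamically, the way MediaWiki runs hooks:
$args = array( 'value' );
call_user_func_array( 'handler', $args );
// PHP 5.3 warns: Parameter 1 to handler() expected to be a reference
```

The usual fix, I believe, is simply upgrading ConfirmEdit to the release that matches MediaWiki 1.17, whose handlers drop the &.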
---------------------------------------
Two days ago a user with a strange-looking login name registered.
--------------------------------------------------------
What is going wrong? Is some bad guy using a backdoor?
Best regards!
Helmut
Hello
We have a MediaWiki installation that we are using to maintain
biodiversity information.
I want to write a custom bot that goes through a bunch of pages and
makes some trivial changes periodically.
I am trying to write this bot using the BasicBot class
along with Snoopy (described here:
http://wikisum.com/w/User:Adam/Creating_MediaWiki_bots_in_PHP)
From the log files I can see that my bot logs in successfully and gets
a cookie brahma_session=VERY_LONG_HEX_NUMBER.
However, when the bot tries to use this cookie to log in, it fails. I
verified that Apache receives the cookie, but for some reason MediaWiki
does not recognize it and throws a "login failure (or user not logged in
to make change)" kind of error. Hence the bot fails to make any change.
Can anyone help me diagnose what's going wrong here? Any help will be
greatly appreciated.
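One thing worth checking (an assumption on my part, since I can't see the bot's code): since MediaWiki 1.15.3 the API login is a two-step handshake, so the session cookie from the first request alone is not enough. The bot must POST a second login carrying the lgtoken returned by the first response, and then send back all of the resulting cookies (the UserID/UserName/Token ones as well as the session cookie) on every subsequent request. A rough curl sketch of the handshake, with a placeholder wiki URL and credentials:

```shell
# Hypothetical URL and credentials. Step 1 returns result=NeedToken
# plus a login token, and sets the session cookie into cookies.txt.
curl -s -c cookies.txt \
     -d 'action=login&lgname=MyBot&lgpassword=secret&format=json' \
     'http://wiki.example.org/api.php'
# Step 2: repeat the login, echoing the cookies back (-b) and adding
# the token from step 1.
curl -s -b cookies.txt -c cookies.txt \
     -d 'action=login&lgname=MyBot&lgpassword=secret&lgtoken=TOKEN_FROM_STEP_1&format=json' \
     'http://wiki.example.org/api.php'
# Every later request must keep sending -b cookies.txt.
```

The older BasicBot/Snoopy recipes predate the token step, which would explain a bot that "logs in" but is then treated as logged out.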
Thanks,
Alok
I just encountered the strangest MediaWiki behavior I've ever tried to debug. Thought you'd find it interesting. (MediaWiki gurus: is the behavior & solution below new in 1.17.0?) I spent several hours figuring this out.
After installing MW 1.17, I noticed that every page on our wiki was throwing an identical error. The exact details of the error aren't important (an extension was invoking an external Linux command that was failing). The weird part was this: only ONE page on the wiki invoked this extension and could possibly produce the error... but that page was never being hit! At all! And EVERY wiki page was producing the error. Even special pages! This was really strange.
The page was Template:Foo. So I figured, naturally, that Template:Foo must be getting transcluded by a system message or some other article used on every page. That turned out to be wrong.
So I removed all extensions except the one throwing the error. Nothing changed. So it's not caused by an interaction between extensions.
So, I found the line of code where the error was being thrown and inserted a var_dump(debug_backtrace()) to see what's going on. And now it got even weirder. Every time any wiki page was displayed, it was not just transcluding Template:Foo. It was parsing a RANDOM ARTICLE that contains Template:Foo. Each time I displayed any wiki page, a different random article would get parsed, and they all transcluded Template:Foo. Wild!
At this point, I was very confused.
Until I noticed that the "random articles" being selected for parsing were being picked in alphabetical order.
Any guesses? :-)
My coworker figured it out, or at least he has an idea that is consistent with the facts. Because Template:Foo is throwing an error, all pages that transclude it are failing to render. Which (probably) means they are being added to the MediaWiki job queue for reparsing. So when any wiki page is hit, an article on the MediaWiki job queue is being processed. It fails again, throwing the error.
So, Template:Foo was not being used by the currently displayed article at all. It was being parsed via the job queue afterward.
Did something change in 1.17.0 with job queue processing to make this behavior happen, or at least make it more visible in the apache error log?
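If you want the job queue to stop piggybacking on page views (which is what made this so confusing to trace), the relevant knob is $wgJobRunRate; a sketch for LocalSettings.php, with the queue processed from cron instead:

```php
// LocalSettings.php: don't run queued jobs during web requests;
// process the queue explicitly from cron instead, e.g.:
//   php maintenance/runJobs.php
$wgJobRunRate = 0;
```

maintenance/showJobs.php prints the current queue length, which would have revealed the backlog of queued reparses here immediately.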
Thanks,
DanB