I recently set up a MediaWiki (http://server.bluewatersys.com/w90n740/)
and I need to extract the content from it and convert it into LaTeX
syntax for printed documentation. I have googled for a suitable OSS
solution, but nothing suitable turned up.
I would prefer a script written in Python, but any recommendations
would be very welcome.
Do you know of anything suitable?
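Lacking a ready-made tool, one starting point is a small script that rewrites a few common wikitext constructs into LaTeX. This is only a toy sketch (it handles headings, bold, and italics, and ignores templates, tables, and links entirely), but it shows the shape such a conversion pipeline could take:

```python
import re

# Toy wikitext-to-LaTeX converter. Rules are applied in order, with
# deeper headings first so "===" isn't swallowed by the "==" rule.
RULES = [
    (re.compile(r"^=== (.+?) ===$", re.M), r"\\subsubsection{\1}"),
    (re.compile(r"^== (.+?) ==$", re.M), r"\\subsection{\1}"),
    (re.compile(r"^= (.+?) =$", re.M), r"\\section{\1}"),
    (re.compile(r"'''(.+?)'''"), r"\\textbf{\1}"),
    (re.compile(r"''(.+?)''"), r"\\emph{\1}"),
]

def wikitext_to_latex(text):
    """Apply each regex rule in turn; real pages need a proper parser."""
    for pattern, replacement in RULES:
        text = pattern.sub(replacement, text)
    return text
```

Raw wikitext for each page can be fetched with `index.php?title=Pagename&action=raw`, then fed through a converter like this. Note also that recent versions of pandoc include a MediaWiki reader, so `pandoc -f mediawiki -t latex` may cover much of this without custom code.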
I've been informally mentoring André, Tiago, Diego, and César. They
are four students at Minho University who are currently working on a
project to improve DB2 database support in MediaWiki.
So far, they've:
- Fixed several outstanding issues with DB2 support involving
character encoding, Windows vs. Linux differences, etc.
- Added DB2 support to the new MediaWiki 1.17 Installer and Updater
- Put in the appropriate Updater sql patches to reflect database
schema changes since 1.14
MediaWiki already had some DB2 support, but it had been broken since
1.15 and was never complete. As a result of their work, it's now possible
to successfully install MediaWiki on DB2 out of the box and to use the
core wiki features.
I'll shortly commit their first patch using my SVN account (leonsp).
I've taken some care to look over the code and make sure it abides by
the MediaWiki code guidelines.
(CC'd inez @ wikia)
In the process of getting a better feel for the current state of Wikia,
wikiHow, and a few others' rich text editor tools, I'm going through Wikia's
CKEditor-based RTE extension and seeing if I can get it working on MediaWiki.
I've got a fork from Wikia's SVN in this gitorious project:
The 'master' branch is a straight git-svn clone of the subtree; 'tweaks'
branch has some extra doc comments and some initial tweaks to get it loading
(if not actually working right yet ;) on stock MediaWiki 1.18-SVN.
* most stuff won't work yet!
* the editor can be loaded if forced with &useeditor=wysiwyg
* load/save results in some corruption, probably mostly due to the parser
annotations not all being present (need to customize a few bits)
* the editor is loaded through ResourceLoader, using a quick stub to work
around the lack of removal of certain lines
* it's almost certain that the CSS and some JS is broken :D
* there are various Wikia-specific PHP-side and JS-side extensions, many of
which still need to be switched to the stock MW equivalent or copied over.
Note that definitions for such things can usually be found in the modified
MediaWiki core in Wikia's SVN tree --
At a minimum I'd like to end up with something that works on stock MediaWiki
1.18 (and if it can be made to work on stock or lightly-patched 1.17, even
better!). It should be a more stable option for 1.17 users than the old
I'm still a bit leery of the internal annotations & edge-case checks for the
round-tripping and whether this structure would work for us in the long
term, but there's some good stuff in here that's going to be useful to learn
from whatever we do, and it's a useful tool for many cases in the short term.
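The round-tripping concern above can be illustrated with a toy property check (this is not the RTE's actual pipeline; the two converter functions here are deliberately trivial stand-ins): convert wikitext to HTML, convert it back, and flag any sample that doesn't survive the trip unchanged.

```python
import re

# Trivial stand-in converters: only bold and italics. The real RTE
# needs parser annotations to round-trip everything else, which is
# exactly where the corruption mentioned above creeps in.
def wikitext_to_html(text):
    text = re.sub(r"'''(.+?)'''", r"<b>\1</b>", text)
    text = re.sub(r"''(.+?)''", r"<i>\1</i>", text)
    return text

def html_to_wikitext(html):
    html = re.sub(r"<b>(.+?)</b>", r"'''\1'''", html)
    html = re.sub(r"<i>(.+?)</i>", r"''\1''", html)
    return html

def roundtrips(samples):
    """Return the samples (and what they became) that fail to round-trip."""
    failures = []
    for s in samples:
        back = html_to_wikitext(wikitext_to_html(s))
        if back != s:
            failures.append((s, back))
    return failures
```

Running a corpus of real pages through a check like this is a cheap way to find which constructs the editor still mangles.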
If anybody feels like trying it out / pitching in on the fixes, do feel free
to give a shout. I can set a few folks up with commit access on the git repo
or take some pulls for now, and will merge it into our SVN extensions when
it's a bit more stable.
-- brion vibber (brion @ pobox.com / brion @ wikimedia.org)
Painstakingly, all of the unit tests are now passing on CruiseControl (SQLite)
and my local install (SQLite & MySQL). This is as of r90150. Thank you
immensely to Brion for finally tracking down and eliminating that pesky bug.
What does this mean? We know that everything we have covered with PHPUnit
(barring things marked incomplete/skipped) is currently working.
I'd like for Max to turn IRC annoyances back on with the codurr bot for CC
failures. From here on out, I'm going to take the stance that if you break a
test, it must be fixed or reverted on sight.
Please join me to welcome Jeff Green to Wikimedia Foundation.
Jeff is taking up the Special Ops position in the Tech Ops department, where
one of his responsibilities is to keep our fundraising infrastructure
secure, compliant with regulations, scalable, and highly available. Jeff
comes with a strong systems operations background, especially in scaling
and building highly secure infrastructure. He hails from Craigslist, where he
started as their first system administrator and served as their lead system
administrator, as well as their Operations manager for most of his tenure.
When not working, Jeff likes cycling, playing music, and building stuff. He
is a proud father of two young kids and a lucky husband. He and his family
will be moving back to Massachusetts this August. Please drop by next week
to the 3rd floor to welcome him. For those who have already met him earlier,
do come by as well to see the new 'ponytail-less' Jeff ;-)
On Thu, Jun 30, 2011 at 11:19 PM, CT Woo <ctwoo(a)wikimedia.org> wrote:
> Please join me to welcome Jeff Green to Wikimedia Foundation.
Welcome! Yay ops!
> He and his family
> will be moving back to Massachusetts this August.
Is this a typo or is Jeff really taking a job in SF and moving to the
other side of the country a month later?
Roan Kattouw (Catrope)
On Thu, Jun 30, 2011 at 2:19 PM, CT Woo <ctwoo(a)wikimedia.org> wrote:
> Please join me to welcome Jeff Green to Wikimedia Foundation.
Welcome to the team! :-)
Deputy Director, Wikimedia Foundation
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
This week's bug triage was the first one held on IRC in an effort to
be more open to community involvement. On that basis, it was
successful: User:Bawolff stepped up to the plate to take on a couple of
bugs, and User:Leinad stepped in to ask us to consider solving
In preparation for 1.18, I had taken all of the bugs marked with
“High” priority and made them blockers on the 1.18 Deployment Bug
(https://bugzilla.wikimedia.org/29068) or on the 1.18 Tarball Bug
So the first thing we did was look at this list of blocker bugs and
make sure they all should be blockers. This process revealed a
problem with my method of turning “High” priority bugs into blockers:
I didn't examine some bugs closely enough. The following
two are a good example of this:
Make image views statistics available through Wikistats
Set up notification for when/if Google's safe browsing spots
something on wiki
Others that we found and removed from being blockers were:
- Search should index template expansion
- lucene search for simple text misses some results
- iPhone Native Crash with UTF-8
- Block::purgeExpired giving "Lock wait timeout exceeded;"/"Deadlock"
- Article::updateCategoryCounts 1213 Deadlock found when trying to
- Make upload.wikimedia.org Cross Origin compatible (CORS)
Going through the list also helped us discover a couple that should be
blockers:
- Sometimes there's "undefined" in a Resource loader CSS request
- Prevent creation of new (unattached?) SUL accounts with already
From there, we moved on to the bugs that were affecting people right now:
MediaWiki:Filepage.css not loaded on foreignwiki itself
This one Chad glommed onto right away. After a bit of diagnosis, he fixed it.
Edited page is not showing the most recent edits to anyone
not logged into wikipedia
Bawolff has been working on the problem and, upon Robla's advice, I
bumped the priority. Bawolff may need some help figuring out how to
test Squid, but until he does, he's on it.
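For context on the Squid testing above: Squid-style caches accept a nonstandard HTTP PURGE method to evict a cached URL, which is how MediaWiki invalidates pages after edits. This sketch (the host, port, and path are stand-ins, not Wikimedia's actual setup) exercises the mechanics against a throwaway local server instead of a real Squid:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class PurgeHandler(BaseHTTPRequestHandler):
    """Records PURGE requests, the way a Squid-like cache would accept them."""
    purged = []

    def do_PURGE(self):
        PurgeHandler.purged.append(self.path)
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep test output quiet

def send_purge(host, port, path):
    """Send an HTTP PURGE for `path`; return the response status code."""
    conn = http.client.HTTPConnection(host, port)
    conn.request("PURGE", path)
    status = conn.getresponse().status
    conn.close()
    return status

# Spin up a throwaway local server on an ephemeral port and purge one URL.
server = HTTPServer(("127.0.0.1", 0), PurgeHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
status = send_purge("127.0.0.1", server.server_address[1], "/wiki/Main_Page")
server.shutdown()
```

A harness like this lets you verify that the purge requests your code emits are well-formed before pointing them at a real cache.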
maintenance script edit.php doesn't update link tables properly
Since Bawolff was already working on this and it was otherwise a low
priority, I assigned it to him.
Anonymous users can edit page protected with [edit=autoconfirmed]
Roan suggested that the page may have been deleted and undeleted, thus
removing the edit protection. This suggestion led to close
examination of the log, where Brion noticed today that everything was, in
fact, working correctly, but, because the developers had difficulty
understanding the timeline and the actions taken, that
wasn't seen soon enough.
Enable $wgHtml5 on Wikimedia wikis
There were no known technical issues blocking this, but in order to
avoid having to roll it out and then roll it back when people
complained that it broke, say, Twinkle, we needed to make sure we had
someone who could babysit the rollout. After some discussion, we
appointed Reedy to be the point man on this.
This request from Indian wikis and a clear description of the tasks
involved from Brion made it a great addition to our “Annoying little
bug” page (http://www.mediawiki.org/wiki/Annoying_little_bug) that I'm
maintaining to point potential new MediaWiki developers to some
relatively achievable tasks.
LQT putting crap in dumps
By the time we got to this in triage, Brion already had a fix
committed for this one. Yay, Brion!
Periodical run of currently disabled special pages (WantedPages,
After Tim pointed out that a few of the disabled pages “should never be run
under any circumstances”, I asked him to update the bug with
information about which queries those were.
[[MediaWiki:Enotif body]] needs GENDER support
As if Bawolff's involvement in the triage wasn't reason enough to
continue holding them on IRC, Leinad stopped by during the meeting to
ask if we could fix this bug for the 1.18 release.
After a bit of discussion, we decided that I should contact the author
of enotify to see if he would fix gender support.
That's all for this week. Our next IRC triage will be at 2100 UTC
I hope to see you there,