I recently set up a MediaWiki (http://server.bluewatersys.com/w90n740/)
and I need to extract the content from it and convert it into LaTeX
syntax for printed documentation. I have googled for a suitable OSS
solution, but nothing suitable turned up.
I would prefer a script written in Python, but any recommendations
would be very welcome.
Do you know of anything suitable?
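If nothing off-the-shelf turns up, a small Python script can get you partway there. The sketch below is purely illustrative (the rule list and `wikitext_to_latex` are my own names, not an existing tool): a regex pass over raw wikitext handling only the simplest constructs. A real exporter would want a proper parser such as mwparserfromhell rather than regexes.

```python
import re

# Rules for a few common wikitext constructs; order matters ('''...'''
# must be handled before ''...'').  A real exporter would want a proper
# parser (e.g. the mwparserfromhell library) rather than regexes.
RULES = [
    (re.compile(r"^====\s*(.*?)\s*====\s*$", re.M), r"\\subsubsection{\1}"),
    (re.compile(r"^===\s*(.*?)\s*===\s*$", re.M), r"\\subsection{\1}"),
    (re.compile(r"^==\s*(.*?)\s*==\s*$", re.M), r"\\section{\1}"),
    (re.compile(r"'''(.*?)'''"), r"\\textbf{\1}"),
    (re.compile(r"''(.*?)''"), r"\\emph{\1}"),
    # [[target|label]] or [[target]] -> plain text
    (re.compile(r"\[\[(?:[^|\]]*\|)?([^\]]+)\]\]"), r"\1"),
]

def wikitext_to_latex(text):
    for pattern, repl in RULES:
        text = pattern.sub(repl, text)
    return text
```

You can feed it wikitext fetched from the wiki's `index.php?action=raw` view of each page.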
A Wikipedia page loads, and the last thing to load is the banner. This
pushes the page content down, so if you've clicked on a link near the
top of the page, the click lands on the banner instead.
This happened last year and it was reported then (and it was
incredibly annoying then too). It also fouls up stats on banner
effectiveness, as banners are clicked on without intention to do so.
Please load the page with a space for the banner to avoid this effect.
This is a question about an infrastructural detail of ResourceLoader and how it interacts with Internet Explorer. (It's my first post to wikitech-l, so apologies if it's the wrong forum.)
Our MediaWiki 1.17.0 site recently installed a bunch of extensions that use ResourceLoader, such as Extension:WikiEditor. To our surprise, some of our site's unrelated CSS styles stopped working. This was happening only in Internet Explorer. After some detective work, we discovered the problem is Internet Explorer's limit of 31 stylesheets:
Obviously this is an IE problem, not MediaWiki's, but it's going to cause issues on MediaWiki sites. WikiEditor itself loads about 10 stylesheets, for example, taking the site ~30% of the way toward a CSS failure.
So my questions are:
1. Is there a workaround for sites like mine, with many stylesheets from separate extensions loaded by ResourceLoader?
2. Should ResourceLoader address this IE problem? Maybe start combining stylesheets (with @import) automatically?
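On question 1, one quick way to see how close a given page is to the limit is to count its stylesheets. A minimal Python sketch (`count_stylesheets` is a hypothetical helper; note that IE's limit also counts sheets pulled in via @import, which this does not follow):

```python
from html.parser import HTMLParser

IE_SHEET_LIMIT = 31  # sheets IE 6-9 will load; the rest are silently dropped

class StylesheetCounter(HTMLParser):
    """Count <link rel="stylesheet"> and <style> elements in a page.

    Caveat: IE's 31-sheet limit also counts sheets loaded via @import,
    which this sketch does not follow.
    """
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "style":
            self.count += 1
        elif tag == "link" and "stylesheet" in (attrs.get("rel") or "").lower():
            self.count += 1

def count_stylesheets(html_text):
    parser = StylesheetCounter()
    parser.feed(html_text)
    return parser.count
```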
A test page for the new VIPS image scaler is now available:
You can give it names of images from Commons and it will show a
comparison with a moving divider, with the thumbnail from ImageMagick
on the left, and the one from VIPS on the right.
I'll explain what you would expect to see when using this tool:
For JPEG images, using a sharpening radius of 0.8 will make the VIPS
result roughly match the ImageMagick result, as long as the thumbnail
is reduced to less than 85% of the original width. With not enough
sharpening, the resulting image looks blurry. With too much
sharpening, contrast in fine detail will be unrealistically enhanced
and high-contrast borders will develop "halos".
Above about 50% reduction factor, the block average introduces
artifacts in fine detail, so enabling the "bilinear" option will look
better, and will more closely match ImageMagick. But if the bilinear
option is used with a reduction factor much smaller than that, severe
artifacts will be seen in areas of contrasting fine detail.
At small reduction factors, the main difference between ImageMagick
and VIPS is that VIPS uses a simple block average whereas ImageMagick
uses a more complex windowing function. This leads to minor
differences in fine detail.
What we're looking for out of this test is:
* Confirmation that VIPS is not completely failing for some class of images
* Suggestions for parameter values (sharpening, bilinear) for various
source and destination sizes. VipsScaler allows these parameters to be
configured depending on source size and reduction factor.
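For what it's worth, the heuristics described above can be encoded as a small parameter-picking function. This Python sketch is purely illustrative; `vips_scaler_params` is a hypothetical helper, not VipsScaler's actual configuration format:

```python
def vips_scaler_params(src_width, dst_width, is_jpeg=True):
    """Pick thumbnailing options following the heuristics in the post.

    The "reduction factor" here is thumbnail width / source width.
    """
    ratio = dst_width / src_width
    # Above ~50%, plain block averaging shows artifacts in fine detail,
    # so bilinear looks better; well below that, bilinear itself causes
    # severe artifacts in areas of contrasting fine detail.
    params = {"bilinear": ratio > 0.5}
    # JPEG thumbnails below ~85% of the original width look blurry
    # without sharpening; a radius of 0.8 roughly matches ImageMagick.
    if is_jpeg and ratio < 0.85:
        params["sharpen"] = 0.8
    return params
```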
-- Tim Starling
I've been informally mentoring André, Tiago, Diego, and César. They
are four students at Minho University who are currently working on a
project to improve DB2 database support in MediaWiki.
So far, they've:
- Fixed several outstanding issues with DB2 support involving
character encoding, Windows vs Linux, etc
- Added DB2 support to the new MediaWiki 1.17 Installer and Updater
- Put in the appropriate Updater sql patches to reflect database
schema changes since 1.14
MediaWiki already had some DB2 support, but it had been broken since
1.15 and was never complete. As a result of their work, it's now possible
to successfully install MediaWiki on DB2 out of the box and to use the
core wiki features.
I'll shortly commit their first patch using my SVN account (leonsp).
I've taken some care to look over the code and make sure it abides by
the MediaWiki code guidelines.
Earlier this month, Wikimedia staff and volunteers got together in
Mumbai, India to work on mobile, offline, and language support.
Photos are up:
Some notes on our outcomes, which included many new localisations for
Kiwix and new input methods for MediaWiki, readying Narayam for
Wikimedia Incubator, a prototype onscreen keyboard built in Narayam,
Wikimedia Mobile ready for translation, new UI prototypes for language
selection, and more:
And I haven't even touched on mobile! An update specifically on mobile
progress at the hackathon:
-- Summary from Phil Chang:
> In summary, over one weekend more than 50 volunteers from many parts of
> India added their hard work and insights to the technical foundation of
> Wikipedia. In the mobile area alone, volunteers contributed to 17 features,
> as listed here (features that were worked on are marked with an "H"):
> We also got support and input from most of the major mobile operators in
> India about how to make our user experience better. Free access to
> Wikipedia is moving forward on a number of fronts, as we identified several
> forms of collaboration, not just in the form of Wikipedia Zero. For
> example, there seems to be widespread interest in using an RSS feed of the
> Article of the Day, and the top 5 languages in India are important.
I'm asking Emmanuel to send an offline-related summary to
https://lists.wikimedia.org/mailman/listinfo/offline-l . And I'm
predicting the localization folks will have a summary in their next update.
This was the largest Wikimedia tech outreach event I've been a part of,
with 80 or so new folks learning and becoming contributors. Thanks to
the Wikimedia staffers who came, for -- as Alolita put it -- "leading
project teams to do some nice development, UI design, testing and
accomplishing a lot in a short blip of time." Thanks to the local
community and chapter for putting on Wiki Conference India, which
happened at the same time:
Sorry to be brief; more details are at the links provided. I know that
the i18n team also led a translation sprint and an intro to MediaWiki
hacking in Pune after the Mumbai hackathon, but I'll leave it to them in
case they want to report about that.
Volunteer Development Coordinator
On Mon, Nov 28, 2011 at 3:16 PM, Brion Vibber <brion(a)pobox.com> wrote:
> On Mon, Nov 28, 2011 at 3:05 PM, MZMcBride <z(a)mzmcbride.com> wrote:
>> Brion Vibber wrote:
> [snip my notes about removing the non-PNG non-source options, wanting
> higher-resolution renderings]
>> Did you have a chance to evaluate MathJax? <http://www.mathjax.org/> I know
>> it's come up in past math discussions and that a lot of math folks think it
>> looks promising. A technical analysis of its feasibility on Wikimedia
>> would be great. Killing the less-used, ancient math options is great, but
>> perhaps adding one wouldn't be too bad to do too. :-)
> That's an excellent thing to bring up -- MathJAX *does* look very
> promising, and things seem to render pretty nicely. Need to make sure that
> we can either do that type of rendering cleanly with the PNG fallback
> (older browsers will still need the PNGs, so it may still be worth spending
> the time to fix baselines).
I've done a quick experimental mode commit:
definitely promising in terms of how things look. :)
Total library size is pretty large but that includes a bunch of fallback
images which will be rarely used; don't have a good sense of the 'weight'
of including the library yet. It does load a bit slowly, but hopefully
won't interfere much while it does so.
The initial method I'm using is to output the latex source in a <script
type="math/tex"> which MathJax recognizes, and then putting the image or
text source form in a <noscript> next to it. Browsers with JS off or
unavailable will use the fallback image/text silently, while those with
script will get the pretty math inserted at runtime.
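As a sketch of that markup pattern (illustrative only; `math_element` is a hypothetical helper, not the Math extension's actual output):

```python
import html

def math_element(tex, fallback_img_url):
    """Emit the fallback pattern described above: the TeX source in a
    <script type="math/tex"> that MathJax typesets at runtime, plus a
    <noscript> image for browsers with JS off.

    The TeX goes into the <script> body unescaped, because browsers hand
    script content to MathJax verbatim and do not decode entities there;
    a real implementation must still guard against a literal "</script>"
    appearing in the source.
    """
    return (
        '<script type="math/tex">%s</script>'
        '<noscript><img src="%s" alt="%s"></noscript>'
    ) % (
        tex,
        html.escape(fallback_img_url, quote=True),
        html.escape(tex, quote=True),
    )
```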
This isn't perfect though, and for instance breaks on MobileFrontend
because the <script> that actually _loads_ MathJax doesn't get loaded, and
on top of that the fallback images are still in <noscript>. ;) Also,
browsers that have script support but don't work with MathJax won't load
images, so this is insufficient.
Most compatible thing is probably to let it include the images/text form
as-is, then make sure MathJax goes over and replaces them in-place. It
might need tweaks to understand the images (source in alt text).
I'm hoping to get some feedback and advice from folks who have worked with
MathJax previously; I'll whip up some detail notes as an RFC page in a bit.
We've had @wikimediatech accounts on twitter & identica for some time
now. They basically broadcast every single action that is logged to the
server admin log:
The account has 78 followers on identica and 430 on twitter (probably
counting the spammers).
I'm wondering if there are actually people reading all the stuff
that's pushed through these channels.
My gut feeling is that the few people reading these feeds are also
those who would know to check the SAL (server admin log) if they
encountered an issue, or know how to use the RSS feed of the SAL page
if they really wanted the information in real time.
Meanwhile, we don't really have social media channels dedicated to
Wikimedia tech stuff, i.e. channels where we can actually post stuff,
links, blog posts, outage info, etc and engage with a larger community
of people interested in our tech operations. I feel that the accounts
would be much more useful if we reduced the amount of semi-random
information we post there.
So, I'm basically proposing to repurpose the @wikimediatech accounts for this.
Thoughts? Good idea? Bad idea? You don't care?
Technical Communications Manager — Wikimedia Foundation
Looking over our USERINFO files we have 82 (out of our 330) users who
obfuscate their e-mail address.
It's rather trivial to de-obfuscate them, and we will need to for the git
migration.
Have we considered asking these users if they would like a different
e-mail address to be used for git when we migrate?
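For the common "user at host dot tld" variants, de-obfuscation really is a couple of regexes. A Python sketch (the patterns are assumptions; real USERINFO entries would need checking by hand):

```python
import re

# Common obfuscation patterns seen in the wild: "user at host dot tld",
# "user(at)host(dot)tld", "user [at] host [dot] tld".
AT = re.compile(r"\s+at\s+|\s*[(\[]at[)\]]\s*", re.I)
DOT = re.compile(r"\s+dot\s+|\s*[(\[]dot[)\]]\s*", re.I)

def deobfuscate(address):
    # Replace "at" markers first, then "dot" markers; addresses that
    # are already plain pass through unchanged.
    return DOT.sub(".", AT.sub("@", address))
```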
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
What: 1.18 Triage
When: Friday, Nov, 21:30 UTC
Time zone conversion: http://hexm.de/ap
Where: #wikimedia-dev on freenode
if you don't have an IRC client
With the recent release of 1.18, I want to hold a triage to see if there
are enough issues to have a point release. This Friday, I'll be
covering bugs listed on the tracking bug
https://bugzilla.wikimedia.org/32711 to determine the severity and
number of issues.
If you know of issues not listed on that bug that have shown up in your
1.18 installation, especially regressions in MediaWiki behavior, please
add them to the tracking bug.
Hope to see you at the triage!