Are you good at swearing? WE NEED YOU
Huggle 3 comes with vandalism prediction: it precaches the diffs,
including their contents, even before they are enqueued. Each edit has
a so-called "score", a numerical value; the higher it is, the more
likely the edit is vandalism.
If you want to help us improve this feature, we need a "score words"
list defined for every wiki where Huggle is about to be used, for
example on the English wiki.
Each list has the following syntax: a list of words separated by
commas. It can contain newlines, but the commas must be present:
these, are, some, words, which, presence, of, increases, the, score,
each, word, by, 200,
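To illustrate how such a list might be applied, here is a minimal Python sketch. The words, the per-word weight of 200, and the function name are all illustrative; this is not Huggle's actual scoring code.

```python
# Hypothetical sketch of applying a "score words" list to an edit.
# Each listed word found in the edit text adds its weight (200 in the
# example list above) to the edit's score.

SCORE_WORDS = {"stupid": 200, "dumb": 200, "loser": 200}

def score_edit(diff_text):
    """Sum the weights of all score words present in the edit text."""
    words = diff_text.lower().split()
    return sum(SCORE_WORDS.get(w, 0) for w in words)

# Two matching words => 200 + 200; a higher score means the edit is
# more likely vandalism.
print(score_edit("this is a stupid dumb edit"))
```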
So, if you know English better than me, which you likely do, go ahead
and improve the configuration file there; no worries, Huggle's config
parser is very resistant to syntax errors.
If you have any other suggestions for how to improve Huggle's
prediction, go ahead and tell us!
Hi, in response to bug 54607, we've changed the semantics of the
mobileformat parameter to action=parse.
== Summary ==
Previously, it accepted the strings 'html' or 'wml' (later just
'html') and modified the structure of the output (see below). This was
problematic because you needed to retrieve the HTML from the output in
different ways, depending on whether mobileformat was specified. Now
mobileformat is a boolean parameter: if a 'mobileformat' parameter is
present in the request, it is treated as "the output should be
mobile-friendly", regardless of its value, and the output structure
stays the same. For compatibility with older callers,
mobileformat=(html|wml) will be special-cased to return the older
structure for at least 6 months from now. These changes will start
being rolled out to the WMF sites tomorrow, Tuesday
October 24th, and the process will be complete by October 31st.
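The new boolean semantics can be illustrated with a small Python sketch (the helper function below is illustrative, not part of the API itself; the parameter names match the description above):

```python
# Illustrative sketch of action=parse query parameters under the new
# semantics: the mere presence of the 'mobileformat' key requests
# mobile-friendly output; its value no longer matters.

def wants_mobile_output(params):
    """New boolean semantics: presence of the key is what counts."""
    return "mobileformat" in params

# Old-style caller (special-cased for 6 months) and new-style caller:
old_style = {"action": "parse", "page": "Foo", "mobileformat": "html"}
new_style = {"action": "parse", "page": "Foo", "mobileformat": ""}
plain     = {"action": "parse", "page": "Foo"}

assert wants_mobile_output(old_style)
assert wants_mobile_output(new_style)
assert not wants_mobile_output(plain)
```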
== Examples ==
=== Non-mobile parse ===
<parse title="..." displaytitle="...">
=== Parse that outputs mobile HTML, old style ===
<parse title="..." text="foo" displaytitle="...">
=== Parse that outputs mobile HTML, new style ===
Same as for non-mobile parses.
== FAQ ==
Q: I didn't use mobileformat before, does anything change for me?
A: No, nothing changes for you.
Q: I use mobileformat=html, will my bot/tool break now?
A: No, you will have 6 months to switch to the new style.
Q: I'm only planning to use mobileformat, what should I do?
A: Just use the new style.
Q: How did this format discrepancy appear in the first place?
A: To err is human.
Max Semenik ([[User:MaxSem]])
I'm happy to announce the availability of the second beta release of the
new MediaWiki 1.19 release series.
Please try it out and let us know what you think. Don't run it on any
wikis that you really care about, unless you are both very brave and
very confident in your MediaWiki administration skills.
MediaWiki 1.19 is a large release that contains many new features and
bug fixes. This is a summary of the major changes of interest to users.
You can consult the RELEASE-NOTES-1.19 file for the full list of changes
in this version.
Five security issues were discovered.
It was discovered that the API had a cross-site request forgery (CSRF)
vulnerability in the block/unblock modules. It was possible for a user
account with block privileges to block or unblock another user without
providing a token.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34212
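For illustration, the class of bug here is the absence of a server-side token check on a state-changing request. A minimal Python sketch of such a check (the function and its behavior are hypothetical, not MediaWiki's code):

```python
# Hypothetical sketch of the missing check: a state-changing API call
# (e.g. block/unblock) should only proceed when the caller supplies
# the token previously issued to their session. hmac.compare_digest
# performs a constant-time comparison to avoid timing leaks.
import hmac

def check_csrf_token(session_token, request_token):
    if not request_token:
        # The vulnerable code path: the module accepted requests that
        # supplied no token at all.
        return False
    return hmac.compare_digest(session_token, request_token)

assert check_csrf_token("abc123", "abc123") is True
assert check_csrf_token("abc123", "") is False
assert check_csrf_token("abc123", "forged") is False
```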
It was discovered that the resource loader can leak certain kinds of private
data across domain origin boundaries by providing the data as executable
JavaScript; the leaked data includes CSRF protection tokens. This allows
compromise of the wiki's user accounts, for example by changing a user's
email address and then requesting a password reset.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34907
Jan Schejbal of Hatforce.com discovered a cross-site request forgery (CSRF)
vulnerability in Special:Upload. Modern browsers (since at least as early as
December 2010) are able to post file uploads without user interaction,
violating previous security assumptions within MediaWiki.
Depending on the wiki's configuration, this vulnerability could lead to
compromise, especially on private wikis, where the set of allowed file types
is broader than on public wikis. Note that CSRF allows compromise of a wiki
from an external website even if the wiki is behind a firewall.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35317
George Argyros and Aggelos Kiayias reported that the method used to generate
password reset tokens was not sufficiently secure. Instead, MediaWiki now
uses various secure random number generators, depending on what is available
on the platform. Windows users are strongly advised to install either the
openssl extension or the mcrypt extension for PHP so that MediaWiki can take
advantage of the cryptographic random number facility provided by Windows.
Any extension developers using mt_rand() to generate random numbers in
contexts where security is required are encouraged to instead make use of
the MWCryptRand class introduced with this release.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35078
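MWCryptRand itself is PHP, but the underlying principle can be sketched in Python as an analogy: security-sensitive tokens must come from a cryptographically secure source, not from a seedable PRNG such as PHP's mt_rand() (or Python's random module).

```python
# Analogous example (not MediaWiki code): Python's secrets module draws
# from the operating system's CSPRNG, which is the right tool for
# password reset tokens; the random module, like mt_rand(), is not.
import secrets

def make_reset_token(nbytes=16):
    """Return a hex token with nbytes of cryptographic randomness."""
    return secrets.token_hex(nbytes)

token = make_reset_token()
# 16 random bytes encode to 32 hex characters.
assert len(token) == 32
assert all(c in "0123456789abcdef" for c in token)
```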
A long-standing bug in the wikitext parser (bug 22555) was discovered to
have security implications. In the presence of the popular CharInsert
extension, it leads to cross-site scripting (XSS). XSS may be possible with
other extensions, or perhaps even with the MediaWiki core alone, although
this is not confirmed at this time. A denial-of-service attack (infinite
loop) is also possible regardless of configuration.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35315
MediaWiki 1.19 brings the usual host of bugfixes and new features. A
comprehensive list of what's new is in the release notes.
* Bumped MySQL version requirement to 5.0.2.
* Disable the partial HTML and MathML rendering options for Math,
  and render as PNG by default. MathML mode was so incomplete most
  people thought it simply didn't work.
* New skins/common/*.css files usable by skins instead of having to copy
generic styles from MonoBook or Vector's css.
* The default user signature now contains a talk link in addition to the
  user page link.
* Searching blocked usernames in block log is now clearer.
* Better timezone recognition in user preferences.
* Extensions can now participate in the extraction of titles from URL paths.
* The command-line installer supports various RDBMSes better.
* The interwiki links table can now be accessed also when the interwiki
  cache is used (used in the API and the Interwiki extension).
* More gender support (for instance in user lists).
* Add languages: Canadian English.
* Language converter improved, e.g. it now works depending on the page
  content language.
* Time and number-formatting magic words also now depend on the page
  content language.
* Bidirectional support further improved after 1.18.
Full release notes:
Coinciding with these security releases, the MediaWiki source code has
moved from SVN (at https://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3)
to Git (https://gerrit.wikimedia.org/gitweb/mediawiki/core.git), so the
commits for these releases will not be appearing in our SVN repository. If
you use SVN checkouts of MediaWiki for version control, you will need to
migrate these to Git.
If you are using tarballs, there should be no change in the process for you.
Please note that all WMF-deployed extensions have also been migrated to
Git, along with some other non-WMF-maintained ones.
Please bear with us; some of the Git-related links for this release may not
work yet, but should later on.
To do a simple Git clone, the command is:
git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
More information is available at https://www.mediawiki.org/wiki/Git
For more help, please visit the #mediawiki IRC channel on freenode.net
(irc://irc.freenode.net/mediawiki) or email the MediaWiki-l mailing list.
Patch to previous version (1.19.0beta1), without interface text:
Interface text changes:
TL;DR SUMMARY: check out this short, silent, black & white video:
https://brionv.com/misc/ogv.js/demo/ -- anybody interested in a side
project on in-browser audio/video decoding fallback?
One of my pet peeves is that we don't have audio/video playback on many
systems, including default Windows and Mac desktops and non-Android mobile
devices, which don't ship with Theora or WebM video decoding.
The technically simplest way to handle this is to transcode videos into
H.264 (.mp4 files) which is well supported by the troublesome browsers.
Unfortunately there are concerns about the patent licensing, which have held
us up from deploying any H.264 output options even though all the software
is ready to go...
While I still hope we'll get that resolved eventually, there is an
alternative -- client-side software decoding.
We have used the Cortado <http://www.theora.org/cortado/> Java applet to
do fallback software decoding in the browser for a few years, but Java
applets are being aggressively deprecated on today's web:
* no Java applets at all on major mobile browsers
* Java usually requires a manual install on desktop
* Java applets disabled by default for security on major desktop browsers
Luckily, JavaScript engines have improved greatly in the past few
years, and performance is getting well in line with what Java applets can
do.
As an experiment, I've built Xiph's ogg, vorbis, and theora C libraries
with emscripten <https://github.com/kripken/emscripten> and written a
wrapper that decodes Theora video from an .ogv stream and
draws the frames into a <canvas> element:
* demo: https://brionv.com/misc/ogv.js/demo/
* code: https://github.com/brion/ogv.js
* blog & some details:
It's just a proof of concept -- the colorspace conversion is incomplete so
it's grayscale, there's no audio or proper framerate sync, and it doesn't
really stream data properly. But I'm pleased it works so far! (Currently it
breaks in IE, but I think I can fix that at least for 10/11, possibly for
9. Probably not for 6/7/8.)
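The missing colorspace conversion mentioned above is roughly the standard YCbCr-to-RGB transform. Here is a per-pixel Python sketch using BT.601 full-range coefficients; Theora actually uses studio-swing ranges, so this simplified version only illustrates the grayscale-to-color step:

```python
# Sketch of the YCbCr -> RGB conversion step (BT.601, full-range
# coefficients, for illustration only). The grayscale demo effectively
# uses the Y plane alone; adding the chroma terms restores color.

def yuv_to_rgb(y, cb, cr):
    """Convert one YCbCr pixel (0-255 each) to an (r, g, b) tuple."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, int(round(v))))
    return clamp(r), clamp(g), clamp(b)

# Neutral chroma (cb = cr = 128) reproduces the grayscale value,
# which is exactly what the current grayscale-only demo shows.
assert yuv_to_rgb(128, 128, 128) == (128, 128, 128)
```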
Performance on iOS devices isn't great, but is better with lower resolution
files :) On desktop it's screaming fast for moderate resolutions, and could
probably supplement or replace Cortado with further development.
Is anyone interested in helping out or picking up the project to move it
towards proper playback? If not, it'll be one of my weekend "fun" projects
I occasionally tinker with off the clock. :)
On Sun, Aug 25, 2013 at 7:46 PM, Yuvi Panda <yuvipanda(a)gmail.com> wrote:
> Hey rupert!
> On Sun, Aug 25, 2013 at 10:21 PM, rupert THURNER
> <rupert.thurner(a)gmail.com> wrote:
>> hi brion,
>> thank you so much for that! where is the source code? i tried to
>> search for "commons" on https://git.wikimedia.org/. i wanted to look
> Android: https://git.wikimedia.org/summary/apps%2Fandroid%2Fcommons.git
> iOS: github.com/wikimedia/Commons-iOS
>> if there is really no account creation at the login screen or it is
>> just my phone which does not display one, and which URL the application
> Mediawiki doesn't have API support for creating accounts, and hence
> the apps don't have create account support yet.
I created https://bugzilla.wikimedia.org/show_bug.cgi?id=53328; maybe
you could detail a little bit more what this API should look like?
I would like to have an open IRC meeting for RFC review, on Tuesday 24
September at 22:00 UTC (S.F. 3pm).
We will work through a few old, neglected RFCs, and maybe consider a
few new ones, depending on the interests of those present.
The IRC channel will be #mediawiki-rfc.
-- Tim Starling
I’m delighted to announce that Ken Snider is joining the Wikimedia
operations team. He will start as an international contractor working
remotely from Toronto, Canada on June 10, and will be visiting SF in
the week of June 17. We're currently in the process of seeking work
authorization in the United States for the Director of TechOps role.
CT has graciously agreed to support the ops leadership transition
full-time through June, and part-time through July. We’ll be starting
the handover while Ken is working remotely.
A bit more about Ken: Ken was apparently genetically predisposed to
become a sysadmin since he joined one of Canada’s first large ISPs,
Primus, straight out of school in 1997 and helped build their
infrastructure until 2001. He then joined a startup called OpenCOLA in
2001 which was co-founded by Cory Doctorow and developed early P2P
precursors to tools like BitTorrent and Steam. It’s best known today
for the development of an open source (GPL’d) cola recipe which is
still in use (more than 150,000 cans sold, if Wikipedia is to be believed).
Ken got involved in one of Cory's pet projects, BoingBoing.net (which
some of you may have heard of ;-)), and has been their sysadmin since
2003. After a stint from 2001-2005 at DataWire, Ken became Director of
Tech Ops at Federated Media, a role he held from 2005-2012.
Federated Media is an ad network that was founded to support high
traffic blogs and sites that want to stay independent of large
publishers, with a network that supports more than 1B requests/day.
One of the unusual challenges at FM was that the company grew through
acquisitions of various blogging and publishing networks. This led to
the challenge of integrating very heterogeneous operations and
engineering infrastructure, including multiple geographically
distributed ops teams and data-center locations. As DTO, Ken led these
efforts, such as OS standardization, development of a unified
deployment infrastructure, etc. Ken also ensured that the operations
group partnered effectively with the various engineering teams
developing site features and enhancements.
I want to again take this opportunity to thank CT Woo for his tireless
operations leadership since December 2010. I’d also like to thank
everyone who’s participated in the Director of TechOps search process.
Please join me in welcoming Ken to the Wikimedia Foundation and the
operations team!
VP of Engineering and Product Development, Wikimedia Foundation
Can we deprecate usage of '!ask' on IRC?
> ori-l: !ask
> wm-bot: Hi, how can we help you? Just ask your question.
It's annoying when people ask to ask, but the people who do so do it out of
insecurity or lack of experience, and so they're the last people we should
be siccing our bots on.
I've used '!ask' a lot before, but I'm going to stop. I hope others do the
same.