I'm happy to announce the availability of the second beta release of the
new MediaWiki 1.19 release series.
Please try it out and let us know what you think. Don't run it on any
wikis that you really care about, unless you are both very brave and
very confident in your MediaWiki administration skills.
MediaWiki 1.19 is a large release that contains many new features and
bug fixes. This is a summary of the major changes of interest to users.
You can consult the RELEASE-NOTES-1.19 file for the full list of changes
in this version.
Five security issues were discovered.
It was discovered that the API had a cross-site request forgery (CSRF)
vulnerability in the block/unblock modules. It was possible for a user
account with block privileges to block or unblock another user without
providing a token.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34212
It was discovered that the resource loader can leak certain kinds of
private data across domain origin boundaries, by providing the data as an
executable JavaScript file. In MediaWiki 1.18 and later, this includes the
leaking of CSRF protection tokens. This allows compromise of the wiki's
user accounts, say by changing the user's email address and then
requesting a password reset.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34907
Jan Schejbal of Hatforce.com discovered a cross-site request forgery (CSRF)
vulnerability in Special:Upload. Modern browsers (since at least as early as
December 2010) are able to post file uploads without user interaction,
violating previous security assumptions within MediaWiki.
Depending on the wiki's configuration, this vulnerability could lead to
further compromise, especially on private wikis where the set of allowed
file types is broader than on public wikis. Note that CSRF allows
compromise of a wiki from an external website even if the wiki is behind a
firewall.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35317
George Argyros and Aggelos Kiayias reported that the method used to
generate password reset tokens was not sufficiently secure. Instead, we
now use various more secure random number generators, depending on what is
available on the platform. Windows users are strongly advised to install
either the openssl extension or the mcrypt extension for PHP so that
MediaWiki can take advantage of the cryptographic random number facility
provided by Windows.
Any extension developers using mt_rand() to generate random numbers in
contexts where security is required are encouraged to instead make use of
the MWCryptRand class introduced with this release.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35078
A long-standing bug in the wikitext parser (bug 22555) was discovered to
have security implications. In the presence of the popular CharInsert
extension, it leads to cross-site scripting (XSS). XSS may be possible
with other extensions or perhaps even with the MediaWiki core alone,
although this is not confirmed at this time. A denial-of-service attack
(infinite loop) is also possible regardless of configuration.
For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35315
*********************************************************************
What's new?
*********************************************************************
MediaWiki 1.19 brings the usual host of various bugfixes and new features.
A comprehensive list of what's new is in the release notes.
* Bumped MySQL version requirement to 5.0.2.
* Disable the partial HTML and MathML rendering options for Math,
and render as PNG by default.
* MathML mode was so incomplete most people thought it simply didn't work.
* New skins/common/*.css files usable by skins instead of having to copy
piles of generic styles from MonoBook or Vector's CSS.
* The default user signature now contains a talk link in addition to the
user link.
* Searching for blocked usernames in the block log is now clearer.
* Better timezone recognition in user preferences.
* Extensions can now participate in the extraction of titles from URL paths.
* The command-line installer supports various RDBMSes better.
* The interwiki links table can now also be accessed when the interwiki
cache is used (used in the API and the Interwiki extension).
Internationalization
--------------------
* More gender support (for instance in user lists).
* Added language: Canadian English.
* Language converter improved, e.g. it now works depending on the page
content language.
* Time and number-formatting magic words also now depend on the page
content language.
* Bidirectional support further improved after 1.18.
Release notes
-------------
Full release notes:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob_plain;f=RELEASE-NOTES-1.19;hb=1.19.0beta2
https://www.mediawiki.org/wiki/Release_notes/1.19
Coinciding with these security releases, the MediaWiki source code
repository has moved from SVN (at
https://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3) to Git
(https://gerrit.wikimedia.org/gitweb/mediawiki/core.git), so the relevant
commits for these releases will not be appearing in our SVN repository. If
you use SVN checkouts of MediaWiki for version control, you need to
migrate these to Git.
If you are using tarballs, there should be no change in the process for
you.
Please note that all WMF-deployed extensions have also been migrated to
Git, along with some other non-WMF-maintained ones.
Please bear with us: some of the Git-related links for this release may
not work immediately, but they should do so later on.
To do a simple Git clone, the command is:
git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git
More information is available at https://www.mediawiki.org/wiki/Git
For more help, please visit the #mediawiki IRC channel on freenode.net
(irc://irc.freenode.net/mediawiki) or email the MediaWiki-l mailing list
at mediawiki-l(a)lists.wikimedia.org.
**********************************************************************
Download:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.tar.gz
Patch to previous version (1.19.0beta1), without interface text:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.patch.gz
Interface text changes:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.0beta2.patch.gz
GPG signatures:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.0beta2.patch.gz.sig
Public keys:
https://secure.wikimedia.org/keys.html
How to load high-resolution imagery on high-density displays has been an
open question for a while; we've wanted this for the mobile web site since
the Nexus One and Droid brought 1.5x density displays, and the iPhone 4
brought 2.0x density displays, to the mobile world a couple of years back.
More recently, tablets and a few laptops are bringing 1.5x and 2.0x density
displays too, such as the new Retina iPad and MacBook Pro.
A properly responsive site should be able to detect when it's running on
such a display and load higher-density image assets automatically...
Here's my first stab:
https://bugzilla.wikimedia.org/show_bug.cgi?id=36198#c6
https://gerrit.wikimedia.org/r/#/c/24115/
* adds $wgResponsiveImages setting, defaulting to true, to enable the
feature
* adds jquery.hidpi plugin to check window.devicePixelRatio and replace
images with data-src-1-5 or data-src-2-0 depending on the ratio
* adds mediawiki.hidpi RL script to trigger hidpi loads after main images
load
* renders images from wiki image & thumb links at 1.5x and 2.0x and
includes data-src-1-5 and data-src-2-0 attributes with the targets
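The attribute names above come from the patch; the selection rule it
describes can be sketched in plain JavaScript (a simplified illustration,
not the actual jquery.hidpi code):

```javascript
// Pick the best source for a given device pixel ratio.
// attrs maps attribute names to candidate URLs, as emitted by the parser.
function pickSrc(ratio, attrs) {
  if (ratio > 1.5 && attrs['data-src-2-0']) {
    return attrs['data-src-2-0'];
  }
  if (ratio > 1 && attrs['data-src-1-5']) {
    return attrs['data-src-1-5'];
  }
  return attrs.src; // standard-density fallback
}

// In a browser this would run over document.images using
// window.devicePixelRatio; here we just exercise the selection rule.
var attrs = {
  src: 'Example.png',
  'data-src-1-5': 'Example-1.5x.png',
  'data-src-2-0': 'Example-2x.png'
};
console.log(pickSrc(1, attrs));   // Example.png
console.log(pickSrc(1.5, attrs)); // Example-1.5x.png
console.log(pickSrc(2, attrs));   // Example-2x.png
```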
Note that this is a work in progress. There will be places where this
doesn't yet work because they output their imgs differently. If you move
from a low-DPI to a high-DPI screen on a MacBook Pro with Retina display,
you won't see the high-resolution images load until you reload the page.
I've confirmed that basic images and thumbs in wikitext appear to work in
Safari 6 on a MacBook Pro with Retina display. (It should work in Chrome
as well.) The same code loaded on a MobileFrontend display should also
work, but I have not yet attempted that.
Note this does *not* attempt to use native SVGs, which is another potential
tactic for improving display on high-density displays and zoomed windows.
This loads higher-resolution raster images, including rasterized SVGs.
There may be loads of bugs; this is midnight hacking code and I make no
guarantees of suitability for any purpose. ;)
-- brion
hi-
i'm hopeful this is the appropriate venue for this topic - i recently
had occasion to visit #mediawiki on freenode, looking for help. i found
myself a bit frustrated by the amount of bot activity there and wondered
if there might be value in some consideration for this. it seems to
frequently drown out/dilute those asking for help, which can be a bit
discouraging/frustrating. additionally, from the perspective of those
who might help [based on my experience in this role in other channels],
constant activity can sometimes engender disinterest [e.g. the irc
client shows activity in the channel, but i'm less inclined to look as
it's probably just a bot].
to offer one possibility - i know there are a number of mediawiki and/or
wikimedia related channels - might there be one in which bot activity
might be better suited, in the context of less contention between the
two audiences [those seeking help vs. those interested in development,
etc]? one nomenclature convention that seems to be at least somewhat of
a de facto standard is #project for general help, and #project-dev[el]
for development topics. a few examples of this i've seen are android,
libreoffice, python, and asterisk. adding yet another channel to this
list might not be terribly welcome, but maybe the distinction would be
worth the addition?
as i'm writing this, i see another thread has begun wrt freenode, and i
also see a bug filed that relates at least to some degree
[https://bugzilla.wikimedia.org/show_bug.cgi?id=35427], so i may just be
repeating an existing sentiment, but i wanted to at least offer a brief
perspective.
regards
-ben
On Tue, Jul 24, 2012 at 10:25 PM, Steven Walling <steven.walling <at>
gmail.com> wrote:
> But do we have a plan for improving Gerrit in a substantial way?
Hi everyone,
In my response to Steven at the time [1], I indicated that we have a
modest contractor budget for this work. The RFP is now posted here:
http://hire.jobvite.com/Jobvite/Job.aspx?j=o4gIWfwI&c=qSa9VfwQ
Please let me know if you're interested (and apply if you're really
interested). Also, please let me know if you have any questions.
Thanks!
Rob
[1] http://article.gmane.org/gmane.science.linguistics.wikipedia.technical/62630
On Fri, Jun 15, 2012 at 8:48 AM, Sumana Harihareswara
<sumanah(a)wikimedia.org> wrote:
> If you merge into mediawiki/core.git, your change is considered safe for
> inclusion in a wmf branch. The wmf branch is just branched out of
> master and then deployed. We don't review it again. Because we're
> deploying more frequently to WMF sites, the code review for merging into
> MediaWiki's core.git needs to be more like deployment/shell-level
> review, and so we gave merge access to people who already had deployment
> access. We have since added some more people. The current list:
> https://gerrit.wikimedia.org/r/#/admin/groups/11,members
Let me elaborate on this. As unclear as our process is for giving
access, it's even less clear what our policy is for taking it away.
If we can settle on a policy for taking access away/suspending access,
it'll make it much easier to loosen up about giving access.
Here's the situation we want to avoid: we give access to someone who
probably shouldn't have it. They continually introduce deployment
blockers into the code, making us need to slow down our frequent
deployment process. Two hour deploy windows become six hour deploy
windows as we need time to fix up breakage introduced during the
window. Even with the group we have, there are times where things
that really shouldn't slip through do. It's manageable now, but
adding more people is going to multiply this problem as we get back
into a situation where poorly conceived changes become core
dependencies.
We haven't had a culture of making a big deal about the case when
someone introduces a breaking change or does something that brings the
db to its knees or introduces a massive security hole or whatever.
That means that if the situation were to arise that we needed to revoke
someone's access, we would have to wait until it got egregious and awful,
and even then the person is likely to be shocked that their rights are
being revoked (if we even do it then). To be less
conservative about giving access, we also need to figure out how to be
less conservative about taking it away. We also want to be as objective
about it as is reasonable. It's always going to be somewhat
subjective, and we don't want to completely eliminate the role of
common sense.
It would also be nice if we didn't have to resort to the nuclear
option to get the point across. One low-stakes way we can use to make
sure people are more careful is to have some sort of rotating "oops"
award. At one former job I had, we had a Ghostbusters Stay Puft doll
named "Buster" that was handed out when someone broke the build that
they had to prominently display in their office. At another job, it
was a pair of Shrek ears that people had to wear when they messed
something up in production. In both cases, it was something you had
to wear until someone else came along. Perhaps we should institute
something similar (maybe as simple as asking people to append "OOPS"
to their IRC nicks when they botch something).
Rob
Hi all!
Since https://gerrit.wikimedia.org/r/#/c/21584/ got merged, people have been
complaining that they get tons of warnings. A great number of them seem to
be caused by the fact that MediaWiki will, if the DBO_TRX flag is set,
automatically start a transaction on the first call to Database::query().
See e.g. https://bugzilla.wikimedia.org/show_bug.cgi?id=40378
The DBO_TRX flag appears to be set by default in SAPI (mod_php) mode. According
to the (very limited) documentation, it's intended to wrap the entire web
request in a single database transaction.
However, since we do not have support for nested transactions, this
doesn't work: the "wrapping" transaction gets implicitly committed when
begin() is called to start a "proper" transaction, which is often the case
when saving new revisions, etc.
So, DBO_TRX seems to be misguided, or at least broken, to me. Can someone
please explain why it was introduced? It seems the current situation is
this:
* every view-only request is wrapped in a transaction, for no good reason
I can see.
* any write operation that uses an explicit transaction, like page
editing, watching pages, etc., will break the wrapping transaction (and
cause a warning in the process). As far as I understand, this really
defeats the purpose of the automatic wrapping transaction.
So, how do we solve this? We could:
* suppress warnings if the DBO_TRX flag is set. That would prevent the logs from
being swamped by transaction warnings, but it would not fix the current broken
(?!) behavior.
* get rid of DBO_TRX (or at least not use it per default). This seems to be the
Right Thing to me, but I suppose there is some point to the automatic
transactions that I am missing.
* Implement support for nested transactions, either using a counter (this
would at least make DBO_TRX work as I guess it was intended) or using
savepoints (which would give us support for actual nested transactions).
That would be the Real Solution, IMHO.
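For what it's worth, the counter variant of that last option can be
sketched as follows (a sketch of the idea, not actual Database class
code): only the outermost begin() and commit() touch the real connection,
so an inner "proper" transaction no longer implicitly commits the wrapper:

```javascript
// Sketch of transaction nesting via a depth counter.
class NestingConnection {
  constructor() {
    this.depth = 0;
    this.log = [];  // records what would actually be sent to the database
  }
  begin() {
    if (this.depth === 0) {
      this.log.push('BEGIN');  // only the outermost begin is real
    }
    this.depth++;
  }
  commit() {
    if (this.depth === 0) {
      throw new Error('commit() without begin()');
    }
    this.depth--;
    if (this.depth === 0) {
      this.log.push('COMMIT');  // only the outermost commit is real
    }
  }
}

const conn = new NestingConnection();
conn.begin();   // DBO_TRX-style wrapping transaction
conn.begin();   // "proper" transaction, e.g. saving a revision
conn.commit();  // decrements the counter; nothing sent to the DB
conn.commit();  // outermost commit: the real COMMIT happens here
console.log(conn.log);  // [ 'BEGIN', 'COMMIT' ]
```

Note the counter approach cannot roll back just the inner transaction;
that is what savepoints would add.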
So, can someone shed light on what DBO_TRX is intended to do, and how it is
supposed to work?
-- daniel
tl;dr: Please help in getting https://bugzilla.wikimedia.org/38638
resolved and ensure that your code doesn't cause additional issues for
translators.
At translatewiki.net we have an [ask question] button for translators,
and they have really used it to report various kinds of issues with
the messages they are translating. In fact, we barely have time to
sort those things into correct places, let alone poking developers to
act on them. We are experimenting with different ways to improve this
process – the last experiment is using Semantic MediaWiki to track
open issues, see [1].
Basically, the backlog is growing, and we need some help to reverse the
direction. We have tried filing bugs and poking developers on IRC and via
email, but that is not scaling anymore; moreover, bugs and pokes are
sometimes ignored [2].
There are many ways you can help:
1. Follow the best i18n practices like documenting new messages as
they are added [4]. Some of you may have noticed that we have started
to review any i18n and L10n changes more closely. This is not to annoy
you. This is to help you write code that can be translated better and
without too many questions from translators.
2. Have a look at [1] and act on issues.
3. Have a look at [1] and poke someone else to act on issues.
4. We need maintainers for [[Support]] [3] to sort stuff into the
correct places.
5. When committing code which has i18n changes, add Nikerabbit (me) or
Siebrand to reviewers.
6. Ideas on how to improve this process are welcome.
We think that we are all responsible for providing them with the
information they need. We feel it's our (translatewiki.net staff's)
obligation to make sure translators can do their work without too many
distractions and wasted efforts.
[1] https://translatewiki.net/wiki/Support/Open_requests
[2] https://bugzilla.wikimedia.org/38638
[3] https://translatewiki.net/wiki/Support
[4] https://www.mediawiki.org/wiki/Localisation
--
Niklas Laxström
Siebrand Mazeland
Raimond Spekking
Federico Leva
Amir E. Aharoni
Hello everyone,
This is one of my major updates regarding my GSoC project (named
ConventionExtension), which I have been working on for about three months
now. This project has come a long way and it has reached a point where a
lot about it can be shared with others. Since I don't post that often in
this list, I would like to make this post a long one and talk about the
status of my extension and where it's headed in the coming weeks. Some of
the features which were part of my timeline for GSoC but were not completed
are put under the section "Things yet to be done" along with the other
features that I would be working on in the upcoming weeks.
*1. Completed Features*
1. Dashboard Page (more features are likely to be added depending upon the
feedback I gather from the people who have set up conferences on their wiki
in the past)
2. Author Registration Page
3. Conference Setup Page
4. Backend (DB) for storing the conference details
5. The basic architecture of the extension:
5.a) Model classes - encapsulating the basic objects required for this
extension
5.b) Api Module -- for interacting with ajax calls from the client
5.c) Util classes
5.d) Templates -- classes extending QuickTemplate class, providing a basic
layout for Dashboard and Author Register pages
5.e) UI classes - classes extending SpecialPage class (Dashboard,
AuthorRegister and ConferenceSetup pages)
5.f) JS + CSS resource modules
6. Parser tags, Magic Words (Variables) and a parser function
parser tags --> <conference>, <page>, <account>, <registration>,
<passport>, <author>, <submission>, <event>, <organizer> and <location>
variables --> {{CONFERENCENAME}}, {{CONFERENCEVENUE}},
{{CONFERENCECITY}}, {{CONFERENCEPLACE}}, {{CONFERENCECOUNTRY}},
{{CONFERENCECAPACITY}}, {{CONFERENCEDESCRIPTION}}
parser function --> {{#cvext-page-link}}
7. Sidebar modification (added some new portals for the conference)
8. Schedule Template System - which automates the process of creating a
schedule for the conference, as new locations and events are added to the
system.
9. Content Pages - these are the default set of pages that are created for
the conference by the extension (Note : these are just like any other wiki
pages whose content can be modified using the wiki interface)
*2. Things yet to be done !*
1. *DB rollback implementation in most of my model classes
2. *Account Setup Page (for registration of users)
3. *Modification of User pages for displaying content related to the
conference
4. Organizer management module (most of it is already implemented in the
basic architecture; just some additions are needed regarding the
permissions and rights for this group)
5. Payment Gateway
6. Support for languages other than English
7. Some more parser functions and variables which would help in editing
the content pages of the conference
* - These features were not completed during the GSoC period.
I really enjoyed my experience of working with such a vibrant community
over this summer. I am especially thankful to all the people who helped me
out in the IRC channel, whether with setting up Labs, with localisation
issues, or even by suggesting that I come up with a better feature than
what I had already implemented. Other community members who reviewed my
big chunks of code pointed out many issues I had easily missed, with
proper explanations of what needed to be done, and they have helped me a
great deal in improving it. And finally I would like to thank Sumana and
Greg for managing this program so well, and my mentor Jure Kajzer for his
unmatched support and guidance throughout the summer.
Some important links:
Proposal Page -
http://www.mediawiki.org/wiki/User:Chughakshay16/GSOCProposal(2012)
Gerrit changesets -
https://gerrit.wikimedia.org/r/#/q/ConventionExtension,n,z
Extension Page - http://www.mediawiki.org/wiki/Extension:ConventionExtension
Suggestions are always welcome !
--
Thanks,
Akshay Chugh
skype- chughakshay16
irc - chughakshay16(#mediawiki)
[[User:Chughakshay16]] on mediawiki.org
On Mon, Oct 15, 2012 at 1:44 AM, S Page <spage(a)wikimedia.org> wrote:
> In the promised land, developers use vagrant to run local VM instances on
> laptops that puppet configures to run a production-ish MediaWiki. At Etsy
> and Facebook, the day a developer walks in she can make changes in her
> personal VM and push them to production (or so they claim in blog posts and
> meetups ;-) ).
The new Mozilla Kuma project is a good example for this as well, and
unlike the aforementioned ones, you can download the VM yourself. See:
https://github.com/mozilla/kuma/
https://github.com/mozilla/kuma/blob/master/docs/installation-vagrant.rst
Relevant blog post about their Vagrant setup:
http://decafbad.com/blog/2011/10/02/putting-clouds-in-boxes
Some design notes:
https://wiki.mozilla.org/Webdev:DevBoxVMImages
We have an awesome and unique infrastructure with Labs for testing and
staging, but for local development, having a pre-packaged dev
environment (probably slightly less ambitious than beta.wmflabs) would
indeed seem very useful. How feasible/useful would it be to build on
the existing work, e.g. Andrew Bogott's MediaWiki class, to provide a
first iteration of such an environment? [1]
Erik
[1] https://gerrit.wikimedia.org/r/gitweb?p=operations/puppet.git;a=blob_plain;…
--
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate