One discussion we had at the Hack-a-ton was about the continued
frustration of getting features deployed to the WMF-operated sites.
Prior to Hack-a-ton, one short-term solution we started work on was
consolidating the review queue into a single place:
That page still needs some organization and a process around it to
make sure that we're actually looking at it. However, our
discussion at Hack-a-ton made it clear that even if we have a
well-tuned process there, we still have features rolling off the end
of that conveyor belt onto the floor.
After review, some (but not all) of the features in the review queue
then need a second review before being checked into the deployment branch. Our
short term answer to that was the deployment queue:
Even then, we're still not done. We have one more step, which is
launching the feature on one or more wikis. We could also create
another queue page for that. However, given the complicated workflow
here, it seems that a wiki is the wrong tool to keep track of this.
My inclination at this point is to augment the list of keywords on
Bugzilla and to use mediawiki.org both to document the process and as a
place to stash the magical queries that pull up the right lists.
We already have the "need-review" keyword. I suggest we add two more
keywords: "need-deploy" and "need-enabled". Then we add all three
keywords to feature requests that need to go through the whole process
before being deployed. For example, we'd add all three to this issue:
We'd then pick off the keywords as we step through the process (e.g.
once it's reviewed, remove the "need-review" keyword). We could then
generate three queries to get us the three queues I alluded to above:
1. Issues with all three keywords. These are features that someone
would like to see deployed and launched, but that still need review.
2. Issues with "need-deploy" and "need-enabled". These are
extensions that have been reviewed, but need to be checked into the
deployment branch.
3. Issues with "need-enabled" only. These are extensions/features
that just need action from ops.
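As a sketch, the three queries could be generated as stock-Bugzilla buglist.cgi URLs. The base URL below is a placeholder, and the parameter names assume a standard Bugzilla installation:

```python
from urllib.parse import urlencode

# Placeholder base URL; substitute the real Bugzilla installation.
BASE = "https://bugzilla.example.org/buglist.cgi"

def queue_url(*keywords):
    """Query for issues carrying every one of the given keywords."""
    return BASE + "?" + urlencode(
        {"keywords": ",".join(keywords), "keywords_type": "allwords"})

review_queue = queue_url("need-review", "need-deploy", "need-enabled")
deploy_queue = queue_url("need-deploy", "need-enabled")
enable_queue = queue_url("need-enabled")
```

One caveat: because keywords are removed as each step completes, the narrower queries also match issues still sitting in an earlier queue; in practice you would either rely on the keywords being stripped promptly or add a boolean-chart clause excluding the preceding keyword.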
Does this make sense? If so, I'll add the keywords and start
documenting the process and retrofitting existing feature requests
into this system.
I submitted a patch to trunk, and an apparently unrelated test broke.
Does that just mean it's an old test that's broken for other reasons? I
think the randomized user passwords broke the old API tests a few days
ago, but I'm not sure.
I did change the Makefile, but in a trivial way, so I could use
different PHP binaries (don't ask).
In any case I just submitted a new API test, ApiUploadTest, that works
apart from the (broken?) ApiTestSetup framework. I hope to refactor it
into something that can replace all the API tests in the near future.
It's a lot easier to read, the naming of classes is more consistent (it
made me cry to inherit from "ApiSetup"), and the randomized user
passwords work. I'm not saying it's perfect, or even particularly good,
but it's a little better.
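For illustration only, the shape described here (a shared base class whose setUp creates a test user with a randomized password) looks roughly like this; the class names echo the MediaWiki ones, but this is a generic Python sketch, not the actual PHPUnit framework:

```python
import secrets
import unittest

class ApiTestCase(unittest.TestCase):
    """Illustrative stand-in for a shared API test base class (not
    MediaWiki's actual ApiTestSetup): every run creates a test user
    with a freshly randomized password, so no test can come to depend
    on a fixed credential."""

    def setUp(self):
        self.username = "ApiTestUser"
        self.password = secrets.token_urlsafe(16)  # new password each run
        self.session = self.log_in(self.username, self.password)

    def log_in(self, user, password):
        # Placeholder for the real API login round-trip.
        return {"user": user, "token": secrets.token_hex(8)}

class ApiUploadTest(ApiTestCase):
    def test_login_uses_random_password(self):
        self.assertEqual(self.session["user"], self.username)
        self.assertGreaterEqual(len(self.password), 16)
```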
Neil Kandalgaonkar <neilk(a)wikimedia.org>
Forwarding to wikitech-l, needs more audience.
I've no love for outdated software, so I'm firmly in
the +1 camp.
---------- Forwarded message ----------
From: Ashar Voultoiz <hashar+wmf(a)free.fr>
Date: Tue, Sep 28, 2010 at 3:39 PM
Subject: [Mediawiki-l] about requiring PHP 5.2
Looking at INSTALL, it seems we are still supporting PHP 5.1, which
will be five years old in a couple of weeks. This is getting old and
prevents developers from using some newer language features.
Ideally we could raise it to 5.3 to get namespace support and closures,
but that might be too early, since most web hosts probably still run 5.2.x.
Would it be possible to consider raising the requirement to at least
5.2.0? This would give us native JSON support and, most probably, the
filter extension enabled by default. The latter can be used to speed up
input validation.
> Message: 11
> Date: Mon, 1 Nov 2010 07:29:18 +0000 (UTC)
> From: Tisza Gergő <gtisza(a)gmail.com>
> Subject: Re: [Wikitech-l] Cross wiki script importing
> To: wikitech-l(a)lists.wikimedia.org
> Message-ID: <loom.20101101T082118-175(a)post.gmane.org>
> Content-Type: text/plain; charset=utf-8
> Raimond Spekking <raimond.spekking <at> gmail.com> writes:
> > Try something like
> That will break HTTPS security though. I use this script on my home wiki:
May I ask how? If you're logged in to the secure server, then the
cookies won't get transmitted to the insecure server when loading js
from it. At the very worst (if we really put on our tin foil hats) I
suppose someone could intercept the non-secured js script, do a
man-in-the-middle attack, and replace the script with malicious js.
However, if someone actually has the ability to do that, they could
already do it with the geoip lookup. Thus I don't see how doing the
importScriptURI reduces security.
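The cookie point can be demonstrated with Python's standard http.cookiejar: a cookie flagged "secure" is attached to HTTPS requests but withheld from plain-HTTP ones (wiki.example.org is a placeholder host):

```python
from http.cookiejar import Cookie, CookieJar
import urllib.request

jar = CookieJar()
# A session cookie flagged "secure": the jar (like a browser) will
# only attach it to HTTPS requests.
jar.set_cookie(Cookie(
    version=0, name="session", value="s3cret", port=None,
    port_specified=False, domain="wiki.example.org",
    domain_specified=False, domain_initial_dot=False, path="/",
    path_specified=True, secure=True, expires=None, discard=True,
    comment=None, comment_url=None, rest={}))

def cookies_for(url):
    """Return the Cookie header the jar would send for this URL."""
    req = urllib.request.Request(url)
    jar.add_cookie_header(req)
    return req.get_header("Cookie")

print(cookies_for("https://wiki.example.org/w/index.php"))  # session=s3cret
print(cookies_for("http://wiki.example.org/w/index.php"))   # None
```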
I am the administrator of Jmol wiki, http://wiki.jmol.org, and I am trying
to authorize the upload of files with extensions .pdb, .mol, ... (chemistry
file formats).
It was working some time ago, but apparently it's not working any more.
I don't know when it stopped working (maybe when upgrading MediaWiki, but
I'm not sure at all).
The error message is "File is corrupt or the extension does not match the
file type".
We are currently using MediaWiki 1.14.0 (but I could upgrade if required,
just needs some work to change the Jmol extension to work with 1.16)
Our current configuration is:
In LocalSettings.php:
$wgEnableUploads = true;
// Append to the default lists instead of overwriting them:
$wgFileExtensions[] = 'cml';
$wgFileExtensions[] = 'ico';
$wgFileExtensions[] = 'mol';
$wgFileExtensions[] = 'pdb';
$wgFileExtensions[] = 'xyz';
$wgTrustedMediaFormats[] = 'chemical/x-pdb';
$wgTrustedMediaFormats[] = 'chemical/x-xyz';
In includes/mime.types :
Can anyone help us find out what's going on?