It seems r40830, which merged the LinkSearch extension into core, as a
side effect changed the canonical name of Special:Linksearch from
"Linksearch" to "LinkSearch".
This happened to break one of my user scripts on en.wikipedia, which was
checking the canonical name in order to add some extra links to the
output of that special page. I realize that this is a minor annoyance,
and that fixing it took me all of one minute once I figured out what was
wrong, but could folks still _please_ try to avoid doing that in the
future? The canonical names are supposed to be canonical for a reason,
so that scripts can reliably tell what page they're running on. Don't
break them without a good reason.
Thank you, and we now return you to your regularly scheduled programming.
--
Ilmari Karonen
At 14:03:09 UTC, our Amsterdam server cluster suffered from a partial
power outage. One out of two power feeds in each rack went down for
approximately 6 seconds, causing servers which are not redundantly
connected to go down. The following statement was released by the data
center:
> Amsterdam, 04 October 2008
> Subject: Power disruption at SARA 04-10-08
>
> On October 4th, approximately between 16:03 hrs and 18:00 hrs, a large
> area of Amsterdam suffered from a severe power outage. Unfortunately,
> Science Park Amsterdam, where one of the SARA Datacenters is
> located, was also affected by this failure.
>
> The Emergency Power Supply took over the major part of the power
> delivery, but unfortunately one of our four UPS systems malfunctioned.
> As a result some customers experienced a short outage of approximately 6
> seconds on one of their two power feeds. After the 6 seconds the
> generators came online and provided full power on all feeds, including
> the affected one. However, a small number of racks needed a manual reset
> and were affected longer. The failure had consequences for, among others,
> internet traffic and some other SARA services. At this moment most
> services are restored.
>
> By now the external power situation is stable and normal again. SARA is
> investigating the cause of the malfunction of the UPS system. If needed,
> the generator will preventively be put online and kept running.
>
> We are sorry for the inconvenience. If you need assistance you can
> contact us for support from one of our on-site engineers.
Approximately half an hour after the start of the power loss incident,
we experienced some strange additional problems, with the servers in one
of the racks all shutting down in the course of a few minutes. We think this
may have occurred due to rising temperatures in that rack, or the
systems being explicitly turned off by on-site personnel. At that point,
traffic was moved away from our Amsterdam cluster. Other racks were
unaffected.
The power supply is now stable again, and traffic has been moved back to
return to the normal situation.
--
Mark Bergsma <mark(a)wikimedia.org>
System & Network Administrator, Wikimedia Foundation
Hey all --
I hereby invite you all to the first official regular weekly MediaWiki
Bug Monday, to occur on October 6*.
Come hang out in #mediawiki on irc.freenode.net and help us verify and
clean up bug reports, raise submitted patches for review and
application, and generally *try to break stuff*. :D
It'll be awesome!
* (Pick a timezone, any timezone, just show up when you can -- even you
Australians!)
-- brion
Dear List Members,
Is there a list of the protected pages of the English Wikipedia? I would
like to extract the following information:
<page ID> <protection status>
<page ID> <protection status>
<page ID> <protection status>
...
Thanks.
One of the problems we've been seeing is that our code review procedure
doesn't always scale well. We have a fairly large number of committers,
and a pretty liberal policy about committing new code to trunk -- but we
also need things to _work_ consistently so we can keep the production
code up to date.
Traditionally, code that's been committed to SVN gets reviewed offline
by me or Tim before we push things out live. If we find problems, we fix
them up or revert the code to be redone correctly later.
There are a couple of big problems with this:
1) We can't easily coordinate our notes; I can't see what Tim's reviewed
and what he hasn't. We end up either duplicating effort or missing things.
2) If we're both busy, sick, on vacation, etc., sometimes it just doesn't
get done!
3) If we want more people to pitch in, coordination gets even harder.
In my spare time over the last few weeks I've thrown together a little
CodeReview extension for MediaWiki to help with this. It pulls the SVN
revision data as commits are made and presents an interface on the wiki
where we can see what's been reviewed, tag problems, and add comments
for follow-up issues.
Yesterday I went ahead and put it live:
http://www.mediawiki.org/wiki/Special:Code/MediaWiki
The UI's still a little rough, and not all linking and metadata features
are implemented; Aaron's going to help polish it up. :) But it's already
useful, and I've got some revisions flagged as fixme...
Feel free to try it out, and add notes for things which need work at:
http://www.mediawiki.org/wiki/CodeReview_todo_list
(or on Bugzilla)
Currently comments are open to any registered user on the wiki; status
changes and tagging updates are limited to the 'coder' group, which is
viral -- any coder can make another user a coder.
-- brion
Hi folks,
Thanks. I found my way around the mangled HTML (produced by the parser, not by Tidy; Tidy is off in my wiki).
The solution was to strip all non-meaningful spaces and line breaks from the HTML. Smarty templates do that when the template is wrapped in {strip}{/strip} tags.
It would be nice to have an optional flag in the parser function return value, like "verbatim" => true, alongside the existing "noparse" and "isHTML" flags, or simply to have "isHTML" automatically suppress the insertion of <p><br /></p>.
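For reference, here is a rough sketch of a parser function that returns raw HTML using the existing "noparse" and "isHTML" flags, with the whitespace stripping done by hand (the hook and function names are made up, and magic word registration is omitted):

$wgHooks['ParserFirstCallInit'][] = 'efVerbatimSetup';

function efVerbatimSetup( $parser ) {
	$parser->setFunctionHook( 'verbatimhtml', 'efVerbatimRender' );
	return true;
}

function efVerbatimRender( $parser, $input = '' ) {
	// Strip whitespace between tags ourselves, since in practice 'isHTML'
	// alone does not keep the parser from inserting <p><br /></p>.
	// (Illustrative only: real code should sanitize $input.)
	$html = preg_replace( '/>\s+</', '><', $input );
	return array( $html, 'noparse' => true, 'isHTML' => true );
}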
Best,
Evgeny.
The MediaWiki version number, $wgVersion in
includes/DefaultSettings.php, wasn't updated in the release of 1.12.1
and 1.13.2. So these releases will report themselves as being 1.12.0
and 1.13.1 respectively, in [[Special:Version]] and other places. New,
corrected tarballs have been uploaded.
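(Concretely, includes/DefaultSettings.php in the corrected 1.13.2 tarball should now contain something like

$wgVersion = '1.13.2';

and '1.12.1' in the corrected 1.12.1 tarball, rather than the old strings.)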
New SHA-1 checksums:
c6f6e404ee9152deeec63cdc3278a2a57d556efe mediawiki-1.13.2.tar.gz
046206b342904cb729fd076ddd101534e23b6c07 mediawiki-1.13.2.patch.gz
4500cde3e60351ae2fc0382b8e91654f4cb6a0ff mediawiki-i18n-1.13.2.patch.gz
6f315f88a481daa1a92b1a409e92e036aaca610b mediawiki-1.12.1.tar.gz
1aed1e8083ebe98e884c924dad174c2fb1537d8b mediawiki-1.12.1.patch.gz
71fb3d06c1fe331fecf25cca92ea50bb07ce7465 mediawiki-i18n-1.12.1.patch.gz
New MD5 checksums:
e10f791ba9ecd02dd751a5676cc84405 mediawiki-1.13.2.tar.gz
2e33ed21c5e889f546556066a2b53806 mediawiki-1.13.2.patch.gz
db1e3b46e04a2608ea5429d73465ad03 mediawiki-i18n-1.13.2.patch.gz
00229272c5e1881ff36a07ca95891ca2 mediawiki-1.12.1.tar.gz
885c6dc5bce177563c3d7a14e5167411 mediawiki-1.12.1.patch.gz
82f874b72a65e71e41f4dadf410e0eec mediawiki-i18n-1.12.1.patch.gz
-- Tim Starling
brion(a)svn.wikimedia.org wrote:
> Revision: 41458
> Author: brion
> Date: 2008-09-30 23:04:24 +0000 (Tue, 30 Sep 2008)
>
> Log Message:
> -----------
> Add an API module that can be POSTed to to trigger a codereview repo update
>
> + $repo = CodeRepository::newFromName( $params['repo'] );
> + if( !$repo ){
> + throw new MWException( "Invalid repo {$args[0]}" );
> + }
Throwing an MWException on invalid input is not very nice. You should
really use something like:
$this->dieUsage("Invalid repo ``{$args[0]}''", 'invalidrepo');
If you have some kind of array containing all valid repos lying around,
you could use that in getAllowedParams() so the repo parameter will
automatically be validated by extractRequestParams().
> + $endRev = intval( $params['rev'] );
You can also set the rev parameter to be an integer in
getAllowedParams(), so you won't have to use intval() (and so
non-integers will be rejected instead of being silently converted to 0).
> + if( $lastStoredRev >= $endRev ) {
> + // Nothing to do, we're up to date.
> + return;
> + }
You should still output *something* here (see also below).
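For example, a quick sketch (the 'uptodate' key name is made up) of reporting that there was nothing to do, using the same result machinery as in the bigger example below:

$this->getResult()->addValue( null, $this->getModuleName(),
	array( 'uptodate' => '' ) );
return;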
> + // fixme: this could be a lot?
Yeah, you gotta think about max exec time here. What you could do is
just process the first 50 or so and tell the client where you stopped,
so they can issue another request for the next 50.
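A rough sketch of what that could look like, reusing the variables from the code above (the batch size of 50 is arbitrary):

$batchSize = 50; // arbitrary cap on revisions handled per request
if ( $endRev > $lastStoredRev + $batchSize ) {
	$endRev = $lastStoredRev + $batchSize;
}
// ... fetch and save revisions up to $endRev as before, then return
// $endRev as a 'continue' value so the client can POST again from
// $endRev + 1.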
> + $log = $svn->getLog( '', $lastStoredRev + 1, $endRev );
> + if( !$log ) {
> + throw new MWException( "Something awry..." );
> + }
I don't know in which circumstances $log would be empty or false, but
unless it only happens in really crazy cases you should use dieUsage()
here as well.
> + foreach( $log as $data ) {
> + $codeRev = CodeRevision::newFromSvn( $repo, $data );
> + $codeRev->save();
> + // would be nice to output something but the api code is weird
> + // and i don't feel like figuring it out right now :)
> + }
You probably wanna do something like this:
$data = array();
foreach ( $log as $entry ) {
	// Do whatever you need to do
	$data['saved'][] = array( 'revid' => $revid, /* more properties here */ );
}
$data['continue'] = $revidToContinueFrom;
// Tell the XML formatter to use <rev> tags for the $data['saved'] array.
// You need to call this function on every array with integer indices,
// or ApiFormatXml will scream and die.
$this->getResult()->setIndexedTagName( $data['saved'], 'rev' );
// Add $data at the top level (that's what the null does) in a
// <codeupdate> tag (which is what getModuleName() returns).
$this->getResult()->addValue( null, $this->getModuleName(), $data );
The resulting XML will look like:
<api>
<codeupdate continue="56">
<saved>
<rev revid="54" />
<rev revid="55" />
<rev revid="56" />
</saved>
</codeupdate>
</api>
> + public function getAllowedParams() {
> + return array(
> + 'repo' => null,
> + 'rev' => null );
> + }
A better getAllowedParams() array would probably be:
// This assumes $wgAvailableRepos is an array of
// (surprise) available repos. If there is no such
// array, you should still use 'repo' => null like
// you did initially.
array(
'repo' => array(
ApiBase::PARAM_TYPE => $wgAvailableRepos
),
'rev' => array(
ApiBase::PARAM_TYPE => 'integer'
)
)
Roan Kattouw (Catrope)
Hello,
Redirects on the following domains are broken; I'm getting a "Wiki
does not exist" error.
wikibooks.cz, wikicitaty.cz, wikidruhy.cz, wikiknihy.cz, wikipedia.cz,
wikipedie.cz, wikislovnik.cz, wikisource.cz, wikispecies.cz,
wikiversity.cz, wikiverzita.cz, wikizpravy.cz, wiktionary.cz, wikimedia.cz
The only one working is wikiquote.cz, but www.wikiquote.cz is broken like
the ones above. Does it have something to do with yesterday's redirect.conf
issue?
Thanks for your help.
--
Vojtech Hala (aka Egg), MFF UK, Prague