Interlanguage extension
Some time ago, I made the Interlanguage extension for MediaWiki (
www.mediawiki.org/wiki/Extension:Interlanguage ) with the aim of easing the
maintenance of interlanguage links by letting many wikis use the links
from a central site.
I then started a discussion on Meta (
http://meta.wikimedia.org/wiki/A_newer_look_at_the_interwiki_link ) about
whether such an extension would be acceptable, and I believe it is
generally welcomed by the community.
I wanted to flesh out the extension further and make a few additions;
in the meantime, however, a Wikipedian opened a Bugzilla request to activate
the extension on Wikimedia projects (
https://bugzilla.wikimedia.org/show_bug.cgi?id=15607 ), so I am bringing
the discussion here.
Do you think that using this extension on Wikimedia projects is feasible? Do
you think it would work while using fewer (or at least not significantly
more) resources than the current way of updating interlanguage links by bots?
svn.wikimedia.org will have a short downtime tomorrow, 9/19, around 3pm
PDT (22:00 UTC).
We're going to upgrade our soon-to-be-EOL release of Ubuntu and pick up
some new packages as a result.
The upgrade itself will take us from 7.04 -> 7.10 -> 8.04, so I'm
setting aside a whole hour in case the worst happens.
Please avoid committing during this time.
I'll mail out again when the upgrade starts and ends, and will also
mention it in the Wikimedia IRC channels.
--tomasz
You probably want wikitech-l for this question. cc'd there.
- d.
2008/9/8 jay mehta <jmenjoy05(a)yahoo.com>:
> I am doing a research project for which I needed to download the 15 GB English Wikipedia HTML dump from the link below.
> http://static.wikipedia.org/downloads/2008-06/en/wikipedia-en-html.tar.7z
> My problem is with extracting it. When I tried, it extracted only up to 32 GB, whereas it is supposed to yield around 208 GB of data. Please help me with how to extract it.
Hi, all,
Some of you may be aware that my wiki experienced a security breach
this afternoon. The error was a minor lapse in assigning
$wgGroupPermissions['user'][] = 'centralauth-merge'. This seems like
an innocuous error to start with, but it had the following impact:
1. The value 'centralauth-merge' was assigned to $wgGroupPermissions['user'][0].
2. When getting a user's rights, we use array_keys( array_filter(
$wgGroupPermissions[$group] ) ), which added the right 0 to the list.
3. PHP has a nasty little habit of letting 0 == 'some string'. Witness
the result:
> $f = array( 'foo', 'bar', 'baz', 0, 'anne' );
> print in_array( 'blah', $f );
1
4. User::isAllowed returned true for all rights if a user was in the
'user' group, because it uses in_array.
I spoke to the people in ##php, and supposedly this is expected
behaviour. They suggested we call in_array() with true as the final
parameter, which makes it compare with ===.
I've done this in r40946. For those working on a revision BEFORE this
one, PLEASE be careful. Double-check that you've assigned group
permissions with $wgGroupPermissions['user']['right'] = true, rather
than the (wrong) version I've used.
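To make the failure mode concrete, here is a minimal sketch (the right
name 'editinterface' is only an illustration):

    // Wrong: appends 'centralauth-merge' as a *value* under numeric key 0.
    $wgGroupPermissions['user'][] = 'centralauth-merge';

    // Correct: the right name is the key, the value is a boolean.
    $wgGroupPermissions['user']['centralauth-merge'] = true;

    // With loose comparison, 0 == 'editinterface' is true, so a stray
    // numeric key 0 in the rights list makes every check succeed:
    var_dump( in_array( 'editinterface', array( 'read', 'edit', 0 ) ) );       // bool(true)
    // Strict comparison (third parameter) avoids this:
    var_dump( in_array( 'editinterface', array( 'read', 'edit', 0 ), true ) ); // bool(false)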
Thanks,
--
Andrew Garrett
Hi,
Just noticed that action=raw returns 403 if the URL isn't identical to the
script path. This seems to be due to some IE security problem? Isn't this a
pain when running behind reverse proxies or servers with different names? Is
there a good way around this? My problem is that my skin loads user
scripts and styles using action=raw, and these now fail when I proxy...
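Presumably, if the 403 comes from MediaWiki comparing the request path
against $wgScript, one workaround is to configure LocalSettings.php with
the path the backend actually sees after the proxy rewrite (placeholder
values below):

    // Placeholders: use the host and script path the backend receives,
    // so the action=raw path check matches the incoming request.
    $wgScriptPath = '/w';
    $wgScript     = "$wgScriptPath/index.php";
    $wgServer     = 'http://wiki.internal.example.org';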
--
Alex Powell
Exscien Training Ltd
Tel: +44 (0) 1865 876562
Mob: +44 (0) 759 5048178
skype: alexp700
mailto:alexp@exscien.com
http://www.exscien.com
Registered in England and Wales 05927635, Unit 10 Wheatley Business Centre,
Old London Road, Wheatley, OX33 1XW, England
Hi,
The robots.txt file can now be customized for all wikis. There's a
central robots.txt file which is common to all projects and which is
maintained by the Wikimedia server administrators.
Additionally, it is now possible to add custom entries on a
per-wiki basis. Add lines to the MediaWiki:robots.txt page on your
project and they will be served as part of robots.txt, too.
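For example, a project could add lines such as the following to
MediaWiki:robots.txt (the paths are only examples); they would then be
appended to the served file:

    Disallow: /wiki/Special:Search
    Disallow: /wiki/Special:Random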
Best regards,
JeLuF
Erik tossed this interesting tidbit my way:
http://codewideopen.blogspot.com/2008/09/inkscape-shell-patch.html
Inkscape can already be used on the command-line to do SVG to PNG
conversions, but there's some folks working on patches to allow a single
process to accept multiple requests over time. This may confer a
performance advantage, avoiding startup costs, but I haven't seen any stats.
Currently we use the GNOME librsvg library for our SVG rasterizations;
it's fairly fast and of course built for embedded rasterization, but
it's traditionally been a bit behind on bug and feature implementation,
so some files don't render correctly.
I think we're currently a few versions behind on librsvg, so probably an
upgrade would help a lot, but we still toss around the idea of just
shelling out to Inkscape.
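(For anyone who wants to experiment locally: switching a test wiki over is
roughly the following in LocalSettings.php; the command template is a
sketch, so check the $wgSVGConverters entries shipped with your version.)

    // Use Inkscape instead of rsvg for SVG -> PNG thumbnailing (sketch).
    $wgSVGConverter = 'inkscape';
    $wgSVGConverters['inkscape'] =
        '$path/inkscape -z -w $width -f $input -e $output';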
Anybody interested in doing a sample run of known-bad-on-rsvg SVGs with
the latest rsvg to compare against inkscape, and maybe some performance
stats comparing rsvg, inkscape, and these experimental batch mode
inkscapes? Should be fun! :D
(You'll find lots of examples of not-rendering-right files on our
bugzilla -- search for SVG!)
-- brion
OK, this is really weird (see the forwarded message).
Here is an email I apparently received from "meta", which talks about
an edit that never existed (confirmed by Dungodung himself).
Any idea where that came from?
All the links seem legitimate and I can't find any trace of forgery.
Can anyone provide an explanation for this? Is it a bug?
Thanks
Delphine
---------- Forwarded message ----------
From: WikiAdmin <wiki(a)wikimedia.org>
Date: Fri, Sep 5, 2008 at 21:52
Subject: Meta page User:Notafish has been created by Dungodung
To: Notafish <notafishz(a)gmail.com>
Dear Notafish,
The Meta page User:Notafish has been created on 21:51, 5 September 2008 by
Dungodung, see http://meta.wikimedia.org/wiki/User:Notafish for the
current version.
This is a new page.
Editor's summary: staff member
Contact the editor:
mail: http://meta.wikimedia.org/wiki/Special:EmailUser/Dungodung
wiki: http://meta.wikimedia.org/wiki/User:Dungodung
There will be no other notifications in case of further changes unless
you visit this page.
You could also reset the notification flags for all your watched pages
on your watchlist.
Your friendly Meta notification system
--
To change your watchlist settings, visit
http://meta.wikimedia.org/wiki/Special:Watchlist/edit
Feedback and further assistance:
http://meta.wikimedia.org/wiki/Help:Help
--
~notafish
NB. This gmail address is used for mailing lists. Your emails will get lost.
Ceci n'est pas une endive - http://blog.notanendive.org
tstarling(a)svn.wikimedia.org wrote:
> Revision: 40902
> Author: tstarling
> Date: 2008-09-16 06:13:31 +0000 (Tue, 16 Sep 2008)
>
> Log Message:
> -----------
> Fixed documentation. Don't use empty() to determine if an array has zero length, that's not what it does.
>
That's exactly what it does [1].
Roan Kattouw (Catrope)
[1] http://nl2.php.net/function.empty
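For reference, a minimal illustration of what empty() reports for arrays
and for other falsy values:

    $a = array();
    $b = array( 'x' );
    var_dump( empty( $a ) ); // bool(true)  -- zero-length array
    var_dump( empty( $b ) ); // bool(false) -- non-empty array
    // empty() is also true for 0, '', null, false and unset variables,
    // which is the usual argument against using it as a pure length check.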
Hello,
I'm trying to log in to my MediaWiki server from a script and create a new
page. I can't use a browser because the input is a stream coming
from the internet. I want to collect the stream, format it, and post it
as a wiki page.
Can this be done? Has someone done this before?
Greetings
Smyley
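One common approach to this is MediaWiki's web API. Below is a rough,
untested PHP/cURL sketch: the URL, account, password and page title are
placeholders, the write API must be enabled on the wiki, and the exact
login handshake depends on the MediaWiki version.

    <?php
    // Sketch: log in to a MediaWiki via api.php and create a page.
    $api = 'http://wiki.example.org/w/api.php';   // placeholder URL
    $jar = '/tmp/wiki-cookies.txt';                // cookie jar for the session

    function apiPost( $api, $jar, $params ) {
        $ch = curl_init( $api );
        curl_setopt( $ch, CURLOPT_POST, true );
        curl_setopt( $ch, CURLOPT_POSTFIELDS, http_build_query( $params ) );
        curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
        curl_setopt( $ch, CURLOPT_COOKIEJAR, $jar );
        curl_setopt( $ch, CURLOPT_COOKIEFILE, $jar );
        $result = curl_exec( $ch );
        curl_close( $ch );
        return unserialize( $result );             // format=php responses
    }

    // 1. Log in (newer versions answer "NeedToken" first and want a second
    //    request including lgtoken -- check the result and your version).
    $login = apiPost( $api, $jar, array(
        'action' => 'login', 'format' => 'php',
        'lgname' => 'StreamBot', 'lgpassword' => 'secret',
    ) );

    // 2. Fetch an edit token for the target page.
    $query = apiPost( $api, $jar, array(
        'action' => 'query', 'format' => 'php',
        'prop'   => 'info', 'intoken' => 'edit', 'titles' => 'Stream data',
    ) );
    $page  = array_shift( $query['query']['pages'] );
    $token = $page['edittoken'];

    // 3. Create the page from the collected, formatted stream text.
    apiPost( $api, $jar, array(
        'action' => 'edit', 'format' => 'php',
        'title'  => 'Stream data',
        'text'   => 'Formatted stream content goes here',
        'token'  => $token,
        'createonly' => true,
    ) );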