I'd like to express my congratulations to the developer who simply
"bolded" those page titles (in watchlist) for watched pages having
unseen changes.
I like that; it's simple and efficient.
Tom
P.S.
My garish-green updated marker has other, further advantages as
explained in [1], but admittedly not everyone liked that; users can
opt-out in EnotifWiki versions.
[1]
http://meta.wikimedia.org/wiki/Email_notification_%28documentation%29#Updat…
MediaWiki 1.5 beta 1 is a preview release of the new 1.5 release series,
and is pretty much feature complete. There are several known bugs, and
likely a number of unknown ones; this release is not recommended for use
in a production environment, but it is recommended for testing ahead of
an upcoming deployment.
A number of significant changes have been made since the alpha releases,
including database changes and a reworking of the user permissions
settings. See the file UPGRADE for details of upgrading and changing
your prior configuration settings for the new system.
For a relatively full list of changes since 1.5alpha2, see the changelog
in the release notes.
Release notes:
http://sourceforge.net/project/shownotes.php?release_id=337757
Download:
http://prdownloads.sf.net/wikipedia/mediawiki-1.5beta1.tar.gz?download
Before asking for help, try the FAQ:
http://meta.wikimedia.org/wiki/MediaWiki_FAQ
Low-traffic release announcements mailing list:
http://mail.wikipedia.org/mailman/listinfo/mediawiki-announce
Wiki admin help mailing list:
http://mail.wikipedia.org/mailman/listinfo/mediawiki-l
Bug report system:
http://bugzilla.wikimedia.org/
Play "stump the developers" live on IRC:
#mediawiki on irc.freenode.net
-- brion vibber (brion @ pobox.com)
I've made a special page to rename users; you can find it in
extensions/Renameuser/SpecialRenameuser.php. It only works with the
1.5 schema.
Run it at your own risk;)
Sending this as a separate mail so as not to mess things up :)
At Les Trophées du Libre, MediaWiki won the following prizes:
* a nice trophy, of which I shall upload a picture at some point, and
which will have to go to one of you
* a dual-boot HP laptop. I guess you guys/gals decide what to do with
it :)
* a "pack premium" hosting offer, description at
http://nexenservices.com/product_info.php?set_lang=en&cPath=2&products_id=17
Once again, I guess you decide whether to accept it or not and what to
do with it. I think the offer is for one year (but I can contact
someone to make sure, or to get more details)
* 4 subscriptions to DirectionPHP (
http://www.directionphp.biz/a_la_une.php ).
Nicolas
I have added a new function to Special:Export. Just letting you know, in
case you wonder why I am doing this so shortly before 1.5 goes live:
* The function is dead simple
* It requires $wgAllowArticleList to be set to true, which it isn't by
default
So it shouldn't break anything...
What it does is return a simple XML list of all articles changed at
or after a specific time. Example syntax:
Special:Export&action=list&since=20050625083959
Output looks like this:
<mediawiki version="0.1">
<articlelist>
<article prefixed_title="Main_Page" title="Main_Page"
namespace_id="0" namespace_name=""/>
<article prefixed_title="Help:Contents" title="Contents"
namespace_id="12" namespace_name="Help"/>
</articlelist>
</mediawiki>
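For a mirror, consuming this list is straightforward. Here is a minimal sketch in Python (stdlib only) that parses the example output above and pulls out the changed titles; the element and attribute names are taken from the sample in this mail, not from any released MediaWiki schema:

```python
import xml.etree.ElementTree as ET

# The sample article-list output from Special:Export&action=list above.
sample = """<mediawiki version="0.1">
<articlelist>
<article prefixed_title="Main_Page" title="Main_Page"
namespace_id="0" namespace_name=""/>
<article prefixed_title="Help:Contents" title="Contents"
namespace_id="12" namespace_name="Help"/>
</articlelist>
</mediawiki>"""

def changed_titles(xml_text):
    """Return the prefixed titles of all <article> entries."""
    root = ET.fromstring(xml_text)
    return [a.get("prefixed_title") for a in root.iter("article")]

print(changed_titles(sample))
# ['Main_Page', 'Help:Contents']
```

The prefixed titles are exactly what a mirror would then feed into the normal Special:Export page to fetch the full article text.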
The reasoning behind this is to ease mirror updates, which then won't
have to wait for the next dump. Goes like this:
* Check your mirror database for the latest date any article has
* Get a list of all articles that have changed since that moment
* Get all these articles via the "normal" Special:Export function
* Update your database with these
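The first two steps of that loop can be sketched as follows. This is an illustrative Python fragment, not MediaWiki code: the 14-character timestamp format matches the `since=20050625083959` example above, while the base URL and wiki path are placeholders:

```python
from datetime import datetime, timezone

def mw_timestamp(dt):
    """Format a datetime as MediaWiki's 14-character YYYYMMDDHHMMSS timestamp."""
    return dt.strftime("%Y%m%d%H%M%S")

def list_url(base, since_dt):
    """Build the hypothetical Special:Export list URL for changes at or
    after since_dt, following the example syntax in the mail."""
    return f"{base}/Special:Export&action=list&since={mw_timestamp(since_dt)}"

# Step 1: latest change date found in the mirror database (hard-coded here).
since = datetime(2005, 6, 25, 8, 39, 59, tzinfo=timezone.utc)

# Step 2: the URL to fetch the list of articles changed since that moment.
print(list_url("http://example.org/wiki", since))
# http://example.org/wiki/Special:Export&action=list&since=20050625083959
```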
I'm not sure if we'll have such a function via the SOAP interface soon,
but - so what?
Magnus
Magnus, could you do this as a drop-in extension rather than in the main
SpecialExport code?
* It's standalone
* It's optional
* It doesn't use the defined export format
I've taken it out for the moment, but would love to see it available as
an extension.
A couple notes:
* Since this isn't part of the defined mediawiki export schema, it
should probably not claim to be the same version.
* Probably best to use existing htmlspecialchars() rather than
reimplementing it
* Actually, there's a function wfElement() specifically for generating
XML tags with proper formatting. That might be handy.
* There's no index on page_touched, and no limit on the query; that
could be a serious performance problem if this were used.
* Probably should use the database timestamp() method to format the
timestamp in the query; we'll at some point switch over to using
DATETIME fields rather than the 14-char fields.
-- brion vibber (brion @ pobox.com)
Maybe I am missing some information in this newsgroup, or maybe my
English is too limited to understand it, but I cannot find an answer to
my question: "When can we expect fresh dumps?"
Our mirror meanwhile uses a five-week-old cur table, which seems to be
the most recent one available.
Please give me a hint if I'm wrong, or at least the hope of a fresh dump.
Cheers
jo
Sj:
> What is it that will take a lot of time to update, rewriting the queries?
> There are some SQL-friendly people around here who would like to help with
> that.
I know SQL. Right now the scripts are parsing the dumps directly, doing a
lot of encoding, splitting, sorting and merging to process the files within
cramped memory.
SQL would be much easier. I do however worry about run time. I asked Brion
at Berlin, and he had no idea either.
Heavy ad hoc SQL queries have often been forcefully aborted or forbidden
altogether, no doubt for good reasons, so what can we expect for this
monster job? It already runs for 10 hours on the English Wikipedia alone,
and over 24 hours for all Wikipedias. And parsing the dumps should still
be much faster than SQL, with all its extra I/O. Of course some tests
could shed light on this.
Apart from that, I lack the energy for any serious programming; what
little I have goes to my day job. So I won't work on migrating to SQL or
on writing new encoding, splitting, sorting and merging code anytime
soon. If I do some
perl hacking during my holiday it should be targeted towards EasyTimeline
where I have made promises that are half a year overdue (mainly unicode
support). If someone else would want to step in for wikistats code, that
would of course be great.
Erik Zachte