hello,
the account expiration date was originally scheduled for April 1st, but has
been extended to May 1st. on that date, all accounts will expire (and no
longer be usable) except those whose expiration date has been extended.
if you have an account, and you would like to keep it:
- if you have one or more working projects, please describe these (preferably
with examples, URLs, etc.)
- if you do not yet have anything ready (particularly if you're a new user),
please describe what you intend to work on. a rough estimate of when you
expect it to be ready would be useful. if some issue is holding you up
(e.g. lack of text access), please mention that.
if you no longer wish to use your account, please say so.
this information should be mailed to <dab@daniel.baur4.info> and cc'd to
<zedler-admins@wikimedia.org>. (there's no particular deadline, but if you
wait until one day before the expiration, you might find that your account
expires because no-one managed to look at it yet...)
k.
Interiot and pgk were hanging around on IRC complaining about toolserver
lag, so I gave them both PROCESS access in MySQL. This allows them to
see what queries are running at any given time. I'm hoping that they
will terrorise anyone responsible for running inefficient lag-inducing
read queries, and thereby improve the situation.
-- Tim Starling
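With the PROCESS privilege, a MySQL user can run SHOW FULL PROCESSLIST and see every query currently executing, along with how long each has been running. A minimal Python sketch of how one might pick out the slow read queries from such a listing; the function name, threshold, and sample rows are illustrative, not part of the original mail (the commented-out connection step assumes the MySQLdb module):

```python
# Hedged sketch: with the PROCESS privilege one can run
# SHOW FULL PROCESSLIST and look for long-running read queries.
# Connecting would be something like (assumes MySQLdb):
#   import MySQLdb
#   cur = MySQLdb.connect(...).cursor(MySQLdb.cursors.DictCursor)
#   cur.execute("SHOW FULL PROCESSLIST"); rows = cur.fetchall()
# The filtering below is plain Python over the returned rows.

def long_running(processlist, threshold_secs=60):
    """Return the queries that have been running longer than threshold_secs."""
    return [
        row for row in processlist
        if row["Command"] == "Query" and row["Time"] >= threshold_secs
    ]

# Rows shaped the way SHOW FULL PROCESSLIST reports them (illustrative data):
rows = [
    {"Id": 1, "Command": "Sleep", "Time": 300, "Info": None},
    {"Id": 2, "Command": "Query", "Time": 95,  "Info": "SELECT ... FROM revision"},
    {"Id": 3, "Command": "Query", "Time": 2,   "Info": "SELECT 1"},
]
print(long_running(rows))  # only the 95-second query
```

Anyone found this way running a lag-inducing query can then be asked (or terrorised) to fix it.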
Hi!
As you all may have noticed, the replication lag has grown higher and
higher over the last few days: "Replication lag is currently 1 day, 22
hours."
Why is the replag so high?
Greets,
Marco
Dear admins,
Let me insist and ask again about the status and plans of the toolserver: starting on Monday, our team will begin a final interim round of programming on the Wikipoint database in our spare time.
So I would really like to avoid redundant work against outdated articles (those residing in dumps).
For now, we will use WikiProxy in the hope that everything works as well as it can, right?
-- Stefan
P.S. There is yet another potential use of our Wikipoint service coming up: there is a proposal for a 'wiki safari across Germany', whose goal is to take pictures for georeferenced articles that do not yet have one; cf. http://de.wikipedia.org/wiki/Wikipedia_Diskussion:Bilderw%C3%BCnsche. If they planned on the basis of a dump, which is always a few weeks old, they would probably end up taking photos in places someone has already covered...
-----Original Message-----
From: toolserver-l-bounces@Wikipedia.org [mailto:toolserver-l-bounces@Wikipedia.org] On Behalf Of Leo Büttiker
Sent: Wednesday, 22 March 2006 22:04
To: toolserver-l@wikipedia.org
Subject: [Toolserver-l] Troubles with reading Articles
Hi all,
For a toolserver project I want to read all German Wikipedia (pwiki_de) articles and
parse them for geoinformation. After some trouble I have now fixed nearly all
of the bugs, but I still have problems opening the articles.
I open each article with the help of the MediaWiki functions in the following
way:
$title = Title::newFromID($page_id);
$art = new Article($title);
$text = $art->getContent(true);
For some articles this works quite well, but for some it doesn't return any text. I
think there's a problem with the compression of the database (in a local
environment with a Wikipedia dump it works), but I couldn't find a
workaround. Any suggestions?
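One workaround, when the MediaWiki-internal path fails for some articles, is to fetch the wikitext over HTTP instead (the approach WikiProxy takes). A minimal Python sketch using MediaWiki's standard `index.php?action=raw` endpoint; the helper name is hypothetical and not part of the original mail:

```python
from urllib.parse import quote

def raw_article_url(title: str, host: str = "de.wikipedia.org") -> str:
    """Build the URL that returns an article's raw wikitext.

    Relies on MediaWiki's standard action=raw endpoint; the helper
    name is an assumption for illustration, not part of the original mail.
    """
    return "http://%s/w/index.php?title=%s&action=raw" % (host, quote(title))

# Fetching the text would then be e.g.:
#   import urllib.request
#   text = urllib.request.urlopen(raw_article_url("Berlin")).read()

print(raw_article_url("Berlin"))
```

This sidesteps the database's text-blob compression entirely, at the cost of one HTTP request per article.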
Thanks
Leo
_______________________________________________
Toolserver-l mailing list
Toolserver-l@Wikipedia.org
http://mail.wikipedia.org/mailman/listinfo/toolserver-l