Dear admins,
Let me insist and ask once more about the status of and plans for the toolserver: starting on Monday, our team will begin a final interim round of programming on the Wikipoint database in our spare time.
So I would really like to avoid redundant work against outdated articles, i.e. those residing in dumps.
For now we plan to use WikiProxy, in the hope that everything works as well as it can. Is that the right approach?
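For illustration, here is a minimal sketch of how we would pull the current wikitext through such a proxy instead of reading a dump. The endpoint URL, parameter names and the helper function are placeholders of mine, not the documented WikiProxy interface:

    // Sketch only: fetch the current wikitext of an article over HTTP
    // instead of reading it from a weeks-old dump. The endpoint and the
    // parameter names are placeholders for the real WikiProxy URL.
    function fetchLiveWikitext( $wiki, $title ) {
        $url = 'http://tools.example.org/WikiProxy.php'   // placeholder endpoint
             . '?wiki=' . urlencode( $wiki )
             . '&title=' . urlencode( $title );
        $text = file_get_contents( $url );                // plain HTTP GET (needs allow_url_fopen)
        return ( $text === false ) ? null : $text;
    }

    // e.g. the current text of the German article on Berlin:
    $wikitext = fetchLiveWikitext( 'de.wikipedia.org', 'Berlin' );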
-- Stefan
P.S. There is yet another potential use of our Wikipoint service coming up: there is a proposal for a 'wiki safari across Germany', whose goal is to take pictures for georeferenced articles that do not yet have one; cf. http://de.wikipedia.org/wiki/Wikipedia_Diskussion:Bilderw%C3%BCnsche. If the participants plan on the basis of a dump, which is always a few weeks old, they will probably end up gathering photos in places where someone has already been...
-----Original Message-----
From: toolserver-l-bounces@Wikipedia.org [mailto:toolserver-l-bounces@Wikipedia.org] On Behalf Of Leo Büttiker
Sent: Wednesday, 22 March 2006 22:04
To: toolserver-l@wikipedia.org
Subject: [Toolserver-l] Troubles with reading Articles
Hi all,
For a toolserver project I want to read all Wikipedia (pwiki_de) articles and parse them for geoinformation. After some trouble I have now fixed nearly all bugs, but I still have problems opening the articles.
I open the article with the help of the MediaWiki functions in the following way:

    // look up the title by page ID, then load the article and its text
    $title = Title::newFromID( $page_id );
    $art   = new Article( $title );
    $text  = $art->getContent( true );
For some articles this works quite well, but for others it doesn't return any text. I think there is a problem with the compression of the database (in a local environment with a Wikipedia dump it works), but I couldn't find a workaround. Any suggestions?
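One possible workaround (just a sketch, not something I have verified): if getContent() comes back empty, fall back to fetching the raw wikitext from the live wiki via index.php?action=raw:

    // Load the article locally first (same as above).
    $title = Title::newFromID( $page_id );
    $art   = new Article( $title );
    $text  = $art->getContent( true );

    // Fallback: if no text came back, fetch the current wikitext over HTTP.
    // Requires allow_url_fopen; the URL assumes the standard
    // index.php?action=raw interface of the live wiki.
    if ( $text === false || $text === '' ) {
        $url  = 'http://de.wikipedia.org/w/index.php?title='
              . urlencode( $title->getPrefixedDBkey() )
              . '&action=raw';
        $text = file_get_contents( $url );
    }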
Thanks,
Leo