[Wikipedia-l] Re: Snapshot wikipedia installations in schools
Magnus Manske
magnus.manske at web.de
Wed Jun 9 07:58:44 UTC 2004
Andy Rabagliati wrote:
>>You can export the main page of en: by using
>>http://en.wikipedia.org/wiki/Special:Export (you might need to be logged
>>in). In the field, just request "Main Page". It will output an XML feed
>>that can then be imported into your local wiki.
>>For import, you can either use the Special:Import page (available to
>>sysops only; might not be available in 1.2.x versions) or build a little
>>parser that will read the XML and put it in the database.
>
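The export/parse half of that "little parser" could be sketched like this. It is a minimal sketch, not tested against every MediaWiki version: the `Special:Export/{title}` URL form is standard MediaWiki behavior, but the XML namespace varies between releases, so tags are matched by local name only.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Standard MediaWiki export URL form; works for other wikis too.
EXPORT_URL = "http://en.wikipedia.org/wiki/Special:Export/{title}"

def localname(tag):
    """Strip the {namespace} prefix ElementTree puts on tag names."""
    return tag.rsplit('}', 1)[-1]

def parse_export(xml_text):
    """Extract (title, wikitext) pairs from Special:Export XML.

    Matching by local name means the export namespace version
    (export-0.1, export-0.2, ...) does not matter.
    """
    pages = []
    for page in ET.fromstring(xml_text).iter():
        if localname(page.tag) != 'page':
            continue
        title = text = None
        for el in page.iter():
            if localname(el.tag) == 'title':
                title = el.text
            elif localname(el.tag) == 'text':
                text = el.text or ''
        pages.append((title, text))
    return pages

def fetch_page(title):
    """Download the export XML for one page (network access required)."""
    url = EXPORT_URL.format(title=title.replace(' ', '_'))
    with urllib.request.urlopen(url) as f:
        return f.read().decode('utf-8')
```

For the UUCP setup below, `fetch_page("Main Page")` would run on the connected end and the resulting XML be shipped to the school.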
>For this to work for me, it has to be fully automatable, over UUCP.
>
>Is there a script that will take the XML export and incorporate it
>into the MySQL database, so the MediaWiki front end will see it?
>
>Then a cron job on my end pulls the XML and sends it to the school
>via UUCP, which feeds it on stdin to a script, and the wiki is
>magically updated every day.
>
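The stdin-to-database script described above could look roughly like this. The `cur` table and its column names follow the MediaWiki 1.2-era schema and are an assumption (later versions split this into page/revision tables), and the MySQL credentials are placeholders:

```python
import sys
import xml.etree.ElementTree as ET

# Assumed MediaWiki 1.2-era schema: one row per page in ``cur``.
# Check your own database before running this for real.
REPLACE_SQL = (
    "REPLACE INTO cur (cur_namespace, cur_title, cur_text, cur_timestamp) "
    "VALUES (%s, %s, %s, %s)"
)

def localname(tag):
    """Strip the {namespace} prefix ElementTree puts on tag names."""
    return tag.rsplit('}', 1)[-1]

def export_to_rows(xml_text, timestamp):
    """Turn Special:Export XML into (ns, title, text, timestamp) rows."""
    rows = []
    for page in ET.fromstring(xml_text).iter():
        if localname(page.tag) != 'page':
            continue
        title = text = None
        for el in page.iter():
            if localname(el.tag) == 'title':
                title = el.text
            elif localname(el.tag) == 'text':
                text = el.text or ''
        # Database titles use underscores, not spaces; namespace 0 = article.
        rows.append((0, title.replace(' ', '_'), text, timestamp))
    return rows

def main():
    import MySQLdb  # placeholder credentials -- adjust for your wiki
    db = MySQLdb.connect(user='wikiuser', passwd='secret', db='wikidb')
    rows = export_to_rows(sys.stdin.read(), '20040609000000')
    db.cursor().executemany(REPLACE_SQL, rows)
    db.commit()

if __name__ == '__main__':
    main()
```

UUCP would then invoke it as `import_pages.py < export.xml` (or equivalent) on the school machine.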
I am currently working on Window$ syncing software. It will be able to
read the XML created by [[Special:Export]]. All I need on the Wikipedia
end is a function that returns a plain-text list of all articles that
have been edited since a given date. A single, simple SQL query on the
Wikipedia server will do.
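That "single, simple SQL query" could look something like the sketch below. The `cur` table and `cur_timestamp` column are assumptions based on the MediaWiki 1.2-era schema; the 14-digit YYYYMMDDHHMMSS string is MediaWiki's timestamp format.

```python
from datetime import datetime

# Assumed MediaWiki 1.2-era schema; namespace 0 restricts to articles.
CHANGED_SINCE_SQL = (
    "SELECT cur_title FROM cur "
    "WHERE cur_namespace = 0 AND cur_timestamp >= %s"
)

def mw_timestamp(dt):
    """Format a datetime as MediaWiki's 14-digit YYYYMMDDHHMMSS string."""
    return dt.strftime('%Y%m%d%H%M%S')
```

The server-side script would run `CHANGED_SINCE_SQL` with `mw_timestamp(last_sync)` as the parameter and print one title per line.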
Magnus