I run http://fixedreference.org/ (where we've put static periodic copies of some of the WPs, e.g. http://july.fixedreference.org/fr/20040727/wikipedia/Accueil etc.) and I have received a request from a charity which distributes 1500 refurbished computers a year to low-income families in New York for a static version of Wikipedia to put on the computers they distribute.
I don't have an offline static version. I guess it would take a few days in Perl to write code for an offline static version, but I'm too busy at present. Anyone prepared to help? I am user BozMo on en.
Oh, and you posted a request for people to help with static copies. I'm happy to help if you know what you want. At present I put up monthly copies on FR of whichever WPs people have asked me for.
BozMo (aka Andrew Cates)
On Mon, 9 Aug 2004, Andrew Cates wrote:
I don't have an offline static version. I guess it would take a few days in Perl to write code for an offline static version, but I'm too busy at present. Anyone prepared to help? I am user BozMo on en.
You seem to have used an (improved) version of my script to make those pages. Depending on how long ago you downloaded it, there's a better version at http://www.tommasoconforti.com/wiki along with static archives in many languages. The English version still isn't there, but it will be in a couple of days.
Alfio
For anyone interested, I'm quoting below a message from Alfio Puglisi that details the process of making an offline static version.
Alfio Puglisi wrote:
The script used to make the html version is here:
http://www.tommasoconforti.com/wiki/ (bottom of the page)
I used Cygwin under Win2k, but it should also work with Linux. To run the script you'll need to study the various command-line options a bit. Here's what I normally use:
perl wiki2static.pl -m3 -t3 -a --prefix=dump/ --language="en" 20040802.sql
where the last argument is the "cur" SQL dump from download.wikimedia.org. You'll also need the texvc program for TeX math generation.
This will *download* each and every image from the Wikimedia servers, so it will probably take hours, if not days. Subsequent runs are faster because they only download new images.
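To put the steps together, here's a minimal sketch of a complete run, assuming the "cur" dump has already been downloaded from download.wikimedia.org as 20040802.sql.gz and that perl and texvc are on your PATH (the filenames are illustrative, and the -m, -t and -a switches are documented by the script itself):

# unpack the "cur" SQL dump fetched from download.wikimedia.org
gzip -d 20040802.sql.gz

# build the static HTML tree under dump/ for the English Wikipedia;
# the first run also downloads every referenced image from the
# Wikimedia servers, so expect it to take a long time
perl wiki2static.pl -m3 -t3 -a --prefix=dump/ --language="en" 20040802.sql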
Hope that helps.
Cheers, Ivan.