On Wed, 15 Jun 2005, Kim Tucker wrote: (mailing list copied)
Hi Andy, How is everything?
Let's start with the big questions :-) I'm currently working on an easily installable "wizzy server" that does email in an occasionally-connected location. Mostly via dialup these days, but I hope to do it via USB memory stick as well.
Are you by any chance going to WCCE? (we have exhibition space and a demo slot).
Looking at the dates, no :-(
We are preparing a box of CDs to distribute and considered including extracts of Wikipedia.
Do you have a subset which would fit on one or at most two CDs, which could be distributed without risk of bad content inside - and ideally with relevant content which would be of use even to people in Africa?
I carry wikipedia around on an external hard drive, for installation at schools.
Capacity quickly whizzed past a single DVD last year...
I have been deleting my 2004 copies (which would fit your criteria) - maybe I shouldn't. However, old dumps are fine - just as good as today's dumps, given that I haven't even read the old one yet.
I think the 'bad content' (and 2 CD) criteria could be easily satisfied with a text-only recent dump.
Text-only carries everything, but lacks pizazz (and porn).
But that is what I recommend. Any other suggestions should be sent to Kim directly.
Cheers, Andy!
Not sure exactly where this mail comes from (/me is perplexed)... but a text-only encyclopedia, though safer, is a bit restricted :-(
I would actually prefer that we set up a system with tagged pages rather than strip all images :-( but well... it does take less room
What is a "wizzy" server? ant
Not sure exactly where this mail comes from (/me is perplexed)... but a text-only encyclopedia, though safer, is a bit restricted :-(
For text-only: check out wik2dict and dict. dict is a client/server dictionary-lookup protocol, and wik2dict converts MediaWiki SQL dumps into the (gzipped) dict format. Usually the result takes up less space than the *table.sql.gz that is downloaded.
I'm currently working on an HTTP gateway to dict that is friendlier than the one running at dict.org. I'll probably install it on a Geekcorps server, together with the Bambara and French Wikipedia-dicts. I think it would be cool if a Wikimedia server could be used to serve dict stuff. It's much less resource-intensive, and it would be a nice backup for when the real servers are too slow. It also queries all the different dicts at once (i.e. Wikipedias, Wiktionaries, Wikibooks in many languages). I'll post the link to htdict once it's finished :)
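For anyone who hasn't looked at the dict protocol before, a single lookup is roughly the exchange sketched below. This is only a minimal illustration in Python; the server name and database name are placeholders, not any real Wikimedia or Geekcorps service.

    # Minimal DICT-protocol (RFC 2229) client sketch. The host and database
    # names below are placeholders for illustration, not real services.
    import socket

    def dict_define(host, database, word, port=2628):
        """Send one DEFINE command to a dict server and return the definition text."""
        with socket.create_connection((host, port)) as sock:
            f = sock.makefile("rw", encoding="utf-8", newline="")
            f.readline()                      # 220 greeting banner from the server
            f.write("DEFINE %s %s\r\n" % (database, word))
            f.flush()
            body = []
            in_text = False
            while True:
                raw = f.readline()
                if not raw:
                    break                     # connection closed unexpectedly
                line = raw.rstrip("\r\n")
                if in_text:
                    if line == ".":
                        in_text = False       # end of one definition body
                    else:
                        body.append(line)
                elif line.startswith("151"):
                    in_text = True            # a definition body follows
                elif line.startswith("250") or line.startswith("5"):
                    break                     # 250 = done, 5xx = error or no match
            try:
                f.write("QUIT\r\n")
                f.flush()
            except OSError:
                pass                          # server already hung up
            return "\n".join(body)

    # Example use, with a made-up server and database name:
    print(dict_define("dict.example.org", "wikipedia_fr", "Bamako"))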
I would actually prefer that we set up a system with tagged pages rather than strip all images :-( but well... it does take less room
I've actually been thinking about doing something like Tomeraider does. Maybe there is more interest in this. I think it's ugly that people are kind of forced to use a proprietary format to have Wikipedia on their PDAs.
On Thu, 16 Jun 2005, Guaka wrote:
Not sure exactly where this mail comes from (/me is perplexed)...
(sorry, dates seemed tight)
but a text-only encyclopedia, though safer, is a bit restricted :-(
Often people are looking for guarantees, especially around (provable) CD distributions.
Text guarantees no bad pictures (whatever they are).
For text-only: check out wik2dict and dict. dict is a client/server dictionary-lookup protocol. wik2dict converts MediaWiki SQL dumps into the (gzipped) dict format.
I would actually prefer that we set up a system with tagged pages rather than strip all images :-( but well... it does take less room
I've actually been thinking about doing something like Tomeraider does. Maybe there is more interest in this. I think it's ugly that people are kind of forced to use a proprietary format to have Wikipedia on their PDAs.
I bought the memory stick for my PDA (for Wikipedia) and read the howto page. I could do it myself, but maybe it is worth the money to buy Tomeraider's solution.
On another note, we ( http://www.slug.org.za/ ) installed Linux Thin Client labs in ten schools today, which included a March snapshot of en wikipedia (including pictures).
This was mostly possible because the 15 Gig required (pictures) could be carried to the school on a hard drive (the server itself, before installation).
These are for schools presently without Internet access.
Cheers, Andy!
I bought the memory stick for my PDA (for Wikipedia) and read the howto page. I could do it myself, but maybe it is worth the money to buy Tomeraider's solution.
Could it ever be worth the money when it's poor people who are paying? I think Tomeraider is definitely not the way to go. It locks you in, and it's a crime against humanity to have African people (or countries) pay for software licenses - agricultural goods have to be exported to make up for it.
...I loved reading Brazil's stance on this: the US is losing 1 billion US$ to unlicensed software in Brazil, and Brazil is losing 1 billion US$ paying software licenses to the US. So the Brazilians say: if we stop using proprietary software, both we and the US save 1 billion US$.
It's about time to have a non-proprietary solution to carry Wikipedia on a PDA.
On another note, we ( http://www.slug.org.za/ ) installed Linux Thin Client labs in ten schools today, which included a March snapshot of en wikipedia (including pictures).
This was mostly possible because the 15 Gig required (pictures) could be carried to the school on a hard drive (the server itself, before installation).
That's nice. Is that 15 GB the entire Commons? Or just the pix from the English Wikipedia?
On Fri, 17 Jun 2005, Guaka wrote:
On another note, we ( http://www.slug.org.za/ ) installed Linux Thin Client labs in ten schools today, which included a March snapshot of en wikipedia (including pictures).
This was mostly possible because the 15 Gig required (pictures) could be carried to the school on a hard drive (the server itself, before installation).
That's nice. Is that 15 GB the entire Commons? Or just the pix from the English Wikipedia?
It is pictures from en wikipedia.
http://download.wikimedia.org/images/ (or a version thereof at the time) used to have the picture dump.
I have been too busy recently to chase up a more recent dump. And I need to figure out the 'commons' thing for my downloads.
However, the MediaWiki 1.5 release, and the attendant database changes, mean that I am waiting for the dust to settle before spending time getting a new snapshot.
The old one (March) is so great anyway (for folks that do not even have internet) that I am hanging with that for a while.
I tart it up (for schools that dial up overnight) with today's Wikipedia front page, edited to link to their snapshot.
An example can be seen at http://esangweni.shacknet.nu/ - a school that has a wireless (24hour) connection.
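The front-page trick itself is roughly the following - only a sketch, with an example source URL and example local paths rather than the actual script I run:

    # Rough sketch of the front-page trick: fetch today's Main Page and point
    # its article links at the local snapshot. URL and paths are examples only.
    import re
    import urllib.request

    LIVE_MAIN_PAGE = "http://en.wikipedia.org/wiki/Main_Page"   # example source page
    LOCAL_ARTICLE_BASE = "/wiki/snapshot/"                       # example snapshot path
    OUTPUT_FILE = "/var/www/html/index.html"                     # example web root

    req = urllib.request.Request(LIVE_MAIN_PAGE,
                                 headers={"User-Agent": "offline-snapshot-script"})
    html = urllib.request.urlopen(req).read().decode("utf-8")

    # Rewrite href="/wiki/Article" links so they resolve against the local snapshot.
    html = re.sub(r'href="/wiki/', 'href="%s' % LOCAL_ARTICLE_BASE, html)

    with open(OUTPUT_FILE, "w", encoding="utf-8") as out:
        out.write(html)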
Cheers, Andy!
It would be an interesting idea to have an offline Wikipedia for schools in which the content and pictures are updated overnight, so that it's refreshed by the morning. That'd be nice to do for every Wikipedia... is there any program that could be set to download specified wikis, with settings per wiki?
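Something along those lines might be as simple as the sketch below, run from an overnight cron job - the dump URLs and destination paths are made up for illustration, and real dumps would still need unpacking and importing as an extra step:

    # Sketch of a nightly per-wiki updater. Dump URLs and destinations are
    # made-up examples; run it from cron so the local copy is fresh by morning.
    import urllib.request
    from pathlib import Path

    WIKIS = {
        # per-wiki settings: where to fetch the dump and where to store it
        "en": {"dump": "http://dumps.example.org/en/pages.xml.gz", "dest": "/srv/wiki/en"},
        "fr": {"dump": "http://dumps.example.org/fr/pages.xml.gz", "dest": "/srv/wiki/fr"},
    }

    def refresh(name, settings):
        dest = Path(settings["dest"])
        dest.mkdir(parents=True, exist_ok=True)
        target = dest / "pages.xml.gz"
        urllib.request.urlretrieve(settings["dump"], str(target))  # overwrite last night's copy
        print("updated", name, "->", target)

    if __name__ == "__main__":
        for name, settings in WIKIS.items():
            refresh(name, settings)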
Nice.
James