[Translated from Spanish] Greetings to everyone. What is ondbzip2? Could this be translated better, please? Thank you. Mr. Serguey, could you please help with the translations? I have read everything, and it seems interesting to me. Dr. Juan Cesar Martinez
----- Original Message ----- From: xmldatadumps-l-request@lists.wikimedia.org To: xmldatadumps-l@lists.wikimedia.org Sent: Monday, October 04, 2010 2:21 PM Subject: Xmldatadumps-l Digest, Vol 9, Issue 1
Send Xmldatadumps-l mailing list submissions to xmldatadumps-l@lists.wikimedia.org
To subscribe or unsubscribe via the World Wide Web, visit https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l or, via email, send a message with subject or body 'help' to xmldatadumps-l-request@lists.wikimedia.org
You can reach the person managing the list at xmldatadumps-l-owner@lists.wikimedia.org
When replying, please edit your Subject line so it is more specific than "Re: Contents of Xmldatadumps-l digest..."
Today's Topics:
- Re: Dumps, dumps, dumps (Jamie Morken)
- testing one phase at a time parallelized en wiki dumps (Ariel T. Glenn)
- Enwiki stopped (Andreas Meier)
- Re: Enwiki stopped (Ariel T. Glenn)
- Domas visits logs (emijrp)
- Re: Domas visits logs (Ariel T. Glenn)
- dataset1 maintenance Sat Oct 1 (dumps unavailable) (Ariel T. Glenn)
- Re: dataset1 maintenance Sat Oct 1 (dumps unavailable) (Ariel T. Glenn)
- posting (yllaermdm27@gmail.com)
- Re: dataset1 maintenance Sat Oct 1 (dumps unavailable) (emijrp)
Message: 1 Date: Sun, 15 Aug 2010 09:13:23 -0700 From: Jamie Morken jmorken@shaw.ca Subject: Re: [Xmldatadumps-l] Dumps, dumps, dumps To: "Ariel T. Glenn" ariel@wikimedia.org Cc: xmldatadumps-l@lists.wikimedia.org Message-ID: cdc2e0a421fc9.4c67afb3@shaw.ca Content-Type: text/plain; charset="iso-8859-1"
Hi,
----- Original Message ----- From: "Ariel T. Glenn" ariel@wikimedia.org Date: Sunday, August 15, 2010 12:15 am Subject: Re: [Xmldatadumps-l] Dumps, dumps, dumps To: Jamie Morken jmorken@shaw.ca Cc: emijrp emijrp@gmail.com, xmldatadumps-l@lists.wikimedia.org
Images take up 8T or more these days (of course that includes deleted files and earlier versions, but those aren't the bulk of it). Hosting 8T tarballs seems out of the question... who would download them anyway?
Having said that, hosting small subsets of images is quite possible and is something that has been discussed in the past. I would love to hear which subsets of images people want and would actually use.
There is the script wikix that people have used to manually download
images from wikis:
http://meta.wikimedia.org/wiki/Wikix
It generates a list of all the images in an XML dump and then downloads
them. The only thing missing is image scaling; without it, the enwiki image dump will be too large for most people to use right now. ImageMagick (http://en.wikipedia.org/wiki/ImageMagick) could be used to scale the various image formats down to smaller sizes.
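As a rough sketch of the first step wikix performs (this is an approximation, not wikix itself), the image titles can be pulled out of a dump with standard tools. The dump filename and output file below are placeholders, and GNU grep is assumed; pipe through bzcat first for .bz2 dumps:

```shell
#!/bin/sh
# Approximation of wikix's list-building step: extract image titles
# from a MediaWiki XML dump. 'pages-articles.xml' is a placeholder.
DUMP="${1:-pages-articles.xml}"

# Match [[File:...]] and [[Image:...]] links, strip the markup, and
# keep the unique file titles (everything before the first | or ]]).
grep -o '\[\[\(File\|Image\):[^]|]*' "$DUMP" \
    | sed 's/^\[\[\(File\|Image\)://' \
    | sort -u > image_list.txt
```

The resulting image_list.txt would then drive the actual downloads, which is the part wikix automates.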
Here's a script snippet I found that uses it in the bash shell:

#!/bin/sh
find /media/SHAWN\ IPOD/Songs/ -iname "*.png" | while read file; do
    convert -size 75x75 "$file" -resize 100x100 "cover.bmp"
    cp cover.bmp "${file%/*}"/.
done
If the Wikimedia Foundation provides a dump of images, I think people will find
interesting ways to use them. Dumps of enwiki images with a max size of 640x480 or 800x600, and also enwiki thumbnails, are the two subsets I think would be most valuable.
cheers, Jamie
Ariel