On 18-11-2011 (Fri) at 20:33 +1100, John Vandenberg wrote:
On Thu, Nov 17, 2011 at 6:40 AM, Ariel T. Glenn ariel@wikimedia.org wrote:
On 12-11-2011 (Sat) at 00:31 +1100, John Vandenberg wrote:
On Fri, Nov 11, 2011 at 11:18 PM, emijrp emijrp@gmail.com wrote:
Forwarding...
---------- Forwarded message ----------
From: emijrp emijrp@gmail.com
Date: 2011/11/11
Subject: Old English Wikipedia image dump from 2005
To: wikiteam-discuss@googlegroups.com
Hi all;
I want to share with you this Archive Team link[1]. It is an old English Wikipedia image dump from 2005. Probably one of the last ones before the Wikimedia Foundation stopped publishing image dumps. Enjoy.
Regards, emijrp
[1] http://www.archive.org/details/wikimedia-image-dump-2005-11
People interested in image dumps may also be interested in my post about the GFDL requirements, which I think mean that images need to be included in the dumps.
https://meta.wikimedia.org/w/index.php?title=Talk:Terms_of_use&diff=prev...
excerpt:
"..the [GFDL] license requires that someone can download a ''complete'' Transparent copy for one year after the last Opaque copy is distributed. As a result, I believe the BoT needs to ensure that the dumps are available ''and'' that they can be available for one year after WMF turns off the lights on the core servers (it allows 'agents' to provide this service). As Wikipedia contains images, the images are required to be included. .."
discussion continues ..
https://meta.wikimedia.org/wiki/Talk:Terms_of_use#Right_to_Fork
I would read this as requiring access to the images to remain available, not necessarily in dump form.
I don't believe that is the case. The GFDL, like the GPL, requires that it be possible to rebuild the product from the distributed source, minus any separately distributed dependencies.
It is necessary to provide a simple mechanism for reliably downloading the images used on each project, and for incorporating all of the dumps needed to regenerate a replica of each project.
The 'source' can be broken into chunks, but it would be obviously contrary to the spirit of the license to require that each and every image be downloaded individually.
There are scripts to download all media used on a project ( http://meta.wikimedia.org/wiki/Wikix ). As long as the end user runs one command, it doesn't matter what's happening on the back end.
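The "one command" approach can be sketched against the public MediaWiki API. The `list=allimages` query and its parameters below are real API features, but this script is only an illustrative sketch (it is not Wikix), with error handling and resumption omitted:

```python
# Illustrative sketch (not Wikix): enumerate and fetch a wiki's images
# via the MediaWiki API's list=allimages query, following pagination.
import json
import os
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"  # any MediaWiki API endpoint

def allimages_url(api, cont=None, limit=50):
    """Build an allimages query URL, optionally continuing a prior page."""
    params = {
        "action": "query",
        "list": "allimages",
        "aiprop": "url",
        "ailimit": str(limit),
        "format": "json",
    }
    if cont:
        params["aicontinue"] = cont
    return api + "?" + urllib.parse.urlencode(params)

def mirror_images(api, dest="images"):
    """Walk the full image list page by page and download each file."""
    os.makedirs(dest, exist_ok=True)
    cont = None
    while True:
        with urllib.request.urlopen(allimages_url(api, cont)) as resp:
            data = json.load(resp)
        for img in data["query"]["allimages"]:
            path = os.path.join(dest, img["name"])
            urllib.request.urlretrieve(img["url"], path)
        cont = data.get("continue", {}).get("aicontinue")
        if not cont:
            break
```

The same loop works against any project's `api.php`, which is what makes a single-command mirror feasible regardless of how the files are stored on the back end.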
_and_ it needs to be possible for any consumer to perform the task of obtaining the source. Does the WMF block people who attempt to mirror the project content one item at a time? IMO blocking them is very sane, but if that is the only way to obtain the source, then it would again be breaking the license.
AFAIK we do not block folks who are making serial requests, even if they crawl the entire media space. Serial requests don't incur a big cost on our servers.
InstantCommons means that those images don't need to be redistributed in order for the projects to be compliant with the GFDL.
-- John Vandenberg
However, I would be happier if we had full media mirrors hosted by other folks (and they could also provide packages of groups of files for download).
Ariel