On 15 August 2013 12:27, Amir Ladsgroup <ladsgroup@gmail.com> wrote:
> I asked about a solution of the 200M everywhere. The best option is cloning shallowly.


No, it's not. After running git gc, I can re-clone the existing repository while transferring only ~8 MB:

valhallasw@lisilwen:~/src/pywikibot-compat$ git daemon --reuseaddr --base-path=. --export-all --verbose &
valhallasw@lisilwen:~/src/pywikibot-compat$ cd ..
valhallasw@lisilwen:~/src$ git clone git://localhost/ pwb2
Cloning into 'pwb2'...
[6373] Connection from 127.0.0.1:36357
[6373] Extended attributes (16 bytes) exist <host=localhost>
[6373] Request upload-pack for '/'
remote: Counting objects: 37384, done.
remote: Compressing objects: 100% (10563/10563), done.
[6373] Disconnected
remote: Total 37384 (delta 26576), reused 37384 (delta 26576)
Receiving objects: 100% (37384/37384), 7.96 MiB, done.
Resolving deltas: 100% (26576/26576), done.

So it should be possible to solve this issue on the Wikimedia end: downloading ~8 MB sounds very reasonable to me.
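For illustration, a minimal sketch of the server-side fix being suggested: running git gc repacks loose objects into a single pack file, which is what a fresh clone then transfers. (This uses a hypothetical throwaway repository, not pywikibot-compat itself; all flags are standard git options.)

```shell
#!/bin/sh
set -e
# Sketch: create a throwaway repo, commit once, then repack with git gc.
# After gc, `git count-objects -v` reports everything in a single pack,
# which is the data a full clone would actually fetch.
dir=$(mktemp -d)
cd "$dir"
git init -q repo
cd repo
git config user.email "demo@example.com"   # hypothetical identity for the demo
git config user.name "demo"
echo "hello" > f.txt
git add f.txt
git commit -qm "initial commit"
git gc --quiet
git count-objects -v
```

On a large repository with years of loose objects, the difference between the pre-gc and post-gc pack size is exactly the saving shown in the clone transcript above.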


Merlijn