Gentlemen, I am very thankful for http://www.mediawiki.org/wiki/Download_from_Git however it assumes one wants to 'get involved' with MediaWiki, whereas all I want to do is 'rsync snapshots' of the same files 'that the tarballs contain'. I.e., I just want to, oh, every few days, update (overwrite in place) my running copy of MediaWiki. No need for the history etc. that a 'clone' pulls in, or that further 'pull's refresh. (The disk space I use should not grow by a single bit between updates unless you upstream added some lines to a file.)
So it would be nice if someone mentioned the commands necessary, so I can add them to that article. Let's pretend we are starting fresh (and not migrating from SVN as in http://lists.wikimedia.org/pipermail/wikitech-l/2012-February/058190.html ).
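For what it's worth, a minimal sketch of the kind of update loop being asked for, assuming the server permits shallow clones (the flags and URL here are illustrative, not something the article documents):

# One-time: grab only the newest snapshot, with no history behind it.
$ git clone --depth 1 https://gerrit.wikimedia.org/r/p/mediawiki/core.git mediawiki

# Every few days: refresh in place, keeping the history cut off at one commit.
$ cd mediawiki
$ git fetch --depth 1
$ git reset --hard origin/master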
https://www.mediawiki.org/wiki/Download_from_Git#Latest_development_version_... should behave the same as updating from SVN did in the past (as far as I understand).
On Mon, 26 Mar 2012 23:56:58 -0700, jidanni@jidanni.org wrote:
> Gentlemen, I am very thankful for http://www.mediawiki.org/wiki/Download_from_Git however it assumes one wants to 'get involved' with MediaWiki, whereas all I want to do is 'rsync snapshots' of the same files 'that the tarballs contain'. [...]
99M  git.git      # A --bare clone
167M git-clean    # A simple clone of master
215M git-fetch    # A clone with a full fetch of the history of all branches
68M  git-linked   # A clone linked to git-clean
128M git-shallow  # A --depth=5 clone
147M svn          # A SVN checkout
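For reference, the variants above could plausibly have been produced along these lines (the repository URLs are assumptions, only the labels come from the listing, and exactly what the "full fetch" variant fetched is a guess):

$ git clone --bare https://gerrit.wikimedia.org/r/p/mediawiki/core.git git.git
$ git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git git-clean
$ git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git git-fetch
$ (cd git-fetch && git fetch origin '+refs/*:refs/remotes/origin/*')  # guess: mirror every ref the server offers
$ git clone /abs/path/to/git-clean git-linked   # local clone: shares objects via hard links
$ git clone --depth=5 https://gerrit.wikimedia.org/r/p/mediawiki/core.git git-shallow
$ svn checkout http://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3 svn
$ du -sh git.git git-clean git-fetch git-linked git-shallow svn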
Your concerns about space aren't really warranted. For something that keeps the entire source history, git is very efficient at storing it. A basic checkout (git-clean) is only 20MB more than an SVN checkout.
To top it off, git supports bare repos (repos with only history and no working copy) and linking one repo to another. I.e., if you `git clone /abs/path/to/other/git/repo newrepo`, git will use hard links and references to the other repo instead of physically copying the data. By the way, 99M (git.git) + 68M (git-linked) = 167M (git-clean): a --bare repo plus a linked repo is the same size as a normal clone.
And it gets even better if you have multiple wikis. If you have a setup like this:

web/core.git
web/somewiki.com/
web/anotherwiki.com/

where core.git is a --bare repo and the other two are clones pointing to core.git, you will actually be SAVING space with git. In git that's 99M + 68M * 2 = 235M; in svn that's 147M * 2 = 294M. ;) Just two wikis and git is already taking less space than svn. And it goes in git's favor the more wikis you have:

1   = (git: 167M,   svn: 147M;   diff: -20M   | 113.6%)
2   = (git: 235M,   svn: 294M;   diff: 59M    | 79.9%)
5   = (git: 439M,   svn: 735M;   diff: 296M   | 59.7%)
10  = (git: 779M,   svn: 1470M;  diff: 691M   | 53.0%)
50  = (git: 3499M,  svn: 7350M;  diff: 3851M  | 47.6%)
100 = (git: 6899M,  svn: 14700M; diff: 7801M  | 46.9%)
500 = (git: 34099M, svn: 73500M; diff: 39401M | 46.4%)
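A sketch of that layout, with illustrative paths and the same assumed repository URL:

$ cd web
$ git clone --bare https://gerrit.wikimedia.org/r/p/mediawiki/core.git core.git
$ git clone core.git somewiki.com      # local clone: objects hard-linked, not copied
$ git clone core.git anotherwiki.com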
OK, so you don't offer rsync. And the closest I can get to

$ du -sh med*
17M  mediawiki-1.18.2.tar.gz
72M  mediawiki-1.18.2

is your 99M product, which I hope you could document on http://www.mediawiki.org/wiki/Download_from_Git . Apparently it involves a --bare option.
And how might one update it occasionally without accruing things I will not need? I.e., I just want to do an rsync.
Let's assume:

72M mediawiki-1.23
72M mediawiki-1.24
72M mediawiki-1.25
I guess I will just have to accept some bloat with git here; there seems to be no way to get the same content without using git in a way it was not intended.
P.S., I use http://www.mediawiki.org/wiki/Manual:Wiki_family#Ultimate_minimalist_solutio... so each additional wiki is just a symlink.
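Under that minimalist scheme, adding a wiki is roughly just the following (paths hypothetical; the shared LocalSettings.php then dispatches per wiki):

$ ln -s /var/www/w/mediawiki /var/www/w/anotherwiki.example.org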
On Tue, Mar 27, 2012 at 10:05 AM, jidanni@jidanni.org wrote:
> OK, so you don't offer rsync. And the closest I can get to
>
> $ du -sh med*
> 17M  mediawiki-1.18.2.tar.gz
> 72M  mediawiki-1.18.2
Well, if you want the vanilla install without the ability to update via version control, of course that's going to be smaller. That's no different now from how it was under SVN.
> is your 99M product, which I hope you could document on http://www.mediawiki.org/wiki/Download_from_Git . Apparently it involves a --bare option.
No. Most people don't want --bare anyway...they're likely to get confused.
> And how might one update it occasionally without accruing things I will not need? I.e., I just want to do an rsync.
`git pull`
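I.e., from inside the directory the clone created (path assumed):

$ cd /var/www/mediawiki
$ git pull

Note that a pull does download the new history objects as well as the new file contents, so the repository grows a little more than an rsync would.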
> I guess I will just have to accept some bloat with git here; there seems to be no way to get the same content without using git in a way it was not intended.
The amount of "bloat" we're talking about here is between 20 and 30M, at most. I see zero problem here.
Would you rather have the initial conversions I did, that are around 6.5G of bloat? ;-)
-Chad
On 27/03/12 16:13, Chad wrote:
> Would you rather have the initial conversions I did, that are around 6.5G of bloat? ;-)
The ones with the nice Theora video of some fish in an aquarium?
;-D
A regular git clone.
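That is, something like this, with the same repository URL that is tried below:

$ git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git mediawiki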
-Chad

On Mar 27, 2012 10:34 PM, jidanni@jidanni.org wrote:
"C" == Chad innocentkiller@gmail.com writes:
C> No. Most people don't want --bare anyway...they're likely
C> to get confused.
So what do I want?
I know, I will use --depth 1!
$ git clone --depth 1 https://gerrit.wikimedia.org/r/p/mediawiki/core.git mediawiki
Cloning into 'mediawiki'...
error: RPC failed; result=22, HTTP code = 500
fatal: The remote end hung up unexpectedly

$ git clone --depth 2 https://gerrit.wikimedia.org/r/p/mediawiki/core.git mediawiki
Cloning into 'mediawiki'...
error: RPC failed; result=22, HTTP code = 500
fatal: The remote end hung up unexpectedly

$ git clone --depth 0 https://gerrit.wikimedia.org/r/p/mediawiki/core.git mediawiki
Cloning into 'mediawiki'...
remote: Counting objects: 356168, done
remote: Finding sources: 100% (356168/356168)
Receiving objects:   8% (29137/356168), 6.38 MiB | 112 KiB/s
I don't really know what --depth 0 will get me. Fingers crossed.
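One way to find out afterwards: a shallow clone records its cut-off commits in .git/shallow, so if that file is absent the clone has the full history (which the full object count above already hints at):

$ test -f mediawiki/.git/shallow && echo shallow || echo full history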