


Git clone by default optimizes for bandwidth. Since git clone, by default, does not mirror all branches (see --mirror), it would not make sense to just dump the pack-files as-is, because that would possibly send way more than required. (Relatedly, "git fetch --all" in a git bare repository doesn't synchronize local branches to the remote ones.)

Dumb copy

As mentioned, you could just copy a repository with "dumb" file transfer. This will certainly not waste time compressing, repacking, deltifying and/or filtering. Plus, you will get:

- config (remotes, push branches, settings (whitespace, merge, aliases, user details etc.))
- stashes (see "Can I fetch a stash from a remote repo into a local branch?"; the linked posts contain links to other relevant SO posts)
- backups (from filter-branch, e.g.) and various other things (intermediate state from rebase, bisect etc.)

This may or may not be what you require, but it is nice to be aware of the fact.

Bundles

When distributing to a truly big number of clients, consider using bundles. If you want a fast clone without the server-side cost, the git way is bundle create:

git bundle create snapshot.bundle --all # (or mention specific ref names instead of --all)

and distribute the snapshot bundle instead, without the server even being involved. On the receiving end, just:

git clone snapshot.bundle myclonedir/

That's the best of both worlds, while of course you won't get the items from the bullet list above. Note that --all includes more than a simple git clone would, so consider naming specific refs instead.

Compression

You can look at lowering server load by reducing/removing compression. Have a look at these config settings (I assume pack.compression may help you lower the server load):

core.compression
An integer -1..9, indicating a default compression level. If set, this provides a default to other compression variables, such as core.loosecompression and pack.compression.

core.loosecompression
An integer -1..9, indicating the compression level for objects that are not in a pack file. If that is not set, defaults to 1 (best speed).

pack.compression
An integer -1..9, indicating the compression level for objects in a pack file. If that is not set, defaults to -1, the zlib default, which is "a default compromise between speed and compression (currently equivalent to level 6)." 0 means no compression, and 1..9 are various speed/size tradeoffs, 9 being slowest.
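These compression settings can be applied with plain git config. A minimal sketch, assuming a throwaway repository stands in for the real server-side repo (the mktemp setup is illustrative, not from the original answer):

```shell
#!/bin/sh
set -e
# Illustrative repo; on a real server you would run this inside the hosted repo.
repo=$(mktemp -d)
git init -q "$repo"

# Trade pack size for CPU: turn compression down to lower server load.
git -C "$repo" config core.compression 1       # default for the other two
git -C "$repo" config core.loosecompression 1  # loose objects: fastest
git -C "$repo" config pack.compression 1       # packed objects: fastest
# 0 would disable compression entirely; 9 is smallest output but slowest.

git -C "$repo" config pack.compression         # read back the stored value
```

Setting core.compression alone would be enough, since the other two fall back to it; they are shown explicitly here only to mirror the three settings discussed above.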
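The bundle workflow above can be sketched end to end. The repository setup and file names here are illustrative; only the bundle create and clone commands come from the answer itself:

```shell
#!/bin/sh
set -e
# Illustrative source repository (stands in for your real repo).
repo=$(mktemp -d)
git init -q "$repo"
git -C "$repo" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "initial commit"

# Snapshot all refs into a single file; no server is needed afterwards.
bundle=$(mktemp -d)/snapshot.bundle
git -C "$repo" bundle create "$bundle" --all

# On the receiving end: clone straight from the bundle file.
git clone -q "$bundle" "$(mktemp -d)/myclonedir"
```

The bundle file can then be distributed by any means you like (HTTP, rsync, physical media), which is what takes the load off the server.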
On the general question: Git is generally considered blazingly fast. For comparison, you should try cloning a full repo from darcs, bazaar, hg (god forbid: TFS or subversion…). Also, if you routinely clone full repos from scratch, you'd be doing something wrong anyway: you can always just git remote update and get incremental changes. For various other ways to keep full repos in sync, see e.g. the posts linked above.
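The incremental-update approach is a one-liner once a clone exists. A sketch, with an illustrative origin/clone pair set up just for demonstration:

```shell
#!/bin/sh
set -e
# Illustrative origin repository and an existing clone of it.
origin=$(mktemp -d)
git init -q "$origin"
git -C "$origin" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "first"
clone=$(mktemp -d)/clone
git clone -q "$origin" "$clone"

# New work appears upstream...
git -C "$origin" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "second"

# ...and an incremental fetch picks up only the changes, no re-clone needed.
git -C "$clone" remote update
git -C "$clone" log --oneline origin/HEAD
```

git remote update fetches from all configured remotes; for a single remote, a plain git fetch does the same job.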
