in DevOps and Agile by (30k points)

What is the most efficient mechanism (with respect to data transferred and disk space used) to get the contents of a single file from a remote git repository?

So far I've managed to come up with:

git clone --no-checkout --depth 1 git@github.com:foo/bar.git && cd bar && git show HEAD:path/to/file.txt

This still seems overkill.
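For reference, the shallow no-checkout approach can be exercised end to end against a local throwaway repository (the repository contents and paths below are illustrative, not from the question):

```shell
set -e
tmp=$(mktemp -d)

# Throwaway repository standing in for the remote.
git init -q "$tmp/bar"
mkdir -p "$tmp/bar/path/to"
echo "hello" > "$tmp/bar/path/to/file.txt"
git -C "$tmp/bar" add .
git -C "$tmp/bar" -c user.email=a@b.c -c user.name=demo commit -qm init

# Shallow clone with no working-tree checkout: only the latest commit's
# objects are transferred, and nothing is written to disk beyond .git.
git clone -q --no-checkout --depth 1 "file://$tmp/bar" "$tmp/clone"

# Read one file straight from the object database.
git -C "$tmp/clone" show HEAD:path/to/file.txt
```

The 'file://' URL forces git's network transport so that '--depth 1' takes effect; a plain local path would use hardlinking instead and ignore the depth limit.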

What about getting multiple files from the repo?

1 Answer

by (51.2k points)

In this case, you can use git archive's '--remote=<URL>' option, for example:

git archive --remote=git@github.com:foo/bar.git --prefix=path/to/ HEAD:path/to/ | tar xvf -

git archive produces a tar archive by default (or zip with '--format=zip'), so all you need is to pipe the output through tar to get the file content:

git archive --remote=git://git.foo.com/project.git HEAD:path/to/directory filename | tar -x

This command saves a copy of 'filename' from the HEAD of the remote repository into the current directory.

The ':path/to/directory' part is optional. When it is present, the archive is rooted at that directory, so the fetched file extracts directly into the current directory. If it is excluded, you pass the full path ('path/to/directory/filename') after HEAD, and the fetched file will be saved to <current working dir>/path/to/directory/filename.
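Since '--remote=' requires a server that permits archive requests, here is a runnable sketch of both the single-file and multiple-file cases against a local throwaway repository (the repository name and file contents are illustrative):

```shell
set -e
tmp=$(mktemp -d)

# Throwaway repository standing in for the remote.
git init -q "$tmp/project"
mkdir -p "$tmp/project/path/to/directory" "$tmp/project/docs"
echo "data" > "$tmp/project/path/to/directory/filename"
echo "notes" > "$tmp/project/docs/README"
git -C "$tmp/project" add .
git -C "$tmp/project" -c user.email=a@b.c -c user.name=demo commit -qm init

# Single file: rooting the archive at HEAD:path/to/directory makes
# 'filename' extract directly into the target directory.
mkdir -p "$tmp/single"
git -C "$tmp/project" archive HEAD:path/to/directory filename \
  | tar -x -C "$tmp/single"

# Multiple files: list several paths after the tree-ish; each one
# extracts with its full repository path preserved.
mkdir -p "$tmp/multi"
git -C "$tmp/project" archive HEAD path/to/directory/filename docs/README \
  | tar -x -C "$tmp/multi"
```

Listing several paths after the tree-ish answers the "multiple files" part of the question: one archive request transfers exactly the named files and nothing else.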
