
Is it possible to take a large number of files, tar/gzip them, and stream them on-the-fly?

https://www.devze.com 2023-03-03 15:58 · Source: web

I have a large number of files which I need to back up. The problem is that there isn't enough disk space to create a tar file of them and then upload it offsite. Is there a way of using Python, PHP, or Perl to tar up a set of files and upload them on-the-fly, without making a tar file on disk? They are also far too large to store in memory.
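Yes. In Python, for instance, the standard-library `tarfile` module has a streaming mode that writes a gzipped archive sequentially into any writable file object, such as the stdin of an `ssh` subprocess, so the archive never touches the local disk. A minimal sketch (the function name is illustrative, not from the thread):

```python
import tarfile

def stream_tar_gz(paths, fileobj):
    """Write a gzipped tar of `paths` sequentially into `fileobj`."""
    # Mode "w|gz" is tarfile's streaming mode: members are written
    # strictly in order with no seeking, which is exactly what a pipe
    # requires, and only one file's worth of data is in flight at a time.
    with tarfile.open(fileobj=fileobj, mode="w|gz") as tar:
        for path in paths:
            tar.add(path)
```

To upload while archiving, `fileobj` could be the stdin pipe of a subprocess such as `ssh user@host "cat > backup.tgz"`, so compression and upload happen concurrently.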


I always do this just via ssh:

tar czf - FILES/* | ssh me@someplace "tar xzf -"

This way, the files all end up unpacked on the other machine. Alternatively,

tar czf - FILES/* | ssh me@someplace "cat > foo.tgz"

puts them in an archive on the other machine, which is what you actually want.
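The same pipeline can also be driven from Python with `subprocess`, which is handy if the file list is computed programmatically. A sketch assuming a generic producer/consumer pair (the helper name is mine, not from the answer):

```python
import subprocess

def pipe(producer, consumer):
    """Run `producer | consumer` and return the consumer's exit status."""
    p1 = subprocess.Popen(producer, stdout=subprocess.PIPE)
    p2 = subprocess.Popen(consumer, stdin=p1.stdout)
    p1.stdout.close()  # let the producer get SIGPIPE if the consumer exits early
    p2.wait()
    p1.wait()
    return p2.returncode

# e.g. pipe(["tar", "czf", "-", "FILES/"],
#           ["ssh", "me@someplace", "cat > foo.tgz"])
```

Nothing is buffered by the Python process itself; the kernel pipe connects tar's stdout directly to ssh's stdin, just as the shell pipeline does.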


You can pipe the output of tar over ssh:

tar zcvf - testdir/ | ssh user@domain.com "cat > testdir.tar.gz"
