I have a large number of files that I need to back up. The problem is that there isn't enough disk space to create a tar file of them and then upload it offsite. Is there a way of using Python, PHP or Perl to tar up a set of files and upload them on the fly, without writing a tar file to disk? They are also far too large to hold in memory.
I always do this just via ssh:
tar czf - FILES/* | ssh me@someplace "tar xzf -"
This way, the files end up unpacked on the other machine. Alternatively,
tar czf - FILES/* | ssh me@someplace "cat > foo.tgz"
puts them into a single archive on the other machine, which is what you actually wanted.
You can pipe the output of tar over ssh:
tar zcvf - testdir/ | ssh user@domain.com "cat > testdir.tar.gz"
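Since the question asked for Python specifically: the same pipeline can be built with the standard library's `tarfile` in streaming mode plus `subprocess` for the ssh leg. This is a minimal sketch; the host `me@someplace` and remote filename `foo.tgz` are placeholders, and `tar_over_ssh` assumes passwordless (key-based) ssh login.

```python
import subprocess
import tarfile

def stream_tar(paths, fileobj):
    """Stream a gzipped tar of `paths` into any writable binary file
    object, so the archive never touches the local disk."""
    # Mode "w|gz" selects tarfile's non-seekable streaming mode: each
    # file is read, compressed, and written out in chunks, so the whole
    # archive is never held in memory either.
    with tarfile.open(mode="w|gz", fileobj=fileobj) as tar:
        for path in paths:
            tar.add(path)

def tar_over_ssh(paths, host, remote_path):
    """Pipe the archive straight into ssh, mirroring the shell one-liners
    above. `host` and `remote_path` are whatever you would pass to ssh."""
    proc = subprocess.Popen(["ssh", host, "cat > " + remote_path],
                            stdin=subprocess.PIPE)
    stream_tar(paths, proc.stdin)
    proc.stdin.close()
    return proc.wait()
```

Calling `tar_over_ssh(["FILES"], "me@someplace", "foo.tgz")` is then equivalent to the second ssh one-liner. Perl and PHP can do the same thing by writing the tar stream to a pipe opened on an `ssh` process.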