I am trying to transfer a large (~3GB) file between two Unix machines.
I can use scp or rsync, but sometimes the transfer is corrupted (I have to check manually). I can split the file into pieces, transfer them, checksum each piece, and recombine them on the other side, but this is tedious.
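For reference, the manual workflow is roughly the following (a sketch using GNU coreutils; the file and host names are placeholders):

    # on the source machine: split into 100 MB pieces and record checksums
    split -b 100m bigfile bigfile.part.
    sha256sum bigfile.part.* > bigfile.sha256
    scp bigfile.part.* bigfile.sha256 user@remote:/tmp/

    # on the destination machine: verify every piece, then recombine
    sha256sum -c bigfile.sha256 && cat bigfile.part.* > bigfile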
Is there a single command to correctly transfer a large file between two Unix machines? I want it to automatically checksum both copies, and keep redoing the transfer (or pieces thereof) until it gets all bytes across the wire correctly.
Sorry, but scp should not corrupt files. It runs over SSH on top of TCP/IP, both of which take care of correct data transfer. Maybe you should check for bad RAM or other hardware problems on your servers/clients.
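That said, if you want the transfer to verify itself and retry automatically, something along these lines should work (a sketch, assuming a reasonably recent rsync; the paths and host are placeholders):

    # retry until rsync exits successfully; --partial keeps interrupted
    # transfers so a retry can resume, and --checksum forces a full
    # content comparison when the destination file already exists
    until rsync --partial --checksum bigfile user@remote:/data/bigfile; do
        sleep 5
    done

    # independent end-to-end check: the two checksums should match
    sha256sum bigfile
    ssh user@remote sha256sum /data/bigfile

Note that rsync already verifies a whole-file checksum of everything it transfers and resends a file whose verification fails, so in practice the loop above should rarely have to iterate.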