I'm going to have a need for an efficient remote change detection algorithm for backup of an ordinary filesystem.
The files are backed up to a remote machine and bandwidth is at a premium, so it's going to be difficult to compare files. I've looked into Remote Differential Compression and rsync, but I don't know which direction I should go from here. Which is more bandwidth efficient? What does commercial backup software do? Is there a standard algorithm everyone uses?
I found two very good articles on this:
Remote File Synchronization Single-Round Algorithms gives a very helpful explanation and comparison of the leading methods.
Algorithms for Low-Latency Remote File Synchronization goes into lots of technical detail on remote file synchronization based on set reconciliation techniques.
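To make the rsync-family approach concrete: the receiver sends per-block signatures (a weak checksum plus a strong hash) of its copy, and the sender scans its file for blocks the receiver already has, transmitting only literal bytes for the rest. Below is a minimal, simplified sketch of that idea in Python. It is not rsync's actual implementation (real rsync rolls the weak checksum incrementally instead of recomputing it at every offset, and uses larger blocks), and the function names are my own:

```python
import hashlib
import zlib

BLOCK = 4  # tiny block size for the demo; real tools use ~2-8 KiB


def signatures(data):
    """Per-block signatures of the receiver's copy: weak (adler32) + strong (md5)."""
    sigs = {}
    for i in range(0, len(data), BLOCK):
        block = data[i:i + BLOCK]
        sigs.setdefault(zlib.adler32(block), []).append(
            (hashlib.md5(block).hexdigest(), i))
    return sigs


def delta(new, sigs):
    """Scan the sender's file; emit ('copy', offset) for blocks the receiver
    already has, ('literal', byte) otherwise. A real implementation would
    maintain a rolling checksum here rather than rehashing each window."""
    ops, i = [], 0
    while i < len(new):
        block = new[i:i + BLOCK]
        match = None
        if len(block) == BLOCK and zlib.adler32(block) in sigs:
            strong = hashlib.md5(block).hexdigest()
            for s, off in sigs[zlib.adler32(block)]:
                if s == strong:
                    match = off
                    break
        if match is not None:
            ops.append(('copy', match))   # receiver copies from its own file
            i += BLOCK
        else:
            ops.append(('literal', new[i:i + 1]))  # byte must be transmitted
            i += 1
    return ops


def apply_delta(ops, old):
    """Receiver reconstructs the new file from its old copy plus the delta."""
    out = b""
    for kind, val in ops:
        out += old[val:val + BLOCK] if kind == 'copy' else val
    return out


old = b"the quick brown fox"
new = b"the quick red fox"
ops = delta(new, signatures(old))
assert apply_delta(ops, old) == new
```

The bandwidth win is that only the `literal` ops carry file data over the wire; `copy` ops are a few bytes each regardless of block size, which is why these schemes do well when files change in place.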