Offsite backups - possible with large amounts of code/source images etc?

The biggest hurdle I have in developing an effective backup strategy is being able to do some sort of offsite backup. Unfortunately, this can only be done by uploading data to the offsite destination, and my internet connection's upload speed makes that prohibitive.

Has anyone here managed to do offsite backups of large libraries of source code?

This is only relevant to home users, not to workplaces where bigger budgets may open up other options.

EDIT: I am using Windows Vista (so *nix solutions aren't relevant).

Thanks


I don't think your connection's upload speed will be as prohibitive as you think. Just make sure you look for a solution where your changes can be sent as diffs. Even if your initial sync takes days, daily changes would likely be more manageable.

Knowing a few more specifics about how much data you are talking about, and exactly how slow your connection is, would allow the community to make more specific suggestions.
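
If you want to sanity-check whether an initial sync is even feasible on your line, a quick back-of-the-envelope estimate helps. This is a minimal sketch; the repository size and upload speed below are made-up placeholders, substitute your own numbers:

```python
# Rough feasibility check for an initial offsite sync.
# repo_size_gb and upload_speed_kbps are placeholder assumptions.

repo_size_gb = 2.0          # total data to upload, in gigabytes (assumed)
upload_speed_kbps = 256     # upload bandwidth in kilobits per second (assumed)

size_bits = repo_size_gb * 1024 ** 3 * 8          # gigabytes -> bits
speed_bits_per_sec = upload_speed_kbps * 1000     # kilobits/s -> bits/s

hours = size_bits / speed_bits_per_sec / 3600
print(f"Initial sync of {repo_size_gb} GB at {upload_speed_kbps} kbit/s: ~{hours:.1f} hours")
```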


Services like Mozy allow you to back up large amounts of data offsite.

They upload slowly in the background, and getting the initial sync to the servers can take a while depending on your speed and amount of data, but after that they use efficient diffs to keep the stored data in sync.
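
To illustrate why the ongoing uploads stay small once the initial sync is done: only files that actually changed need to be re-sent. This is not Mozy's actual mechanism, just a sketch of the general idea; the source directory and manifest filename are assumptions:

```python
# Sketch: detect which files changed since the last backup by comparing
# content hashes, so only changed files would need re-uploading.
import hashlib
import json
import os

SOURCE_DIR = r"C:\projects"          # assumed location of the code to back up
MANIFEST = "backup_manifest.json"    # assumed name for the local hash manifest

def hash_file(path):
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Load hashes recorded by the previous run, if any.
previous = {}
if os.path.exists(MANIFEST):
    with open(MANIFEST) as f:
        previous = json.load(f)

current = {}
changed = []
for root, _dirs, files in os.walk(SOURCE_DIR):
    for name in files:
        path = os.path.join(root, name)
        digest = hash_file(path)
        current[path] = digest
        if previous.get(path) != digest:
            changed.append(path)   # new or modified since the last backup

print(f"{len(changed)} of {len(current)} files would need uploading")

with open(MANIFEST, "w") as f:
    json.dump(current, f)
```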

The pricing is very home-friendly too.


I think you have to define what "backup" means and what's acceptable to you.

At my house, I keep a hot backup of our repositories: I poll svn once an hour over the VPN and it pulls down any check-ins. This is just to catch any check-ins that would otherwise not be captured until the normal 24-hour backup. I also send a full backup every 2 days through the pipe, so it sits outside the normal 3-tier backup we do at the office. Our current repository is 2GB zipped at max compression. That takes 34 hrs at 12 k/s and 17 hrs at 24 k/s; you did not say the speed of your connection, so it's hard to judge whether that's workable.
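
One way to do the hourly "poll svn" step from a home machine is to keep a local read-only mirror with svnsync and trigger it from Windows Task Scheduler. This is a sketch under assumptions, not necessarily how the setup above works; the mirror path and repository URL are placeholders:

```python
# Hourly pull of new revisions into a local mirror repository.
# One-time setup (shown for reference, run once):
#   svnadmin create C:\backups\svn-mirror
#   (add a pre-revprop-change hook on the mirror that exits 0)
#   svnsync initialize file:///C:/backups/svn-mirror https://example.com/svn/repo
import subprocess

MIRROR_URL = "file:///C:/backups/svn-mirror"   # assumed local mirror repository

def pull_latest():
    # Fetches any revisions committed since the last run.
    subprocess.run(["svnsync", "synchronize", MIRROR_URL], check=True)

if __name__ == "__main__":
    pull_latest()
```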

If this isn't viable, you might want to invest in a couple of 2.5" USB drives and rotate them offsite to a safety deposit box at the bank. This used to be my responsibility, but I lacked the discipline to do it consistently each week to ensure some safety net. In the end it was just easier to live with uploading the data to an FTP site at my house.
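
For completeness, here is a sketch of the "zip it and push it to an FTP site" fallback mentioned above. The host, credentials, and paths are placeholder assumptions; the real workflow may differ:

```python
# Archive a directory and upload the zip to an offsite FTP server.
import ftplib
import os
import shutil
from datetime import date

REPO_DIR = r"C:\backups\svn-mirror"      # assumed directory to archive
FTP_HOST = "ftp.example.com"             # assumed offsite FTP server
FTP_USER = "backup"                      # assumed credentials
FTP_PASS = "changeme"

# Create e.g. repo-YYYY-MM-DD.zip in the current directory.
archive_base = f"repo-{date.today().isoformat()}"
archive_path = shutil.make_archive(archive_base, "zip", REPO_DIR)

with ftplib.FTP(FTP_HOST, FTP_USER, FTP_PASS) as ftp:
    with open(archive_path, "rb") as f:
        ftp.storbinary(f"STOR {os.path.basename(archive_path)}", f)

print(f"Uploaded {archive_path} to {FTP_HOST}")
```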
