Backing up my database is taking too long

On a Windows Mobile unit, the software I'm working on relies on an SDF file as its database. The platform the software targets is "less than optimal" and hard resets every once in a while. In the distant past we lost data. Now we close the database and copy the SDF file to the SD card. If the unit gets hard reset, we restore the app (also on the SD card) and the database.

I'm not concerned about the restore (just yet). The problem we have now is that doing a "backup" takes a crazy amount of time because the SDF is 7+ megs and writing to the SD card is slow slow slow.

My boss suggested we create hashes of "chunks" of the file and then write to the destination file only when a compare of the hashes is !=.

So here's the question.

How would you test whether a file has changed if you can only have one copy of the file, and thus can't compare it with its original?

I'm just shooting for a bit of brainstorming.


Just store the hashes of your chunks somewhere. You don't need the "backup" copy to compare against if you know what your hashes are. Obviously this creates a chicken-and-egg problem for the very first backup, when no hashes exist yet, but after that copying a single changed "chunk" is a much smaller problem.

Your proposed approach will still have performance problems though, as hashing a large file isn't going to be a pretty operation on a slow CPU powered by a battery.

I assume you don't have granular enough control to keep track of which parts of the file you modify, and then update just those sections when you need to do a backup?
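
To make the chunk-hash idea concrete, here is a minimal sketch in Python. The real device code would be .NET Compact Framework or native C++, but the algorithm is the same. The file paths, chunk size, and hash-store format are all assumptions for illustration, not part of the original question.

```python
import hashlib
import json
import os

CHUNK_SIZE = 64 * 1024  # 64 KB chunks; tune for the device's memory and SD card speed


def load_hashes(path):
    """Return previously stored chunk hashes, or an empty dict on the first run."""
    if not os.path.exists(path):
        return {}
    with open(path, "r") as f:
        return {int(k): v for k, v in json.load(f).items()}


def backup_changed_chunks(source, dest, hash_store):
    """Copy only the chunks of `source` whose hash differs from the stored one."""
    old_hashes = load_hashes(hash_store)
    new_hashes = {}

    # First backup: create an empty destination; every chunk gets written below.
    if not os.path.exists(dest):
        open(dest, "wb").close()

    with open(source, "rb") as src, open(dest, "r+b") as dst:
        index = 0
        while True:
            chunk = src.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.md5(chunk).hexdigest()  # MD5 is fine for change detection
            new_hashes[index] = digest
            if old_hashes.get(index) != digest:
                dst.seek(index * CHUNK_SIZE)
                dst.write(chunk)  # only changed chunks hit the slow SD card
            index += 1
        dst.truncate(src.tell())  # handle the database shrinking between backups

    # Persist the hashes so the next backup has something to compare against.
    with open(hash_store, "w") as f:
        json.dump(new_hashes, f)


# Hypothetical paths for illustration only.
backup_changed_chunks("data.sdf", "/sdcard/data.sdf", "/sdcard/data.sdf.hashes")
```

Note that the hash store is kept alongside the backup, so a hard reset doesn't lose it; if it is lost, the next backup simply degrades to a full copy.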
