Effect of a Large CoreData Store on Time Machine

devze.com https://www.devze.com 2023-02-11 23:16 Source: web
A project I'm working on potentially entails storing large amounts (e.g. ~5GB) of binary data in CoreData. I'm wondering if this would negatively impact the user's Time Machine backup. From reading the documentation it seems that CoreData's persistent store uses a single file (e.g. XML, SQLite DB, etc.), so it would seem to me that any time the user changes a piece of data in the datastore, Time Machine would copy the data store in its entirety to the backup drive.

Does CoreData offer a different datastore format that is more Time Machine friendly?

Or is there a better way to do this?


You can use configurations in your data model to separate the larger entities into a different persistent store. You will need to create the persistent store coordinator yourself, using addPersistentStoreWithType:configuration:URL:options:error: to add each store with the correct configuration.
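A minimal sketch of that setup, assuming a data model that defines two hypothetical configurations, "Metadata" and "Blobs" (the configuration names and file paths here are illustrative, not from the original answer):

```objc
#import <CoreData/CoreData.h>

NSManagedObjectModel *model = [NSManagedObjectModel mergedModelFromBundles:nil];
NSPersistentStoreCoordinator *psc =
    [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:model];

NSURL *dir = [[[NSFileManager defaultManager]
                 URLsForDirectory:NSApplicationSupportDirectory
                        inDomains:NSUserDomainMask] firstObject];
NSError *error = nil;

// Small, frequently edited entities go in one store file...
[psc addPersistentStoreWithType:NSSQLiteStoreType
                  configuration:@"Metadata"
                            URL:[dir URLByAppendingPathComponent:@"Metadata.sqlite"]
                        options:nil
                          error:&error];

// ...and the large binary entities in a separate store file, so editing
// the metadata store doesn't force Time Machine to recopy the multi-GB store.
[psc addPersistentStoreWithType:NSSQLiteStoreType
                  configuration:@"Blobs"
                            URL:[dir URLByAppendingPathComponent:@"Blobs.sqlite"]
                        options:nil
                          error:&error];
```

Each store file then changes (and gets re-backed-up) independently, which is the point of the split.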


To answer your question directly, the only thing I can think of is to put your Core Data store in a sparsebundle disk image, so only the changed bands would be backed up by Time Machine. But really, I think if you're trying to store this much data in SQLite/Core Data you'd run into other problems. I'd suggest you try using a disk-based database such as PostgreSQL.
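For reference, a sparse bundle can be created from the command line with `hdiutil`. This is a macOS-only sketch; the size, volume name, and path are assumptions for illustration:

```shell
# Create a sparse bundle whose backing storage is split into small "band"
# files; Time Machine backs up only the bands that actually changed.
hdiutil create -size 8g -type SPARSEBUNDLE -fs HFS+ \
    -volname "DataStore" \
    ~/Library/Application\ Support/MyApp/DataStore.sparsebundle

# Mount it, then point the Core Data store's URL at the mounted volume.
hdiutil attach ~/Library/Application\ Support/MyApp/DataStore.sparsebundle
```

The trade-off is that the app has to manage attaching/detaching the image itself, which is why a separate persistent store (or an external database) is usually the simpler route.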

