When using Python, can SQLite be used as a production database to manage, say, 10,000 database files (each in a separate file, of size 500 MB)?
Only one thread will be used to write data to the database files (no concurrency).
Are there alternative libraries that would work better, faster, or more reliably?
You may want to look at the SQLite page titled "Appropriate Uses For SQLite" (https://www.sqlite.org/whentouse.html). To quote:
The basic rule of thumb for when it is appropriate to use SQLite is this: Use SQLite in situations where simplicity of administration, implementation, and maintenance are more important than the countless complex features that enterprise database engines provide. As it turns out, situations where simplicity is the better choice are more common than many people realize.
Another way to look at SQLite is this: SQLite is not designed to replace Oracle. It is designed to replace fopen().
If you are dealing with one SQLite database at a time, there is no limit on the number of database files you can handle. Just make sure you clean up properly (close each database connection) before you open the next database and you should see no problems whatsoever.
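A minimal sketch of that open-write-close cycle using Python's built-in sqlite3 module might look like the following; the file names and the schema are made up for illustration:

import sqlite3
from contextlib import closing

# Hypothetical list of database file paths; in the question's scenario
# there would be ~10,000 of these.
db_paths = ["data_0001.db", "data_0002.db"]

for path in db_paths:
    # closing() guarantees conn.close() runs even if the writes fail,
    # so each file is fully released before the next one is opened.
    with closing(sqlite3.connect(path)) as conn:
        with conn:  # one transaction: commits on success, rolls back on error
            conn.execute(
                "CREATE TABLE IF NOT EXISTS events "
                "(id INTEGER PRIMARY KEY, payload TEXT)"
            )
            conn.execute("INSERT INTO events (payload) VALUES (?)", ("example",))

Note that `with conn:` only manages the transaction; it does not close the connection, which is why the `closing()` wrapper is there.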
Opening one database at a time makes this no different from using just one database file. SQLite is a great format; I have yet to see any integrity issues, and I've abused it quite extensively, including rsyncing an updated database into place before re-opening it (the overwritten database was only ever read from) and performing a complete clear and rebuild from a second process (wrapped in one big transaction; again, the first process only ever read from it). The one thing you shouldn't do is store the database on a network share.
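The clear-and-rebuild pattern wrapped in one big transaction could look roughly like this; the table name and row source are hypothetical:

import sqlite3

def rebuild(path, rows):
    """Clear and repopulate a table atomically: because everything runs
    inside one transaction, a reader in another process sees either the
    old contents or the new, never a half-rebuilt mix."""
    conn = sqlite3.connect(path)
    try:
        with conn:  # wraps DELETE + INSERTs in a single transaction
            conn.execute("DELETE FROM events")
            conn.executemany(
                "INSERT INTO events (id, payload) VALUES (?, ?)", rows
            )
    finally:
        conn.close()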
As for size limits, take a look at http://www.sqlite.org/limits.html; the SQLite project takes its testing seriously, and the documented limits are exercised by that testing. With a maximum BLOB size of 2 GB, that means their test suite covers databases at least that large, so databases of up to 500 MB should be a breeze to deal with.
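If you want to sanity-check a file's size against those limits from Python, the `page_count` and `page_size` pragmas report the database's size as SQLite itself sees it (the path below is a placeholder):

import sqlite3

def db_size_bytes(path):
    conn = sqlite3.connect(path)
    try:
        # page_count * page_size = total size of the main database file
        (pages,) = conn.execute("PRAGMA page_count").fetchone()
        (page_size,) = conn.execute("PRAGMA page_size").fetchone()
        return pages * page_size
    finally:
        conn.close()

print(db_size_bytes("data_0001.db") / 1024 ** 2, "MB")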