
Efficiently inserting/updating thousands of records: will a MySQL transaction help?

https://www.devze.com 2022-12-12 18:00 Source: web

I have to insert and then keep updating thousands of records every minute. Will a MySQL transaction help performance, as it does in SQLite3?

I carried out a test (inserting 1,300 records), but it showed the same result with InnoDB (which supports transactions) and MyISAM.

I'm using MySQLdb for Python.
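For reference, a minimal sketch of the insert-then-update pattern described above. It uses sqlite3 from the standard library as a stand-in, since MySQLdb needs a running server; the DB-API calls look the same with MySQLdb, except that its placeholder style is `%s` rather than `?`. The table and column names are made up for illustration.

```python
import sqlite3

# Stand-in for a MySQL connection; with MySQLdb this would be
# MySQLdb.connect(host=..., user=..., passwd=..., db=...).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, value REAL)")

# First pass: insert ~1,300 rows, as in the test described above.
for i in range(1300):
    conn.execute("INSERT INTO records (id, value) VALUES (?, ?)", (i, 0.0))
conn.commit()

# Every minute after that: update the same rows.
for i in range(1300):
    conn.execute("UPDATE records SET value = ? WHERE id = ?", (float(i), i))
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
```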


Transactions aren't about performance; they're about data integrity and consistency.

I think that transactions will slow things down, because MySQL will have to maintain transaction logs and rollback segments. But the benefit is that it will ensure that each INSERT will keep its integrity.

InnoDB maintains referential integrity; MyISAM does not. If you observed no difference, my guess is that you didn't have transaction boundaries set.
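"Transaction boundaries" here means committing once around the whole batch instead of once per statement. A sketch of the difference, again using sqlite3 as a runnable stand-in (an in-memory database understates the gap; on disk, each per-statement commit pays a sync to the journal/log, which is where one-transaction-per-batch wins). With MySQLdb, autocommit is off by default, so the boundary is simply wherever you call `conn.commit()`.

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v INTEGER)")

# No boundaries set: every INSERT becomes its own transaction.
t0 = time.perf_counter()
for i in range(1000):
    conn.execute("INSERT INTO t (id, v) VALUES (?, ?)", (i, i))
    conn.commit()  # one transaction per row
per_statement = time.perf_counter() - t0

# One explicit boundary around the whole batch.
t0 = time.perf_counter()
for i in range(1000, 2000):
    conn.execute("INSERT INTO t (id, v) VALUES (?, ?)", (i, i))
conn.commit()  # single commit = one transaction for all 1000 rows
batched = time.perf_counter() - t0

total = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
```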


Try batch inserts. I can't say anything about the subsequent updates, though.
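In DB-API terms, batch inserts means `executemany()`: one call per batch instead of one `execute()` per row. MySQLdb's `executemany()` additionally rewrites a `VALUES (...)`-style INSERT into a single multi-row statement, which cuts round trips to the server. A sketch with sqlite3 (same API shape; MySQLdb uses `%s` placeholders); table and column names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (id INTEGER PRIMARY KEY, value REAL)")

rows = [(i, i * 0.5) for i in range(1300)]

# One batched call instead of 1,300 individual INSERT statements.
conn.executemany("INSERT INTO metrics (id, value) VALUES (?, ?)", rows)
conn.commit()

# Subsequent updates can be batched the same way.
updates = [(v + 1.0, i) for i, v in rows]
conn.executemany("UPDATE metrics SET value = ? WHERE id = ?", updates)
conn.commit()

n = conn.execute("SELECT COUNT(*) FROM metrics").fetchone()[0]
```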


I don't know if this will help for MySQL, but in a large-scale Oracle system I was involved with at a previous job, the inserts/updates were done in batches of hundreds or thousands to maximize throughput. A lot of the grunt work was done server-side in PL/SQL.

