
PHP/MySQL and Doctrine Best Practices for Multi-Insert

devze.com (https://www.devze.com) 2023-03-16 17:56, source: web

StackOverflow has answered a lot of questions before, but I'm looking for a more specific one that I couldn't find an exact answer to.

I'm currently using PHP/MySQL with Doctrine 1.2.2 and I'm looking to improve my script. It adds about 300 entries to the database every 5 minutes. Right now the script uses a raw MySQL multi-insert, not Doctrine.

I was wondering which approach is better: faster and using fewer resources. Thanks in advance.

First Edit and Testing

OK guys, first things first :)

I first tried using only MySQL statements. Fabrizio suggested using INSERT DELAYED, but that isn't available since I'm using InnoDB.

I ran some tests using raw MySQL statements, running each of my jobs three times. The first job averaged 53 seconds, using 6.25 MB (remember that I load 138 pages here).

The second job averaged 3 minutes and 30 seconds, using 4.25 MB (remember that I load 510 pages here).

The third job averaged 3 minutes and 9 seconds, using 4.75 MB (remember that I load 410 pages here).

I was able to convert the first job to Doctrine, running in about the same time (53 seconds) but using 9.75 MB. On the second and third jobs I had a problem: Doctrine does not support ON DUPLICATE KEY, and using replace() makes me lose previously added data, because replace() actually deletes the old row and inserts a new one, which breaks the foreign keys pointing at it.
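One way around the missing ON DUPLICATE KEY support is to build the upsert statement by hand and run it as raw SQL through Doctrine's underlying connection (e.g. via its execute() method). A minimal sketch, with a made-up `page` table and addslashes() standing in for a real connection-aware escaper:

```php
<?php
// Hypothetical helper: builds an INSERT ... ON DUPLICATE KEY UPDATE
// statement from an associative row. Table and column names here are
// illustrative, not from the original script, and addslashes() is only
// a placeholder for proper connection-based escaping.
function buildUpsert(string $table, array $row, array $updateCols): string
{
    $cols = implode(',', array_map(function ($c) {
        return "`$c`";
    }, array_keys($row)));

    $vals = implode(',', array_map(function ($v) {
        return "'" . addslashes((string) $v) . "'";
    }, $row));

    // On key collision, overwrite only the listed columns with the new values.
    $updates = implode(',', array_map(function ($c) {
        return "`$c`=VALUES(`$c`)";
    }, $updateCols));

    return "INSERT INTO `$table` ($cols) VALUES ($vals) ON DUPLICATE KEY UPDATE $updates";
}

$sql = buildUpsert('page', ['url' => '/home', 'hits' => 1], ['hits']);
// INSERT INTO `page` (`url`,`hits`) VALUES ('/home','1')
//   ON DUPLICATE KEY UPDATE `hits`=VALUES(`hits`)
```

Unlike replace(), this updates the existing row in place, so rows referencing it via foreign keys survive.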

Based on these tests, I will stick with MySQL multi-insert queries. Any more ideas to test?

Thank you all :)


Why don't you check for yourself?

Try benchmarking both solutions with a simple time() comparison before and after SQL execution. There's no better way of knowing which is faster; it all depends on your query, database structure, and number of joins (at least those factors, and probably more).

For memory usage, you can try using memory_get_usage.
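The timing-and-memory suggestion above can be sketched as a small wrapper; the closure passed in is a placeholder for the actual insert job, not the original script:

```php
<?php
// Minimal benchmarking sketch: measure wall-clock time and memory delta
// around an arbitrary piece of work. microtime(true) gives a float in
// seconds; memory_get_usage() gives bytes currently allocated.
function benchmark(callable $job): array
{
    $memBefore  = memory_get_usage();
    $timeBefore = microtime(true);

    $job();

    return [
        'seconds' => microtime(true) - $timeBefore,
        'bytes'   => memory_get_usage() - $memBefore,
    ];
}

// Usage: wrap each variant (raw multi-insert vs. Doctrine) in a closure
// and compare the results.
$result = benchmark(function () {
    usleep(1000); // placeholder for the real 300-row insert
});
printf("%.4f s, %d bytes\n", $result['seconds'], $result['bytes']);
```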


The best way to do this is using MySQL batching. You can do this two ways:

1) Build one long SQL string with all the inserts in a single statement. More detail here: insert multiple rows via a php array into mysql

2) Use prepared statements: prepare one insert statement and execute it repeatedly with different params. http://php.net/manual/en/mysqli.prepare.php
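Option 1 boils down to collapsing an array of rows into one VALUES list. A sketch, where the escaper callable is an assumption (in real code pass the connection's real_escape_string; addslashes() below is for illustration only):

```php
<?php
// Hypothetical helper for option 1: turn a 2-D array of rows into a
// single multi-row INSERT statement.
function buildMultiInsert(string $table, array $columns, array $rows, callable $escapeFn): string
{
    $tuples = [];
    foreach ($rows as $row) {
        $values = array_map(function ($v) use ($escapeFn) {
            return "'" . $escapeFn((string) $v) . "'";
        }, $row);
        $tuples[] = '(' . implode(',', $values) . ')';
    }

    return sprintf(
        'INSERT INTO `%s` (%s) VALUES %s',
        $table,
        implode(',', array_map(function ($c) { return "`$c`"; }, $columns)),
        implode(',', $tuples)
    );
}

$sql = buildMultiInsert(
    'myTable',
    ['field1', 'field2'],
    [['val1', 'val2'], ['val11', 'val22']],
    'addslashes' // placeholder escaper, NOT safe for production
);
// INSERT INTO `myTable` (`field1`,`field2`) VALUES ('val1','val2'),('val11','val22')
```

For option 2, the mysqli manual page linked above shows the prepare/bind_param/execute loop; the gain there is that MySQL parses the statement only once.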


Self-benchmarking is the final answer, but keep in mind that the more round-trips you make between PHP and MySQL, the more time you spend executing queries.

I would suggest an extended insert, something along the lines of

INSERT DELAYED INTO `myTable` (field1, field2) VALUES ('val1','val2'),('val11','val22')

If the table is big and not queried often, you may want to lock the table before the inserts and unlock it after.

You may also want to take a look at bulk_insert_buffer_size; if its value is too small it might slow down your inserts.
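The locking pattern and the buffer tweak above look roughly like this (table name and size are illustrative; also note the MySQL manual scopes bulk_insert_buffer_size to MyISAM tables, so it may not help on InnoDB):

```sql
-- Hold a write lock for the duration of the batch, then release it.
LOCK TABLES `myTable` WRITE;
INSERT INTO `myTable` (field1, field2) VALUES ('val1','val2'),('val11','val22');
UNLOCK TABLES;

-- Per-session tweak; 16 MB here is an arbitrary example value.
SET SESSION bulk_insert_buffer_size = 16 * 1024 * 1024;
```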

Again, you will probably need to do some testing of your own on this one.

