I have a table that needs regular updating, and the updates arrive in batches. Unlike with INSERT, I can't just include multiple rows in a single UPDATE query. What I do now is prepare the UPDATE statement once, then loop through the batch and execute the statement for each row. The preparation happens only once, but there are still a lot of executions.
I created several versions of the table at different sizes (thinking that better indexing or splitting the table might help), but that had no effect on update times: 100 updates take about 4 seconds whether the table has 1,000 rows or 500,000.
Is there a smarter, faster way to do this?
As requested in the comments, here is the actual code (PHP) I have been testing with. Column 'id' is the primary key.
$stmt = $dblink->prepare("UPDATE my_table SET col1 = ?, col2 = ? WHERE id = ?");
// bind_param binds by reference, so the assignments inside the loop
// are picked up by each execute()
$rc = $stmt->bind_param("ssi", $c1, $c2, $id);
foreach ($items as $item) {
    $c1 = $item['c1'];
    $c2 = $item['c2'];
    $id = $item['id'];
    $rc = $stmt->execute();
}
$stmt->close();
If you really want to do it all in one big statement, a kludgy way would be to use the "on duplicate key" functionality of the INSERT statement, even though all the rows should already exist; the duplicate-key update will fire for every single row.
INSERT INTO my_table (a,b,c) VALUES (1,2,3),(4,5,6)
ON DUPLICATE KEY UPDATE a=VALUES(a), b=VALUES(b), c=VALUES(c);
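Applied to the question's schema, the statement can be built dynamically from the batch. A minimal sketch, reusing the question's $dblink and $items (this assumes PHP 5.6+ for argument unpacking into bind_param; for very large batches you may need to chunk the statement to stay under max_allowed_packet):

// Build one multi-row INSERT ... ON DUPLICATE KEY UPDATE covering the batch.
$placeholders = [];
$params = [];
$types = '';
foreach ($items as $item) {
    $placeholders[] = '(?, ?, ?)';
    $types .= 'iss';              // id is an int, col1/col2 are strings
    $params[] = $item['id'];
    $params[] = $item['c1'];
    $params[] = $item['c2'];
}
$sql = 'INSERT INTO my_table (id, col1, col2) VALUES '
     . implode(', ', $placeholders)
     . ' ON DUPLICATE KEY UPDATE col1 = VALUES(col1), col2 = VALUES(col2)';
$stmt = $dblink->prepare($sql);
$stmt->bind_param($types, ...$params);  // unpacking passes by reference (PHP 5.6+)
$stmt->execute();
$stmt->close();

Note that this needs INSERT privilege on the table, and MySQL reports 2 affected rows for each row the duplicate-key branch updates.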
Try LOAD DATA INFILE. It is much faster than individual MySQL INSERTs or UPDATEs, as long as you can get the data into a flat-file format.
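For updates specifically, one way to apply this is to bulk-load the batch into a staging table and then update by join. A sketch under stated assumptions: the staging-table name, column types, and file path are hypothetical, and LOCAL INFILE must be enabled on both client and server:

-- Staging table matching the columns being updated (hypothetical name/types).
CREATE TEMPORARY TABLE my_table_staging (
    id   INT PRIMARY KEY,
    col1 VARCHAR(255),
    col2 VARCHAR(255)
);

-- Bulk-load the flat file into the staging table.
LOAD DATA LOCAL INFILE '/tmp/batch.csv'
INTO TABLE my_table_staging
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(id, col1, col2);

-- Apply all updates in one statement via a join on the primary key.
UPDATE my_table t
JOIN my_table_staging s ON s.id = t.id
SET t.col1 = s.col1,
    t.col2 = s.col2;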