
MySQL: 100000 updates

https://www.devze.com 2023-03-13 00:00 Source: web

Hi everyone,

I have to update about 100000 rows in my db from a file. The file looks like this:

update db set field='truc0' where id='0'
update db set field='truc1' where id='1'
...
update db set field='trucn' where id='n'

I parse the file and, for each line, send the statement to my db.

My problem is that the process is extremely slow. For example, after starting the process, 2% = 2064 statements = 900 seconds.

Do you know another process that does the same thing in a much better time?

Thanks a lot for your answers
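Before changing the SQL itself, it's worth checking how the statements are sent: issuing 100000 autocommitted statements pays a commit and a network round trip per row. A minimal sketch of the alternative, batching the parsed (value, id) pairs into one transaction with executemany (sqlite3 is used here only as a runnable stand-in for MySQL; a MySQL driver exposes the same pattern, and the table/column names simply follow the post):

```python
import sqlite3

# Stand-in database so the sketch is self-contained; with MySQL you
# would open a connection via your driver and use the same pattern.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE db (id TEXT PRIMARY KEY, field TEXT)")
conn.executemany("INSERT INTO db (id, field) VALUES (?, ?)",
                 [(str(i), "") for i in range(1000)])

# Hypothetical result of parsing the file: (new_value, id) pairs.
rows = [("truc%d" % i, str(i)) for i in range(1000)]

# One transaction + executemany instead of 1000 autocommitted statements.
with conn:
    conn.executemany("UPDATE db SET field = ? WHERE id = ?", rows)

print(conn.execute("SELECT field FROM db WHERE id = '42'").fetchone()[0])
# -> truc42
```

The same idea in raw SQL is `START TRANSACTION; ... COMMIT;` around the whole batch, so the server writes once instead of 100000 times.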


Something like this will surely go much faster, because it's a single query instead of n queries (in MySQL, string concatenation is done with CONCAT, which converts id to a string automatically):

update db set field = concat('truc', id);
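A runnable sketch of that single-statement approach (sqlite3 as a stand-in, where concatenation is `||`; in MySQL the equivalent expression is `CONCAT('truc', id)`, and the table/column names follow the post):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE db (id TEXT PRIMARY KEY, field TEXT)")
conn.executemany("INSERT INTO db (id) VALUES (?)",
                 [(str(i),) for i in range(5)])

# One statement computes field from id for every row at once.
conn.execute("UPDATE db SET field = 'truc' || id")

print(conn.execute("SELECT field FROM db WHERE id = '3'").fetchone()[0])
# -> truc3
```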


From your question it is not exactly clear if you want to update all rows in the db or just special ids that you read from a file.

In the latter case, try to read chunks from the file and group the ids together (this only works when every row in a chunk gets the same new value). E.g. if a file chunk contains the ids 3 4 5 6 7 8, you can collapse them into one query:

update db set field=.... where id between 3 and 8; 

(careful if you use varchar keys instead of integers though!)
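The grouping step can be sketched as a small helper that collapses a sorted list of integer ids into consecutive runs, each of which becomes one BETWEEN range (the function name and input list are illustrative, not from the post):

```python
def consecutive_ranges(ids):
    """Collapse integer ids into (lo, hi) runs of consecutive values,
    each usable as one 'WHERE id BETWEEN lo AND hi' clause."""
    ids = sorted(ids)
    ranges = []
    start = prev = ids[0]
    for i in ids[1:]:
        if i == prev + 1:
            prev = i  # still inside the current run
        else:
            ranges.append((start, prev))  # close the run
            start = prev = i
    ranges.append((start, prev))
    return ranges

print(consecutive_ranges([3, 4, 5, 6, 7, 8, 20, 21, 50]))
# -> [(3, 8), (20, 21), (50, 50)]
```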

If the ids are random, you might still get a performance boost by using the SQL IN clause:

update db set field=... where id in ('18','49','72','30','187');
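A sketch of batching random ids through a parameterized IN clause, again assuming all rows in a batch get the same value (sqlite3 as a runnable stand-in; the batch size and the 'new' value are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE db (id TEXT PRIMARY KEY, field TEXT)")
conn.executemany("INSERT INTO db (id, field) VALUES (?, ?)",
                 [(str(i), "old") for i in range(200)])

ids = ["18", "49", "72", "30", "187"]
batch_size = 3  # tiny for illustration; hundreds per batch is typical

with conn:
    for start in range(0, len(ids), batch_size):
        batch = ids[start:start + batch_size]
        # Build one placeholder per id, then bind them safely.
        placeholders = ",".join("?" * len(batch))
        conn.execute(
            "UPDATE db SET field = 'new' WHERE id IN (%s)" % placeholders,
            batch)

print(conn.execute(
    "SELECT count(*) FROM db WHERE field = 'new'").fetchone()[0])
# -> 5
```

Binding the ids as parameters (rather than pasting them into the string) avoids quoting bugs and SQL injection if the file contents are untrusted.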
