
How to update 400k rows on a production MySQL db without killing it

On our production server we need to split 900k images into different dirs and update 400k rows (MySQL with the InnoDB engine). I wrote a Python script which goes through the following steps:

  1. Select a small chunk of data from the db (10 rows)
  2. Make new dirs
  3. Copy the files to the created dirs and rename them
  4. Update the db (there are some triggers on update which will put load on the server)
  5. Repeat

My code:


import os, shutil
import database # database.py from tornado

LIMIT_START_OFFSET = 0
LIMIT_ROW_COUNT = 10
SRC_PATHS = ('/var/www/site/public/upload/images/',)
DST_PATH = '/var/www/site/public/upload/new_images/'

def main():
    offset = LIMIT_START_OFFSET
    while True:
        db = database.Connection(DB_HOST, DB_NAME, DB_USER, DB_PASSWD)
        db_data = db.query('''
            SELECT id AS news_id, image AS src_filename
            FROM emd_news
            ORDER BY id ASC
            LIMIT %s, %s''', offset, LIMIT_ROW_COUNT)
        offset = offset + LIMIT_ROW_COUNT
        news_images = get_news_images(db_data) # convert data to easy-to-use list
        make_dst_dirs(DST_PATH, [i['dst_dirname'] for i in news_images]) # make news dirs
        news_to_update = copy_news_images(SRC_PATHS, DST_PATH, news_images) # list of moved files
        db.executemany('''
            UPDATE emd_news
            SET image = %s
            WHERE id = %s
            LIMIT 1''', [(i['filename'], i['news_id']) for i in news_to_update])
        db.close()
        if not db_data: break

if __name__ == '__main__':
    main()

It's quite a simple task, but I'm a bit nervous about performance.

How can I make this script more efficient?

UPD: In the end I used the original script without any modifications. It took about 5 hours. It was fast in the beginning and very slow at the end.


I will add:

Why do you create a new connection in each loop and then close it?!

And maybe you can use db.autocommit(False), especially for the UPDATEs, and do a db.commit() every 100 rows or so.
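A rough sketch of what I mean, built on the original script. One assumption to flag: tornado's database.py wrapper does not expose autocommit/commit itself, so I go through db._db, which I'm assuming is the underlying MySQLdb connection:

def main():
    # open one connection for the whole run instead of one per chunk
    db = database.Connection(DB_HOST, DB_NAME, DB_USER, DB_PASSWD)
    db._db.autocommit(False)  # assumption: _db is the raw MySQLdb handle
    offset, pending = LIMIT_START_OFFSET, 0
    try:
        while True:
            db_data = db.query('''
                SELECT id AS news_id, image AS src_filename
                FROM emd_news
                ORDER BY id ASC
                LIMIT %s, %s''', offset, LIMIT_ROW_COUNT)
            if not db_data:
                break
            offset += LIMIT_ROW_COUNT
            news_images = get_news_images(db_data)
            make_dst_dirs(DST_PATH, [i['dst_dirname'] for i in news_images])
            news_to_update = copy_news_images(SRC_PATHS, DST_PATH, news_images)
            db.executemany('''
                UPDATE emd_news
                SET image = %s
                WHERE id = %s
                LIMIT 1''', [(i['filename'], i['news_id']) for i in news_to_update])
            pending += len(news_to_update)
            if pending >= 100:  # commit every ~100 rows, not every 10
                db._db.commit()
                pending = 0
        db._db.commit()  # flush the last partial batch
    finally:
        db.close()

Fewer commits means fewer log flushes, and with InnoDB's default settings the flush on every commit is where a lot of the time goes.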

And, like Alin Purcaru said, you should do some benchmarking as well.

Hope this can help :)


What I recommend:

  1. Add an isProcessed column to your table.
  2. Make your script work on a chunk of, say, 1k rows for the first run (of course select only rows that are not processed).
  3. Benchmark it.
  4. Adjust the chunk size if needed.
  5. Build another script that calls this one at intervals.

Don't forget to add some sleep time in both your scripts!

This will work if your change does not need to be continuous (and I don't think it has to be). If you have to do it all at once you should take your database offline while the script runs. A rough sketch of the chunked approach follows below.
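The sketch assumes the column was added beforehand with ALTER TABLE emd_news ADD COLUMN isProcessed TINYINT NOT NULL DEFAULT 0, reuses the helpers from the question, and the chunk size, sleep interval and the process_one_chunk name are just placeholders to adjust:

import time

CHUNK_SIZE = 1000   # 1k rows for the first run; adjust after benchmarking
SLEEP_SECONDS = 1   # placeholder pause so the server gets room to breathe

def process_one_chunk(db):
    # select only rows that are not processed yet
    db_data = db.query('''
        SELECT id AS news_id, image AS src_filename
        FROM emd_news
        WHERE isProcessed = 0
        ORDER BY id ASC
        LIMIT %s''', CHUNK_SIZE)
    if not db_data:
        return 0
    news_images = get_news_images(db_data)
    make_dst_dirs(DST_PATH, [i['dst_dirname'] for i in news_images])
    news_to_update = copy_news_images(SRC_PATHS, DST_PATH, news_images)
    # mark the rows as done in the same UPDATE that fixes the filename
    db.executemany('''
        UPDATE emd_news
        SET image = %s, isProcessed = 1
        WHERE id = %s
        LIMIT 1''', [(i['filename'], i['news_id']) for i in news_to_update])
    return len(db_data)

def main():
    db = database.Connection(DB_HOST, DB_NAME, DB_USER, DB_PASSWD)
    try:
        while process_one_chunk(db):
            time.sleep(SLEEP_SECONDS)
    finally:
        db.close()

A side benefit: filtering on isProcessed = 0 removes the ever-growing OFFSET, which is the usual reason such scripts get slower and slower towards the end.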


    db_data = db.query('''
        SELECT id AS news_id, image AS src_filename
        FROM emd_news
        ORDER BY id ASC
        LIMIT %s, %s''', offset, LIMIT_ROW_COUNT)
    # Why is there any code here at all? If there's no data, why proceed?
    if not db_data: break
