how to import large datasets to mysql - the right way?

I have a 400MB *.sql file which is a dump from a MySQL database. I now want to import this into a new database. I used phpMyAdmin, but it breaks after a few minutes because it cannot handle such a large dataset (the maximum upload file size is reached). So I did the following:

I logged in via SSH, selected the database with the mysql client, and ran the command: SOURCE mybackup.sql

The queries are running now, but the import has already been going for over three hours. Is something wrong, or does it really take that long?

Thanks, WorldSignia


mysql databaseName < mybackup.sql

That is as fast as it gets, I think.
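For example, a fuller form of that command, run over SSH, might look like this (the user name and database name are placeholders; -u and -p are the standard mysql client options):

mysql -u username -p databaseName < mybackup.sql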


If it's not done already, you can run SET autocommit=0 to turn off autocommit before you import. Otherwise InnoDB will COMMIT after every INSERT statement. It may help some.
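For example, a minimal sketch from inside the mysql client, reusing the SOURCE command and file name from the question:

-- defer commits so InnoDB does not flush after every INSERT
SET autocommit=0;
SOURCE mybackup.sql;
-- commit the whole import in one go
COMMIT;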

You can also disable unique checks and foreign key checks for the duration of the import; a combined sketch follows the link. See this link for more ideas:

http://dev.mysql.com/doc/refman/5.5/en/optimizing-innodb-bulk-data-loading.html
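Putting the two suggestions together, a hedged sketch of a full import session (autocommit, unique_checks, and foreign_key_checks are the standard MySQL session variables described on that page; remember to restore them afterwards):

-- speed up the bulk load: defer commits, skip per-row checks
SET autocommit=0;
SET unique_checks=0;
SET foreign_key_checks=0;

SOURCE mybackup.sql;

-- commit everything and restore normal behavior
COMMIT;
SET unique_checks=1;
SET foreign_key_checks=1;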
