
mysqldump doing a partial backup - incomplete table dump

https://www.devze.com 2023-02-24 22:34 (source: web)
I have a database of about 6GB in size and it has a table with 12.6 million rows. I tried to export the database into a SQL dump by:


mysqldump -u root -p db_name > db_name.sql

When the command finishes, the exported SQL dump file is only about 2GB, and only about 1 million rows of the main table were exported.

What could possibly be wrong?
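One quick sanity check before digging deeper: a successful mysqldump run ends the file with a "-- Dump completed" comment, so a truncated dump will be missing it. A minimal sketch (db_name.sql stands in for the real dump; the demo writes a tiny fake file rather than touching a server):

```shell
# Report whether a dump file ends with mysqldump's completion marker.
check_dump() {
    if tail -n 1 "$1" | grep -q 'Dump completed'; then
        echo "dump looks complete"
    else
        echo "dump appears truncated"
    fi
}

# Demo with a tiny stand-in file in place of a real 2GB dump:
printf -- '-- MySQL dump 10.13\n-- Dump completed on 2023-02-24 22:34:00\n' > db_name.sql
check_dump db_name.sql   # prints "dump looks complete"
```

If the marker is missing, the dump was cut off mid-write, which points at a size limit or an error rather than mysqldump finishing normally.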


There is a 2GB file-size limit somewhere (often the filesystem); the easiest way to get around it is to pipe the dump through split:

mysqldump ... | split -b 250m - filename.sql-

You can also compress the files like this:

mysqldump ... | gzip -9c | split -b 250m - filename.sql.gz-

To restore from a non-compressed file, do this:

cat filename.sql-* | mysql ...

For a compressed file:

cat filename.sql-* | zcat | mysql ...

Of course if you want a single file, you can then tar the result.

Obviously you can replace the 250m with a different size if you wish.
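The split-and-reassemble round trip above can be sketched without touching a real database by substituting any stream for mysqldump (file names are illustrative, and the chunk size is shrunk to 1k so the demo actually produces several pieces):

```shell
# Stand-in for mysqldump output (about 3.9k of text):
seq 1 1000 > original.sql

# Same pattern as the answer, with a tiny chunk size for the demo:
split -b 1k original.sql chunk.sql-

# Reassembly: the glob expands chunk.sql-aa, chunk.sql-ab, ... in the
# same lexical order split created them, so cat restores the original.
cat chunk.sql-* > restored.sql
cmp -s original.sql restored.sql && echo "round trip OK"   # prints "round trip OK"
```

The same ordering guarantee is what makes `cat filename.sql-* | mysql ...` safe: split's alphabetic suffixes sort in creation order.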


Your filesystem is probably limited to 2GB files.


This can happen because some environments impose a size limit on dump files; the dump stops once it hits that limit.

If you run into this, compress the output (with zip, gzip, etc.) as you dump the data instead of writing it uncompressed.


I had similar, though all the tables were exported up to a certain point.

I'd removed a column on which an old, redundant view depended, and mysqldump quietly choked trying to export that view.
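If a dropped column has broken a view like this, two mysqldump options can help: --force keeps dumping past errors, and --ignore-table leaves an object out entirely. A sketch with hypothetical names (db_name, broken_view; -p will prompt for the password):

```shell
# --force logs errors (such as a view whose base column was dropped)
# and keeps dumping instead of stopping silently:
mysqldump -u root -p --force db_name > db_name.sql

# Or leave the broken view out of the dump entirely
# (--ignore-table takes db.object form):
mysqldump -u root -p --ignore-table=db_name.broken_view db_name > db_name.sql
```

Either way, running mysqldump without redirecting stderr will show which object it tripped over.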
