Long story from my weekend with an EC2 instance. I dumped the data I had massaged there into a 685MB dump file to move over to my server. It was early morning, so I didn't notice that I had the extended-insert option set. I no longer have access to the massaged data in the temporary database, since I figured I was covered with the dump file.
So now I have this dump file that includes a table with one extended insert spanning 17 million rows, and MySQL on my server chokes on it.
I did find a Perl script called mysqdump-convert.pl that converts extended inserts into simple ones. I piped its output into a new file, but it appears to have choked after 144MB of the 685MB task.
Is there any other tool you know of that can handle this task? Thanks.
Making a bullet-proof script to do this would be a pain. However, doing it for just your data is probably easy.
If you are starting with this:
INSERT INTO tblTable VALUES (1,4,"String"),(2,4,"String"),(3,4,"String");
You can do a search and replace, using your favorite editor:
Find: ),(
Replace with: );\nINSERT INTO tblTable VALUES (
(Matching the closing parenthesis as well makes it less likely to hit a stray ,( that happens to appear inside your string data.)
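With the example above, that replacement leaves you with one simple insert per row:
INSERT INTO tblTable VALUES (1,4,"String");
INSERT INTO tblTable VALUES (2,4,"String");
INSERT INTO tblTable VALUES (3,4,"String");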
Now, glance through and make sure it looks correct and let it rip.
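That said, most editors will struggle to open a 685MB file, so here is a minimal sketch of the same replacement as a streaming Python script. The filenames dump.sql and dump-simple.sql are placeholders for your own, as is the table name, and the split on ),( is the same naive one as the editor approach, so a literal ),( inside string data would still break a row:

TABLE = b"tblTable"  # placeholder: substitute your table's name
PREFIX = b"INSERT INTO " + TABLE + b" VALUES ("

with open("dump.sql", "rb") as src, open("dump-simple.sql", "wb") as dst:
    for line in src:
        # mysqldump emits each extended INSERT as one very long line, so
        # this holds the whole statement in memory while splitting it.
        if line.startswith(PREFIX):
            line = line.replace(b"),(", b");\n" + PREFIX)
        dst.write(line)

Binary mode sidesteps any character-set mangling of the dump. Spot-check the head and tail of the output before loading it.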