I'm creating a PHP script that imports some data from text files into a MySQL database. These text files are pretty large; an average file will have 10,000 lines in it, each of which corresponds to a new item I want in my database. (I won't be importing files very often.)
I'm worried that reading a line from the file and then doing an INSERT query, 10,000 times in a row, might cause some issues. Is there a better way for me to do this? Should I perform one INSERT query with all 10,000 values? Or would that be just as bad?
Maybe I can find a happy medium and insert something like 10 or 100 entries at once. Really, my problem is that I don't know what good practice is. Maybe 10,000 queries in a row is fine and I'm just worrying for nothing.
Any suggestions?
Yes, one single INSERT with all the values is the better way:
<?php
// Read the file into an array of lines, dropping trailing newlines
// (without FILE_IGNORE_NEW_LINES, each value would keep its "\n")
$lines = file('file.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$count = count($lines);
$i = 0;
$query = "INSERT INTO table VALUES ";
foreach ($lines as $line) {
    $i++;
    // Escape each value (requires an open MySQL connection),
    // separating the rows with commas
    $query .= "('" . mysql_real_escape_string($line) . "')";
    if ($i < $count) {
        $query .= ",";
    }
}
echo $query;
http://sandbox.phpcode.eu/g/5ade4.php
This builds one single query, which is many times faster than the one-row-per-query style!
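If a single 10,000-row statement runs into MySQL's max_allowed_packet limit, the same idea works in batches, which is the 10-or-100-at-a-time middle ground the question asks about. A rough sketch, assuming an open MySQL connection and the same one-column layout as above:

<?php
// Rough sketch: insert in batches of 100 rows to keep each
// statement safely below MySQL's max_allowed_packet size.
$lines = file('file.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach (array_chunk($lines, 100) as $chunk) {
    $values = array();
    foreach ($chunk as $line) {
        $values[] = "('" . mysql_real_escape_string($line) . "')";
    }
    mysql_query("INSERT INTO table VALUES " . implode(",", $values));
}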
Use prepared statements, as suggested by the authors of High Performance MySQL. They save a lot of time by avoiding repeated SQL parsing and by not resending the full query text over the wire for every row.
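A minimal sketch of that with PDO; the DSN, credentials, and the items table with a single name column are placeholder assumptions, not from the original post:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'password');
// Use real server-side prepares instead of PDO's client-side emulation
$pdo->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
$pdo->beginTransaction();

// The statement is parsed by the server once; only the bound
// value travels over the wire on each execute()
$stmt = $pdo->prepare("INSERT INTO items (name) VALUES (?)");
foreach (file('file.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    $stmt->execute(array($line));
}

$pdo->commit();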
I would do it in one large query with all the values at once. Just to be sure, though, run START TRANSACTION; before and COMMIT; afterwards, so that if something goes wrong during the execution of the query (which is possible, since it will most likely run for a fairly long time), the database will not be affected.
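As a sketch, that pattern in PHP, assuming $query was built as in the earlier snippet and $pdo is an open PDO connection (both assumptions, not shown in the answer):

<?php
try {
    $pdo->beginTransaction();   // START TRANSACTION
    $pdo->exec($query);
    $pdo->commit();             // COMMIT
} catch (Exception $e) {
    $pdo->rollBack();           // undo the partial work on failure
    throw $e;
}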