I wrote an importer in VB.NET that reads data from a SQL Server database and inserts it, over an ADSL connection, into a remote MySQL server.
At first there were only about 200 records, but now there are more than 500,000 and the export takes around 11 hours, which is far too slow.
I need to optimize my importer. It currently loads the data into a DataTable, and then a function loops row by row, inserting each record with an "INSERT INTO" query, like this:
For Each dr As DataRow In dt.Rows
    Console.Write(".")
    Dim sql As String = "INSERT INTO clientes(id,nombrefis,nombrecom,direccion,codpos,municipio_id,telefono,fax,cif) " & _
                        "VALUES (@id,@nombrefis,@nombrecom,@direccion,@codpos,@municipio_id,@telefono,@fax,@cif)"
    cmd = New MySqlCommand(sql, cnn)
    cmd.Parameters.AddWithValue("id", Int32.Parse(dr("ID EMPRESA").ToString))
    cmd.Parameters.AddWithValue("nombrefis", dr("NOMEMP"))
    cmd.Parameters.AddWithValue("nombrecom", dr("EMPRESA"))
    cmd.Parameters.AddWithValue("direccion", dr("DIRECC"))
    cmd.Parameters.AddWithValue("codpos", dr("CODPOS"))
    cmd.Parameters.AddWithValue("municipio_id", Int32.Parse(dr("CODIGO MUNICIPIO").ToString))
    cmd.Parameters.AddWithValue("telefono", dr("TELEF"))
    cmd.Parameters.AddWithValue("fax", dr("FAX"))
    cmd.Parameters.AddWithValue("cif", dr("CIF"))
    cmd.ExecuteNonQuery()
Next
Any ideas or advice? Thanks so much.
- Create the command, add the parameters, and prepare the command (the `Prepare` method) before the loop. Then, inside the loop, only set the parameter values and execute the command.
- Execute the commands inside transactions, in batches of about 1000 commands each.

EDIT: I just realized you're using MySQL... so unless your table uses the InnoDB storage engine, you can't use transactions. You can still apply my first advice, though, so that the command is only prepared once.
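Both suggestions together might look roughly like the sketch below. This assumes `cnn` is an already-open `MySqlConnection` and the target table is InnoDB; the column types passed to `Parameters.Add` are guesses and should match your actual schema.

```vbnet
' Prepare the statement once, then reuse it; commit in batches of 1000.
Dim sql As String = "INSERT INTO clientes(id,nombrefis,nombrecom,direccion,codpos,municipio_id,telefono,fax,cif) " & _
                    "VALUES (@id,@nombrefis,@nombrecom,@direccion,@codpos,@municipio_id,@telefono,@fax,@cif)"
Dim cmd As New MySqlCommand(sql, cnn)

' Declare each parameter once, with an explicit type, so Prepare() can work.
cmd.Parameters.Add("@id", MySqlDbType.Int32)
cmd.Parameters.Add("@nombrefis", MySqlDbType.VarChar)
cmd.Parameters.Add("@nombrecom", MySqlDbType.VarChar)
cmd.Parameters.Add("@direccion", MySqlDbType.VarChar)
cmd.Parameters.Add("@codpos", MySqlDbType.VarChar)
cmd.Parameters.Add("@municipio_id", MySqlDbType.Int32)
cmd.Parameters.Add("@telefono", MySqlDbType.VarChar)
cmd.Parameters.Add("@fax", MySqlDbType.VarChar)
cmd.Parameters.Add("@cif", MySqlDbType.VarChar)
cmd.Prepare()   ' the server parses the statement only once

Dim tx As MySqlTransaction = cnn.BeginTransaction()
cmd.Transaction = tx
Dim count As Integer = 0

For Each dr As DataRow In dt.Rows
    ' Only the values change on each iteration.
    cmd.Parameters("@id").Value = Int32.Parse(dr("ID EMPRESA").ToString())
    cmd.Parameters("@nombrefis").Value = dr("NOMEMP")
    cmd.Parameters("@nombrecom").Value = dr("EMPRESA")
    cmd.Parameters("@direccion").Value = dr("DIRECC")
    cmd.Parameters("@codpos").Value = dr("CODPOS")
    cmd.Parameters("@municipio_id").Value = Int32.Parse(dr("CODIGO MUNICIPIO").ToString())
    cmd.Parameters("@telefono").Value = dr("TELEF")
    cmd.Parameters("@fax").Value = dr("FAX")
    cmd.Parameters("@cif").Value = dr("CIF")
    cmd.ExecuteNonQuery()

    count += 1
    If count Mod 1000 = 0 Then
        ' Commit the current batch and start a new transaction.
        tx.Commit()
        tx = cnn.BeginTransaction()
        cmd.Transaction = tx
    End If
Next

tx.Commit()   ' commit whatever is left in the final partial batch
```

Batching the commits this way avoids one fsync per row on the server, which over a slow link is usually the dominant cost.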