import bulk data into MySQL

So I'm trying to import some sales data into my MySQL database. The data is originally in the form of a raw CSV file, which my PHP application needs to first process, then save the processed sales data to the database.

Initially I was doing individual INSERT queries, which I realized was incredibly inefficient (~6000 queries taking almost 2 minutes). I then generated a single large query and INSERTed the data all at once. That gave us a 3400% increase in efficiency, and reduced the query time to just over 3 seconds.
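
For illustration, a single multi-row INSERT of that sort might look like the sketch below. PDO and the sales table/column names here are hypothetical stand-ins, since the actual schema isn't shown.

    <?php
    // Sketch: one multi-row INSERT instead of ~6000 single-row queries.
    // Table and column names (sales, sold_at, sku, qty, price) are hypothetical.
    $pdo = new PDO('mysql:host=localhost;dbname=intranet;charset=utf8mb4', 'user', 'pass');

    $rows = [ // processed sales data from the CSV step
        ['2023-01-03', 'SKU-1', 2,  9.99],
        ['2023-01-03', 'SKU-2', 1, 19.50],
    ];

    // One "(?, ?, ?, ?)" group per row, bound as a single flat parameter list.
    $groups = implode(', ', array_fill(0, count($rows), '(?, ?, ?, ?)'));
    $stmt = $pdo->prepare("INSERT INTO sales (sold_at, sku, qty, price) VALUES $groups");
    $stmt->execute(array_merge(...$rows));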

But as I understand it, LOAD DATA INFILE is supposed to be even quicker than any sort of INSERT query. So now I'm thinking about writing the processed data to a text file and using LOAD DATA INFILE to import it into the database. Is this the optimal way to insert large amounts of data into a database? Or am I going about this entirely the wrong way?

I know a few thousand rows of mostly numeric data isn't a lot in the grand scheme of things, but I'm trying to make this intranet application as quick/responsive as possible. And I also want to make sure that this process scales up in case we decide to license the program to other companies.

UPDATE:

So I did go ahead and test LOAD DATA INFILE as suggested, thinking it might give me only marginal speed increases (since I was now writing the same data to disk twice), but I was surprised when it cut the query time from over 3300ms down to ~240ms. The page still takes ~1500ms to execute in total, but it's noticeably better than before.

From here I guess I'll check whether I have any superfluous indexes in the database, and, since all but two of my tables are InnoDB, look into tuning the InnoDB buffer pool to improve overall performance.
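
One low-effort way to start that index audit is simply to list every index and its columns from information_schema and eyeball the result for duplicates. A sketch (connection details are placeholders):

    <?php
    // Sketch: list every index in the current schema, one line per index,
    // so duplicate or suspicious ones can be reviewed by hand.
    $pdo = new PDO('mysql:host=localhost;dbname=intranet;charset=utf8mb4', 'user', 'pass');

    $sql = "SELECT TABLE_NAME, INDEX_NAME,
                   GROUP_CONCAT(COLUMN_NAME ORDER BY SEQ_IN_INDEX) AS cols
            FROM information_schema.STATISTICS
            WHERE TABLE_SCHEMA = DATABASE()
            GROUP BY TABLE_NAME, INDEX_NAME";

    foreach ($pdo->query($sql) as $idx) {
        echo "{$idx['TABLE_NAME']}.{$idx['INDEX_NAME']}: {$idx['cols']}\n";
    }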


LOAD DATA INFILE is very fast, and is the right way to import text files into MySQL. It is one of the recommended methods for speeding up data insertion, up to 20 times faster than using INSERT statements, according to this:

https://dev.mysql.com/doc/refman/8.0/en/insert-optimization.html

Assuming that writing the processed data out to a text file and then loading it is faster than inserting it directly, this is a good way to go.
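
From PHP, that write-then-load flow might look roughly like the sketch below. The sales table/columns are hypothetical, and note that LOAD DATA LOCAL INFILE has to be enabled on both the server and the client connection:

    <?php
    // Sketch: write the processed rows to a temp CSV, then bulk-load the file.
    // Table/columns are hypothetical; LOCAL INFILE must be enabled on the
    // server (local_infile=1) as well as on this client connection.
    $pdo = new PDO(
        'mysql:host=localhost;dbname=intranet;charset=utf8mb4',
        'user', 'pass',
        [PDO::MYSQL_ATTR_LOCAL_INFILE => true]
    );

    $rows = [ // processed sales data
        ['2023-01-03', 'SKU-1', 2, 9.99],
    ];

    $tmp = tempnam(sys_get_temp_dir(), 'sales');
    $fh  = fopen($tmp, 'w');
    foreach ($rows as $row) {
        fputcsv($fh, $row); // fputcsv terminates lines with "\n"
    }
    fclose($fh);

    $pdo->exec("LOAD DATA LOCAL INFILE " . $pdo->quote($tmp) . "
                INTO TABLE sales
                FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
                LINES TERMINATED BY '\\n'
                (sold_at, sku, qty, price)");
    unlink($tmp);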


LOAD DATA or multi-row INSERTs are going to be much better than single-row inserts; beyond that, LOAD DATA saves you a little extra that you probably don't care about that much.

In any case, do quite a lot, but not too much, in one transaction; 10,000 rows per transaction generally feels about right (NB: this is not relevant to non-transactional engines). If your transactions are too small, the server will spend all its time syncing the log to disk.
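
In PHP, that chunking could look like the sketch below (PDO and the hypothetical table/columns as before; 10,000 is the rule of thumb above, not a magic number):

    <?php
    // Sketch: commit in batches of ~10,000 rows so InnoDB isn't flushing
    // the log after every single-row insert.
    $pdo = new PDO('mysql:host=localhost;dbname=intranet;charset=utf8mb4', 'user', 'pass');
    $stmt = $pdo->prepare(
        'INSERT INTO sales (sold_at, sku, qty, price) VALUES (?, ?, ?, ?)'
    );

    $rows = [['2023-01-03', 'SKU-1', 2, 9.99]]; // processed sales data

    foreach (array_chunk($rows, 10000) as $chunk) {
        $pdo->beginTransaction();
        foreach ($chunk as $row) {
            $stmt->execute($row);
        }
        $pdo->commit();
    }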

Most of the time spent on a big insert is going to go into building indexes, which is an expensive and memory-intensive operation.

If you need performance,

  • Have as few indexes as possible
  • Make sure the table and all its indexes fit in your InnoDB buffer pool (assuming InnoDB here); a quick way to check this is sketched after the list
  • Just add more RAM until your table fits in memory, unless that becomes prohibitively expensive (64G is not too expensive nowadays)
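
For the second point, one quick sanity check is to compare the total InnoDB data+index size against the configured buffer pool. A sketch (connection details are placeholders):

    <?php
    // Sketch: compare total InnoDB data+index size with the buffer pool size.
    $pdo = new PDO('mysql:host=localhost;dbname=intranet;charset=utf8mb4', 'user', 'pass');

    $size = $pdo->query(
        "SELECT SUM(DATA_LENGTH + INDEX_LENGTH)
         FROM information_schema.TABLES
         WHERE ENGINE = 'InnoDB'"
    )->fetchColumn();

    $pool = $pdo->query('SELECT @@innodb_buffer_pool_size')->fetchColumn();

    printf("InnoDB data+indexes: %.2f GB, buffer pool: %.2f GB\n",
           $size / 2 ** 30, $pool / 2 ** 30);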

If you must use MyISAM, there are a few dirty tricks there to make it better which I won't discuss further.


Guys, I had the same question; my needs might have been a little more specific than general, but I have written a post about my findings here:

http://www.mediabandit.co.uk/blog/215_mysql-bulk-insert-vs-load-data

For my needs, LOAD DATA was fast, but the need to save to a flat file on the fly meant the average load times took longer than a bulk insert. Moreover, I wasn't required to do more than, say, 200 queries; where before I was doing them one at a time, I'm now bulking them up, and the time savings are in the region of seconds.

Anyway, hopefully this will help you.


You should be fine with your approach. I'm not sure how much faster LOAD DATA INFILE is compared to bulk INSERT, but I've heard the same thing, that it's supposed to be faster.

Of course, you'll want to do some benchmarks to be sure, but I'd say it's worth writing some test code.
