I have a SQLite3 database that I populate with a large set of data, using a migration.
Three tables will have the following record counts:

- Table_1 will have about 10 records;
- each record of Table_1 will be associated with ~100 records in Table_2;
- each record of Table_2 will be associated with ~2,000 records in Table_3.
The total count of records will therefore be about 10 * 100 * 2000 = 2,000,000.
This takes a long time. Even if I populate my database with only about 20,000 records, it takes about 10 minutes.
I have also noticed that during the migration the Ruby interpreter uses only about 5% of CPU time; the other 95% sits idle.
What is the reason for such poor performance?
Quite simply, inserting large numbers of records by manually saving AR objects one at a time is going to take ages.
The best compromise between speed and "cleanness" (i.e. not a complete dodgy hack) for inserting large amounts of data is the ar-extensions (http://github.com/zdennis/ar-extensions) import method. It's not ideal, but it's better than any of the alternatives I could find, and the syntax is clean and doesn't require you to drop down to raw SQL (or anywhere close).
Example syntax:
items = []
1.upto(200) do |n|
  items << Item.new(:some_field => n)
end
Item.import items, :validate => false
At least in MySQL this will batch the records into a single INSERT statement with multiple sets of values. Pretty damn fast.
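For illustration, the batched statement looks roughly like this (a sketch only; the items table and some_field column come from the hypothetical example above, and exact quoting depends on the adapter):

INSERT INTO items (some_field) VALUES (1), (2), (3), ..., (200);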
If you run each INSERT statement in its own transaction, SQLite can be very, very slow. But if you run them all in one transaction (or a logical set of transactions), it can be very fast.
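A minimal sketch of the single-transaction approach, reusing the hypothetical Item model from the example above; SQLite then commits (and syncs to disk) once instead of once per INSERT:

# Wrap all inserts in one transaction so SQLite performs a single commit.
ActiveRecord::Base.transaction do
  1.upto(20_000) do |n|
    Item.create!(:some_field => n)
  end
end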
Seed_fu could help, as discussed in this question