If I use a DataContext database to store a large number of records, I find that performance becomes very slow as the data grows.
The table has about 6000 records.
If I insert one record and call SubmitChanges, the SubmitChanges call takes about 1.x seconds.
Is there any way to improve the performance, or is this a limitation?
Thanks.
I didn't test it myself, but try not to call SubmitChanges( )
after each insert.
Perform all 6000 inserts and then call SubmitChanges( )
just once. The DataContext
should be aware of all the changes you have made.
var _db = new YourDbContext( );

for( int i = 0; i < 6000; i++ )
{
    FoobarObj newObject = new FoobarObj( )
    {
        Name = "xyz_" + i.ToString( )
    };

    _db.FoobarObjects.InsertOnSubmit( newObject );
}

_db.SubmitChanges( );
One second sounds far too long, much longer than I have measured. In a trivial test I just did:
using(var dc = new MyDc(@"isostore:/stuff.sdf"))
{
    if(!dc.DatabaseExists())
        dc.CreateDatabase();

    dc.Data.InsertAllOnSubmit(
        Enumerable.Range(0, 6000).Select(i => new MyData { Data = "Hello World" }));

    dc.SubmitChanges();
}
I can insert at a rate of 1 to 2 items per millisecond in batches of 6000, and this remained fairly steady as the data size continued to grow. If I change it to smaller batches (say, 5 items), it drops to about 10 ms per item, since there is quite a bit of overhead involved in initializing the DataContext, and that overhead dominates the execution time.
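If tracking all 6000 pending inserts in a single change set ever becomes a memory concern, a middle ground is to submit in reasonably large chunks so each SubmitChanges( ) call still amortizes its overhead over many rows. A sketch, reusing the hypothetical MyDc and MyData types from the snippet above:

```csharp
// Sketch only: MyDc, MyData, and the isostore path are assumed
// from the example above, not part of any real API.
const int total = 6000;
const int chunkSize = 1000;

using (var dc = new MyDc(@"isostore:/stuff.sdf"))
{
    if (!dc.DatabaseExists())
        dc.CreateDatabase();

    for (int start = 0; start < total; start += chunkSize)
    {
        // Queue one chunk of inserts, then flush them in a single call.
        dc.Data.InsertAllOnSubmit(
            Enumerable.Range(start, chunkSize)
                      .Select(i => new MyData { Data = "Hello World" }));
        dc.SubmitChanges(); // one submission per chunk, not per row
    }
}
```

The chunk size of 1000 is arbitrary; the point is only that the per-call cost is paid 6 times instead of 6000.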
So there must be something else going on. Can you give some details about what you are inserting? Maybe a code sample which demonstrates the problem end to end?