
Limiting Number of Rows Inserted into a SQL Server Database


I have a C# program in Visual Studio that runs a main form.

That main form exports data into SQL Server tables through stored procedures. The export involves a lot of data (600,000+ rows).

I have a problem, though. On my main form I need a "database write-out interval": a user-specified count of how many rows will be imported into the database.

My problem is how to implement that interval. The main form runs, and after the main program finishes, SQL Server keeps taking in data for another 5-10 minutes.

Therefore, if I close the main form, the rest of the data will not be imported.

Do you professional programmers out there know a way I can somehow tell SQL Server to import data only for a user-specified interval? This has to be done from my C# class.

I don't know where to begin.

I don't think a timer would be a good idea, because different computers and CPUs perform differently. Any advice would be appreciated.


If the data has a fixed format (i.e., the same columns for every row, and it isn't going to change much), you should look at BULK INSERT. It's incredibly fast at inserting large numbers of rows.

The basic idea is that you write your data out to a text file (e.g., CSV, though you can specify whatever delimiter you want), then execute a BULK INSERT command against the server; one of its arguments is the path to the file you wrote. It's a bit of a pain to use because the file has to live in a folder on the server (or at a UNC path the server can access), which means configuring Windows shares or setting up FTP on the server. It sounds like exactly what you want to use, though.

Here's the MSDN documentation on BULK INSERT:
http://msdn.microsoft.com/en-us/library/ms188365.aspx
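
For illustration, here is a minimal sketch of issuing BULK INSERT from C# with a plain SqlCommand. The connection string, target table (dbo.ImportTarget), and file path are placeholders for your own setup; note the path is resolved by the SQL Server service, not by your client machine:

using System.Data.SqlClient;

class BulkInsertRunner
{
    static void Main()
    {
        // Placeholder connection string; adjust to your server and database.
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true;";

        // NOTE: this path is read by the SERVER, not the client machine.
        const string serverSidePath = @"C:\Imports\export.csv";

        string sql =
            @"BULK INSERT dbo.ImportTarget
              FROM '" + serverSidePath + @"'
              WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.CommandTimeout = 0; // a 600,000-row load can exceed the default 30s
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}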


Instead of exporting all of your data to SQL Server and then trying to abort or manage the load, a better process might be to split the load into smaller chunks (10,000 records or so) and check whether the user wants to continue after each one, as sketched below. This gives you far more flexibility and control over the load than dumping all 600,000 records into SQL Server and trying to manage the process afterwards.
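
A rough sketch of that control flow. The loadChunk and promptToContinue delegates are hypothetical hooks: plug in your actual insert call (stored proc, SqlBulkCopy, ...) and a UI check such as a dialog with a Cancel button:

using System;
using System.Collections.Generic;
using System.Linq;

class ChunkedLoader
{
    const int ChunkSize = 10000;

    public static void Load<T>(IList<T> records,
                               Action<IEnumerable<T>> loadChunk,
                               Func<int, bool> promptToContinue)
    {
        for (int offset = 0; offset < records.Count; offset += ChunkSize)
        {
            // Send one chunk of up to 10,000 records.
            loadChunk(records.Skip(offset).Take(ChunkSize));

            int loadedSoFar = Math.Min(offset + ChunkSize, records.Count);
            if (!promptToContinue(loadedSoFar))
                break; // the user chose to stop; the remaining rows are never sent
        }
    }
}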

Also, what Tim Coker mentioned is spot on. Even if your stored proc does some data manipulation, it is a lot faster to bulk insert the data and then run one query after the load to do whatever work you need than to push all 600,000 records through the stored proc one at a time.
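
As an illustration of that pattern: bulk-load the raw rows into a staging table, then do the per-row work of the stored procedure in one set-based statement. The table names (dbo.Staging, dbo.FinalTable) and the transformation are invented for the example:

using System.Data.SqlClient;

class PostLoadTransform
{
    // Run once after the bulk load, instead of once per row.
    public static void Run(string connectionString)
    {
        const string sql =
            @"INSERT INTO dbo.FinalTable (CustomerId, Amount)
              SELECT s.CustomerId, SUM(s.Amount)
              FROM dbo.Staging AS s
              GROUP BY s.CustomerId;

              TRUNCATE TABLE dbo.Staging;";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.CommandTimeout = 0;
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}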


Like all the comments before, I suggest you use bulk insert. You will be amazed at how fast it is on large datasets, and your interval concept may no longer be required: inserting 100k records can take only seconds.

Depending on how your code is written, ADO.NET has native support for bulk insert through SqlBulkCopy; see the introduction at http://www.knowdotnet.com/articles/bulkcopy_intro1.html
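
For reference, a minimal SqlBulkCopy sketch; the destination table and column names are assumptions for the example:

using System.Data;
using System.Data.SqlClient;

class BulkCopyExample
{
    // Writes a populated DataTable to the server in one bulk operation.
    public static void Write(DataTable rows, string connectionString)
    {
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.ImportTarget";
            bulk.BatchSize = 10000;   // commit in 10,000-row batches
            bulk.BulkCopyTimeout = 0; // no timeout for large loads
            bulk.ColumnMappings.Add("CustomerId", "CustomerId");
            bulk.ColumnMappings.Add("Amount", "Amount");
            bulk.WriteToServer(rows);
        }
    }
}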

If you have been using LINQ to SQL, there is already some clever code, written as an extension method on the DataContext, that transforms the LINQ change set into a DataTable and internally uses ADO.NET to achieve the bulk insert:

http://blogs.microsoft.co.il/blogs/aviwortzel/archive/2008/05/06/implementing-sqlbulkcopy-in-linq-to-sql.aspx
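
The linked post hooks into the LINQ to SQL change set; the core idea, stripped down, is to turn a list of entities into a DataTable via reflection and hand it to SqlBulkCopy. A simplified sketch of that idea (not the blog's actual code):

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Reflection;

static class BulkCopyExtensions
{
    // Builds a DataTable whose columns mirror T's public properties.
    public static DataTable ToDataTable<T>(this IEnumerable<T> items)
    {
        var table = new DataTable(typeof(T).Name);
        PropertyInfo[] props = typeof(T).GetProperties();

        foreach (var p in props)
            table.Columns.Add(p.Name,
                Nullable.GetUnderlyingType(p.PropertyType) ?? p.PropertyType);

        foreach (var item in items)
        {
            var row = table.NewRow();
            foreach (var p in props)
                row[p.Name] = p.GetValue(item, null) ?? DBNull.Value;
            table.Rows.Add(row);
        }
        return table;
    }

    // Bulk-inserts any entity list into the named table.
    public static void BulkInsert<T>(this IEnumerable<T> items,
                                     string connectionString, string tableName)
    {
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = tableName;
            bulk.WriteToServer(items.ToDataTable());
        }
    }
}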
