SqlBulkCopy is supposed to help with large volumes of inserts instead of sending individual INSERT statements. But what about calling stored procedures? I have a procedure that the data gets passed to; it then does some lookups in another table and possibly a second insert into that lookup table.
Since this can't be converted to a plain query, is there a way to use SqlBulkCopy for stored procedure calls, or would it not make any sense?
I'm only making 2000 or fewer calls at a time per DB connection, but I wanted to know if there was a more efficient way.
Based on @Kev's answer, you can do the bulk insert into a staging table and then have a trigger kick off your stored procedure.
I'm assuming you have an FK constraint to your other table, so you'll need the value before inserting into the target table. You might try removing the constraint if possible; do your bulk insert, then the trigger can just update the columns afterward.
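For what it's worth, here is a minimal sketch of the client side of that approach. The staging table name, its columns, and the connection string are assumptions for illustration, not names from the question; an AFTER INSERT trigger on the staging table would then invoke the stored procedure.

```csharp
// Sketch only: bulk copy the batch into a staging table.
// "dbo.OrdersStaging" and its columns are hypothetical.
using System.Data;
using System.Data.SqlClient;

string connectionString = "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI";

var staging = new DataTable();
staging.Columns.Add("CustomerCode", typeof(string));
staging.Columns.Add("Amount", typeof(decimal));
// ... fill staging with the ~2000 rows for this batch ...

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "dbo.OrdersStaging";
        bulkCopy.ColumnMappings.Add("CustomerCode", "CustomerCode");
        bulkCopy.ColumnMappings.Add("Amount", "Amount");
        bulkCopy.WriteToServer(staging); // an AFTER INSERT trigger on the staging
                                         // table can then call the stored procedure
    }
}
```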
The only way I can think of doing this would be to have a trigger on the destination table and set the SqlBulkCopyOptions.FireTriggers option.
From there you could call your stored procedure or put the stored procedure logic in that trigger.
If the table is appended to by other clients (e.g. a web app) then you'd need to have some way of differentiating between the bulk copy client and other regular apps. I guess you could differentiate by setting the Application Name value in the connection string and checking for it in the trigger using SELECT APP_NAME().
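As a rough sketch of that (the table name, the DataTable, and the "BulkLoader" application name are my own placeholders), the Application Name goes in the connection string and FireTriggers is passed to SqlBulkCopy so the destination table's trigger actually runs:

```csharp
// Sketch: fire triggers on the destination table during the bulk copy and
// tag the connection with an Application Name the trigger can check.
// "dbo.TargetTable" and "BulkLoader" are hypothetical names.
using System.Data;
using System.Data.SqlClient;

string connectionString =
    "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI;" +
    "Application Name=BulkLoader"; // visible server-side via SELECT APP_NAME()

var sourceTable = new DataTable();
// ... add columns and rows matching dbo.TargetTable ...

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.FireTriggers, null))
    {
        bulkCopy.DestinationTableName = "dbo.TargetTable";
        bulkCopy.WriteToServer(sourceTable);
    }
}
```

Inside the trigger you could then guard the stored procedure logic with a check like IF APP_NAME() = 'BulkLoader', so inserts coming from the regular apps skip it.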
If you want to run the stored proc on each row inserted, then you're kind of defeating the purpose of the bulk insert, which is to insert a lot of records quickly. I would insert into a staging table and then modify the stored proc to use set-based logic to operate over the whole set of data at once.
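A sketch of that pattern, assuming a hypothetical staging table dbo.StagingRows and a set-based procedure named dbo.ProcessStagedRows (neither comes from the question):

```csharp
// Sketch: bulk copy everything into a staging table, then run one
// set-based stored procedure over the whole batch instead of 2000 calls.
// Table, column, and procedure names are assumptions.
using System.Data;
using System.Data.SqlClient;

string connectionString = "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI";

var sourceTable = new DataTable();
// ... add columns and rows matching dbo.StagingRows ...

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "dbo.StagingRows";
        bulkCopy.WriteToServer(sourceTable);
    }

    // One round trip: the proc does the lookup/insert logic as set-based SQL,
    // e.g. INSERT ... SELECT joining the staging table to the lookup table.
    using (var command = new SqlCommand("dbo.ProcessStagedRows", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.ExecuteNonQuery();
    }
}
```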