Flat File data to Multiple tables USING SSIS

https://www.devze.com 2023-02-27 07:23 Source: web

This is my first time planning to implement a task in SSIS.

I have about 4.5 GB of data that I would like to load into SQL Server 2008. Each row of the raw data is a combination of data for 4 to 6 tables, and there is no primary key in the raw data.

One row contains information for multiple tables.

I now need to split that data into the respective tables.

My data looks like this:

row1: col1, col2, col3, ..., col125

Now I have to insert some of the columns into a master table. After inserting a record into the master table, I need to get the last inserted row id, and using that id I have to insert other columns of the raw data (col5 to col20, and so on) into the other tables, like this:

If the last inserted row id is 5:

table 1: 5, col2, col3, col4, col5 — and then insert into another table

table 3: 5, col12, col32, col45, col55

table 4: 5, col72, col82, col95, col105

and so on, starting from the first row.

Can anyone suggest how to implement this task? Please see the attached file (http://www.bidn.com/Assets/Uploaded-CMS-Files/fc8b892d-8652-4f0e-bdc6-56e297149315Table Extract.pdf)


I would propose an SSIS package with a Data Flow that merely uploads the source file into a staging table mirroring the file. At that point, my package would then call an Execute SQL Task which would fire a very large stored procedure to do all the master/child work you will need to do.
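As a rough illustration, the staging table could look like the sketch below. The table name, the surrogate key, and the column types are all assumptions — use the real layout of your 125-column file:

```sql
-- Hedged sketch: a wide staging table mirroring the flat file.
-- Column names/types are assumptions; only col1..col125 come from the post.
CREATE TABLE dbo.Staging
(
    StagingId INT IDENTITY(1,1) PRIMARY KEY,  -- surrogate key, useful later for mapping
    col1   VARCHAR(255) NULL,
    col2   VARCHAR(255) NULL,
    -- ... col3 through col124 ...
    col125 VARCHAR(255) NULL
);
```

Loading everything as wide VARCHARs first keeps the Data Flow simple; type conversion can happen in the stored procedure.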

If performance is not terribly important, within the stored procedure, create a cursor to read each row, by agonizing row, and perform your insert into table 1. Capture the scope_identity() value (5 in your example) and use that and the remaining columns to insert into tables 2-6. Lather, rinse, repeat.
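A minimal sketch of that cursor pattern, assuming hypothetical staging/target table and column names (only a few of the 125 columns are shown):

```sql
-- Hedged sketch: row-by-row inserts capturing SCOPE_IDENTITY().
-- dbo.Staging, dbo.Table1, dbo.Table3 and their columns are assumptions.
DECLARE @col2 VARCHAR(255), @col3 VARCHAR(255),
        @col12 VARCHAR(255), @newId INT;

DECLARE staging_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT col2, col3, col12
    FROM dbo.Staging;

OPEN staging_cur;
FETCH NEXT FROM staging_cur INTO @col2, @col3, @col12;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Insert the master row and capture its identity value
    INSERT INTO dbo.Table1 (col2, col3)
    VALUES (@col2, @col3);

    SET @newId = SCOPE_IDENTITY();

    -- Use the captured id as the foreign key for each child table
    INSERT INTO dbo.Table3 (Table1Id, col12)
    VALUES (@newId, @col12);

    FETCH NEXT FROM staging_cur INTO @col2, @col3, @col12;
END

CLOSE staging_cur;
DEALLOCATE staging_cur;
```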

A better performance approach would be to perform set based operations on that data. That pattern will require more skill to pull off but is entirely feasible. Instead of a cursor, insert all of the rows into table 1. To capture all of the inserted identity values, you will want to use the OUTPUT clause into something like a table variable. At that point, you have a mapping between input rows (row 1 = id 5, row 2 = id 6, etc) and can perform the same inserts in the cursor based approach except you'd use columns instead of variables.
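One way to sketch the set-based version: a plain `INSERT ... OUTPUT` cannot reference source columns, so a common workaround on SQL Server 2008 is `MERGE` with a never-matching join, whose `OUTPUT` clause *can* see the source. Table and column names below are assumptions:

```sql
-- Hedged sketch: set-based load that captures the identity mapping.
-- Assumes dbo.Staging has a StagingId surrogate key.
DECLARE @map TABLE (StagingId INT, Table1Id INT);

MERGE dbo.Table1 AS tgt
USING dbo.Staging AS src
    ON 1 = 0                         -- never matches: every source row inserts
WHEN NOT MATCHED THEN
    INSERT (col2, col3)
    VALUES (src.col2, src.col3)
OUTPUT src.StagingId, inserted.Table1Id INTO @map (StagingId, Table1Id);

-- Child tables join the staging data back through the mapping
INSERT INTO dbo.Table3 (Table1Id, col12)
SELECT m.Table1Id, s.col12
FROM dbo.Staging AS s
JOIN @map AS m ON m.StagingId = s.StagingId;
```

The same join pattern repeats for tables 4 through 6, each picking its own slice of the staging columns.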

If you absolutely must use SSIS the entire way, you have my condolences. The cursor pattern could be applied there with similar performance results. Instead of loading to a staging table, load to an SSIS variable of type Object in your data flow. Create 126 (or however many) variables in the package and then shred the recordset.

