Is there a way to force truncated columns to be inserted into a SQL Server table?

Data are given to me in a spreadsheet which I'm converting to pipe-delimited files. I tried inserting over 46,000 rows, but four (4) of them failed with the error message

SQLState = 22001, NativeError = 0 Error = [Microsoft][SQL Native Client]String data, right truncation

It would be nice to know which rows fail. Is there a way I can either force these rows to be inserted with the truncated strings, or determine which ones fail with BCP?

  1. I have tried constructing the insert statements in the XLSX file with CONCATENATE. That fails because I have too many parameters for CONCATENATE.
  2. I have tried comma- and tab-delimited files, which fail because some of the data have those characters.
  3. I finally constructed pipe-delimited files after changing the localization settings so Excel can save them that way (How to save an Excel spreadsheet as pipe-delimited), and then imported them into the table with bcp TableName in MyCreatedCSVFile.csv -T -c -t^| (I had to escape the | character on the command line with ^). The full command is spelled out below.
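For reference, a minimal sketch of that bcp invocation as described above (TableName and MyCreatedCSVFile.csv are placeholders): -T uses a trusted connection, -c loads character data, and -t sets the field terminator to |, with ^ escaping the pipe so cmd.exe does not interpret it.

bcp TableName in MyCreatedCSVFile.csv -T -c -t^|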

Is there a better way to do this?


Are you using a staging table?

You'd normally use one with generous column lengths, etc., and then perform validation and quality checks. Even something as simple as

INSERT FinalTable (..)
SELECT (..), LEFT(OffendingColumn, 100) FROM StagingTable

A stored proc can also be used to do a BULK INSERT, then some checks, then an INSERT into the final table, along the lines of the sketch below.
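A minimal sketch of that staging-table flow, assuming hypothetical table and column names (StagingTable, FinalTable, Id, OffendingColumn), a 100-character final column, and a local file path:

CREATE TABLE dbo.StagingTable
(
    Id              varchar(50),
    OffendingColumn varchar(max)   -- generous width so the load itself never truncates
);

BULK INSERT dbo.StagingTable
FROM 'C:\data\MyCreatedCSVFile.csv'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');

-- Quality check: find the rows that would be truncated by the final varchar(100) column.
SELECT * FROM dbo.StagingTable WHERE LEN(OffendingColumn) > 100;

-- Load the final table, truncating explicitly.
INSERT INTO dbo.FinalTable (Id, OffendingColumn)
SELECT Id, LEFT(OffendingColumn, 100)
FROM dbo.StagingTable;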


You can set the allowed number of errors high enough to cover the exceptions and then specify an error file; you then handle the exceptions one at a time. This is the cleanest method (see the sketch below). If you are willing to use SSIS, you can get more elaborate in your workflow and handle the truncation and perhaps logging, but you lose the benefit of bulk copying.
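With bcp, this corresponds to the -m (maximum errors allowed) and -e (error file) switches. A sketch, reusing the pipe-delimited file and escaped delimiter from the question; the error count of 10 and the file name truncation_errors.txt are arbitrary choices for illustration:

bcp TableName in MyCreatedCSVFile.csv -T -c -t^| -m 10 -e truncation_errors.txt

Rows that fail to load are written to the error file, so the handful of offending rows can be identified and then fixed or inserted by hand.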


I managed to do this by creating an "Instead of" trigger.

I created a table with varchar(max) columns to ensure I didn't get the truncation errors. I then created an INSTEAD OF trigger on the insert (so this 'proxy' table never gets rows).

In the trigger I do a convert(varchar(50), name255), i.e., doing the truncation explicitly.

However, this means that you need to specify the FIRE_TRIGGERS hint (-h "FIRE_TRIGGERS") with your bcp command, since bulk loads do not fire triggers by default.
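A minimal sketch of that proxy-table-plus-trigger arrangement, assuming hypothetical names (ProxyTable, FinalTable, name50) alongside the name255 column mentioned above:

CREATE TABLE dbo.ProxyTable (name255 varchar(max));
GO

CREATE TRIGGER dbo.trg_ProxyTable_Insert
ON dbo.ProxyTable
INSTEAD OF INSERT
AS
BEGIN
    -- Truncate explicitly while redirecting into the real table;
    -- ProxyTable itself never keeps any rows.
    INSERT INTO dbo.FinalTable (name50)
    SELECT CONVERT(varchar(50), name255)
    FROM inserted;
END;
GO

bcp is then pointed at the proxy table with -h "FIRE_TRIGGERS" so the trigger actually runs during the bulk load.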

Hope this helps.
