How to execute big SQL files on SQL Server?


I have about 50 T-SQL files; some of them are 30 MB, but some are 700 MB. I thought about executing them manually, but whenever a file is larger than about 10 MB, SQL Server Management Studio throws an out-of-memory exception.

Any ideas?


You can try the sqlcmd command-line tool, which may have different memory limits: http://msdn.microsoft.com/en-us/library/ms162773.aspx

Usage example:

sqlcmd -U userName -P myPassword -S MYPCNAME\SQLEXPRESS -i myCommands.tsql
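
If you connect with Windows authentication instead, -E replaces -U/-P, and -d selects the target database (both are standard sqlcmd switches); the database name here is a hypothetical placeholder:

sqlcmd -E -S MYPCNAME\SQLEXPRESS -d myDatabase -i myCommands.tsql

Unlike SSMS, sqlcmd streams the input file and submits it batch by batch rather than loading the whole script into an editor buffer, which is why it typically copes with files SSMS cannot even open.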


If you have that much data, wouldn't it be a lot easier and smarter to keep it in e.g. a CSV file and then bulk import it into SQL Server?

Check out the BULK INSERT command - it lets you load large data volumes into SQL Server quickly and efficiently - much better than such a huge SQL file!

The command looks something like:

BULK INSERT dbo.YourTableName
   FROM 'yourfilename.csv'
   WITH ( FIELDTERMINATOR = ';',
          ROWTERMINATOR = ' |\n' )

or whatever format you might have to import.
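
One caveat worth adding (not part of the original answer): the file path in BULK INSERT is resolved on the SQL Server machine, not on your client, so 'yourfilename.csv' must be a local path or UNC share that the server's service account can read.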


Maybe this is too obvious, but... did you consider writing a small program that loops through the files and calls SqlCommand.ExecuteNonQuery() for each batch? It's almost trivial - see the sketch after the list below.

Less obvious advantages:

  • You can monitor the progress of the feed (which is going to take some time)
  • You can throttle it (in case you don't want to swamp the server)
  • You can add a little error handling in case there are problems in the input files
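
Here is a minimal sketch of that approach in C# (not from the original answer; it assumes the System.Data.SqlClient provider, a made-up connection string, and scripts that use GO as the batch separator, so adjust it to your environment):

using System;
using System.Data.SqlClient;
using System.IO;
using System.Text;

class ScriptRunner
{
    static void Main(string[] args)
    {
        // Hypothetical connection string - adjust server, database and credentials.
        var connectionString = @"Server=MYPCNAME\SQLEXPRESS;Database=myDatabase;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            foreach (var file in args)   // pass the .sql file paths as command-line arguments
            {
                var batch = new StringBuilder();

                // File.ReadLines streams the file line by line,
                // so even a 700 MB script never has to fit in memory at once.
                foreach (var line in File.ReadLines(file))
                {
                    // GO is a client-side batch separator (SSMS/sqlcmd), not T-SQL,
                    // so each chunk between GOs must be sent as its own command.
                    if (line.Trim().Equals("GO", StringComparison.OrdinalIgnoreCase))
                    {
                        Execute(connection, batch.ToString());
                        batch.Clear();
                    }
                    else
                    {
                        batch.AppendLine(line);
                    }
                }
                Execute(connection, batch.ToString());   // trailing batch after the last GO

                Console.WriteLine("Done: " + file);      // cheap progress monitoring
            }
        }
    }

    static void Execute(SqlConnection connection, string sql)
    {
        if (string.IsNullOrWhiteSpace(sql)) return;

        using (var command = new SqlCommand(sql, connection))
        {
            command.CommandTimeout = 0;   // big batches can legitimately run for a long time
            command.ExecuteNonQuery();
        }
    }
}

The GO splitting is the important part: sending a whole script containing GO through a single ExecuteNonQuery() call would fail, since GO is interpreted by client tools rather than by the server. (The sketch deliberately ignores the rarer "GO n" repeat syntax.)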