How to copy schema and some data from SQL Server to another instance?

My product uses a SQL Server database - each client has their own deployed instance on their own intranet. The db has about 200 tables. Most of them are configuration tables that have only a few rows, but there are a couple of transaction data tables which may have several million rows. Often I need to troubleshoot a customer's configuration problem, so I need a copy of their database to work with locally on my dev system. However, the database may be very large due to the transaction data, which makes it very difficult for the customer to send me a backup to work with. I need a way to backup/copy/export everything EXCEPT for the large tables.

Ideally, the large tables really belong in a separate database catalog, so that a customer could back up their "data" separately from their "configuration". However, I don't have the authority to make that type of change, so I'm looking for a good way to export or copy just the configuration parts without having to copy the whole thing.

Currently I have a quick and dirty .NET app which uses a SqlDataAdapter and DataSet to blindly select * from all tables except the ones I want to omit, and dumps that to an XML file (this is my poor-man's export feature). And I have a companion app I can use locally to load the XML back into a DataSet and then use SqlBulkCopy to load it into my dev database. I had to do some tricks to disable constraints and so forth, but I made it work. This gets me the data, but not the schema... ideally I'd like a way to get the schema along with it. It is a brute-force approach, and I believe there must be a simpler way. Any suggestions for a better approach?
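
Roughly, the export side of my quick-and-dirty app looks something like the sketch below (simplified; the connection string and the excluded table names are just placeholders):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class ConfigExporter
{
    static void Main()
    {
        // Hypothetical names: the large tables to skip and the source connection string.
        var excluded = new[] { "TransactionHeader", "TransactionDetail" };
        using (var conn = new SqlConnection("Server=.;Database=ClientDb;Integrated Security=true"))
        {
            conn.Open();
            var ds = new DataSet("ClientConfig");

            // Enumerate the user tables, skipping the large ones.
            var tables = new DataTable();
            new SqlDataAdapter(
                "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'",
                conn).Fill(tables);

            foreach (DataRow row in tables.Rows)
            {
                var name = (string)row["TABLE_NAME"];
                if (Array.IndexOf(excluded, name) >= 0) continue;

                // Blind "select *" into a DataTable named after the source table.
                var sql = string.Format("SELECT * FROM [{0}].[{1}]", row["TABLE_SCHEMA"], name);
                new SqlDataAdapter(sql, conn).Fill(ds, name);
            }

            // WriteSchema keeps column names/types so the import side can rebuild the tables.
            ds.WriteXml("client-config.xml", XmlWriteMode.WriteSchema);
        }
    }
}
```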

The solution needs to be automated since I'm dealing with end-users who generally have very little IT support.

(SQL 2005 and higher)


Could you generate the scripts from SSMS directly?

  1. Right click the database
  2. Select Tasks -> Generate Scripts
  3. (Click next if you get the intro screen)
  4. Select "Select specific database objects"
  5. Pick the objects to generate scripts for (tables, stored procedures, etc...)
  6. Click Next, then specify the output filename
  7. Click Finish to generate the script

This will generate the schema only. If you want the scripts to include data as well, at step 6 click the Advanced button, scroll down to "Types of data to script", and change it from "Schema only" to "Data only" or "Schema and data".

In your situation, you could do "Schema and data" for all of your small config tables, then do "Schema only" for the large tables where you don't want to export the data.

I know this isn't totally automated; however, it's pretty close. If you want to look at automating this even further, check out this thread (not sure if it works on SQL 2005):

How can I automate the "generate scripts" task in SQL Server Management Studio 2008?
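
For example, the same schema-and-data vs. schema-only split can be driven from code with SMO instead of the SSMS wizard. A rough sketch under a few assumptions - the database name and the list of large tables are placeholders, and ScriptingOptions.ScriptData may require the SMO version that ships with SQL Server 2008 or later:

```csharp
using System.Collections.Generic;
using System.IO;
using Microsoft.SqlServer.Management.Smo;

class GenerateScripts
{
    static void Main()
    {
        var server = new Server(".");                           // local default instance
        var db = server.Databases["ClientDb"];                  // hypothetical database name
        var largeTables = new HashSet<string> { "TransactionHeader", "TransactionDetail" };

        using (var writer = new StreamWriter("export.sql"))
        {
            foreach (Table table in db.Tables)
            {
                if (table.IsSystemObject) continue;

                var options = new ScriptingOptions
                {
                    ScriptSchema = true,
                    // "Schema and data" for the small config tables, "Schema only" for the big ones.
                    ScriptData = !largeTables.Contains(table.Name),
                    DriAll = true,                               // keys, constraints, defaults
                    Indexes = true
                };
                var scripter = new Scripter(server) { Options = options };

                // EnumScript (not Script) must be used when ScriptData is true.
                foreach (var line in scripter.EnumScript(new SqlSmoObject[] { table }))
                    writer.WriteLine(line);
            }
        }
    }
}
```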


Just a couple comments since we had similar problem in the past.

The Generate Scripts option in SSMS is great for small databases, where you can manually re-order the individual scripts to avoid dependency issues. I'm not sure why so many people suggest using a feature that has dependency issues.

For larger databases, the only viable solution is something like what you created yourself, or a third-party tool.

We ended up using this third-party tool. SQL Packager worked a bit better at the time we evaluated them, but we managed to get the other tool basically for free…

Anyway, hope this helps even though it’s a late answer/comment…


I solved it as follows. I created a custom winform app to export their data:

  1. use SMO (the Transfer class) to script the entire schema to a .sql file (see the sketch after this list)
  2. loop through all specified tables (actually I just listed the large-growth ones to ignore, and processed everything else) using a select * from each table
  3. capture the records into DataTable objects and dump them to .xml files
  4. Zip the whole folder up to be sent back to me
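
Step 1 above really is just a few lines with the Transfer class. A minimal sketch, assuming SMO is referenced; the database name and output path are placeholders:

```csharp
using System.IO;
using System.Linq;
using Microsoft.SqlServer.Management.Smo;

class SchemaScripter
{
    static void Main()
    {
        var server = new Server(".");
        var db = server.Databases["ClientDb"];          // hypothetical database name

        var transfer = new Transfer(db)
        {
            CopyAllObjects = true,                      // tables, views, procedures, ...
            CopySchema = true,
            CopyData = false                            // schema only - the data goes into the .xml files
        };
        transfer.Options.DriAll = true;                 // keys, constraints, defaults
        transfer.Options.Indexes = true;

        // ScriptTransfer returns the CREATE statements in dependency order.
        File.WriteAllLines("schema.sql", transfer.ScriptTransfer().Cast<string>().ToArray());
    }
}
```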

And I have a custom winform app I use locally on my dev system to import the data:

  1. Create an empty database
  2. Run the .sql file to build the schema
  3. Disable all constraints on all tables (using sp_MSforeachtable)
  4. loop through all .xml files and bulk import them using the SqlBulkCopy object (see the sketch after this list)
  5. Re-enable all constraints (using sp_MSforeachtable)
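
Steps 3-5 look roughly like the sketch below, assuming each .xml file was written with DataSet.WriteXml using XmlWriteMode.WriteSchema so the table names and column types can be recovered (the connection string and the "export" folder name are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

class ConfigImporter
{
    static void Main()
    {
        const string connStr = "Server=.;Database=DevCopy;Integrated Security=true";
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Step 3: disable every constraint so load order doesn't matter.
            Exec(conn, "EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'");

            // Step 4: bulk load each exported table.
            foreach (var file in Directory.GetFiles("export", "*.xml"))
            {
                var ds = new DataSet();
                ds.ReadXml(file, XmlReadMode.ReadSchema);
                foreach (DataTable table in ds.Tables)
                {
                    using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "[" + table.TableName + "]" })
                        bulk.WriteToServer(table);
                }
            }

            // Step 5: re-enable and re-validate the constraints.
            Exec(conn, "EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'");
        }
    }

    static void Exec(SqlConnection conn, string sql)
    {
        using (var cmd = new SqlCommand(sql, conn)) cmd.ExecuteNonQuery();
    }
}
```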

This approach works well for my database, but I would not necessarily advise it for some database designs. In my case, there are many "small" configuration tables and only a few "large" tables with minimal relations between them, so I can easily ignore the large ones. If there were more complex relations it would probably not be a good solution...

Seems to be working great so far. I was hoping to find a good way to do this without writing custom code, but I got it working with only a few hours of dev time.

FYI, I almost got it working with sqlpubwiz, but the option to script data is all-or-nothing... you can't ignore specific tables... that was a deal-breaker for my app. The Transfer object in SMO is a very useful class - it only requires a few lines of code to script an entire database schema including all dependencies in the correct creation order to recreate the database from scratch!

http://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.transfer.aspx


Here at Red Gate we have a tool called SQL Packager that has a command line that can be automated to perform this task. This uses the SQL Compare and SQL Data Compare engines under the hood. I'd be interested to get your feedback on whether this satisfies your use case.

http://www.red-gate.com/products/sql-development/sql-packager/
