SQL Server database change workflow best practices

https://www.devze.com 2022-12-30 13:20 (source: web)

The Background

My group has 4 SQL Server Databases:

  • Production
  • UAT
  • Test
  • Dev

I work in the Dev environment. When the time comes to promote the objects I've been working on (tables, views, functions, stored procs) I make a request of my manager, who promotes to Test. After testing, she submits a request to an Admin who promotes to UAT. After successful user testing, the same Admin promotes to Production.

The Problem

The entire process is awkward for a few reasons.

  1. Each person must manually track their changes. If I update, add, or remove any objects, I need to track them so that my promotion request contains everything I've done. In theory, if I miss something, testing or UAT should catch it, but that isn't certain, and it's a waste of the testers' time anyway.
  2. Lots of changes I make are iterative and done in a GUI, which means there's no record of what changes I made, only the end result (at least as far as I know).
  3. We're in the fairly early stages of building out a data mart, so the majority of the changes made, at least count-wise, are minor things: changing the data type for a column, altering the names of tables as we crystallize what they'll be used for, tweaking functions and stored procs, etc.

The Question

People have been doing this kind of work for decades, so I imagine there has got to be a much better way to manage the process. What I would love is if I could run a diff between two databases to see how the structure differs, use that diff to generate a change script, and use that change script as my promotion request. Is this possible? If not, are there any other ways to organize this process?

For the record, we're a 100% Microsoft shop, just now updating everything to SQL Server 2008, so any tools available in that package would be fair game.


I should clarify I'm not necessarily looking for diff tools. If that's the best way to sync our environments then it's fine, but if there's a better way I'm looking for that.

An example that does what I want really well is migrations in Ruby on Rails. Dead simple syntax, all changes are well documented automatically and by default, and determining which migrations need to run is almost trivially easy. I'd love it if there were something similar for SQL Server.

My ideal solution is 1) easy and 2) hard to mess up. Rails Migrations are both; everything I've done so far on SQL Server is neither.


Within our team, we handle database changes like this:

  • We (re-)generate a script which creates the complete database and check it into version control together with the other changes. We have 4 files: tables, user defined functions and views, stored procedures, and permissions. This is completely automated - only a double-click is needed to generate the script.
  • If a developer has to make changes to the database, she does so on her local db.
  • For every change, we create update scripts. Those are easy to create: the developer regenerates the db script of her local db. All the changes are then easy to identify thanks to version control. Most changes (new tables, new views, etc.) can simply be copied into the update script; other changes (adding columns, for example) need to be written manually.
  • The update script is tested either on our common dev database, or by rolling back the local db to the last backup - which was created before starting to change the database. If it passes, it's time to commit the changes.
  • The update scripts follow a naming convention so everybody knows in which order to execute them.
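
As a minimal sketch of what one such update script might look like (the file name, table, and column are hypothetical, and the sequence-number naming convention is just one way to encode execution order):

```sql
-- 0042_add_customer_email.sql
-- (hypothetical name: zero-padded sequence number + short description,
--  so scripts sort in the order they must be executed)
-- Adds an Email column to dbo.Customer; guarded so it is safe to re-run.
IF NOT EXISTS (
    SELECT 1 FROM sys.columns
    WHERE object_id = OBJECT_ID(N'dbo.Customer') AND name = N'Email'
)
BEGIN
    ALTER TABLE dbo.Customer ADD Email NVARCHAR(256) NULL;
END
GO
```

The existence check is optional under this workflow, but it makes the script harmless if someone runs it twice.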

This works fairly well for us, but it still needs some coordination if several developers heavily modify the same tables and views. That doesn't happen often, though.

The important points are:

  • database structure is only modified by scripts, except for the local developer's db. This is important.
  • SQL scripts are versioned by source control - the db can be created as it was at any point in the past
  • database backups are created regularly - at least before making changes to the db
  • changes to the db can be done quickly - because the scripts for those changes are created relatively easily.

However, if you have a lot of long lasting development branches for your projects, this may not work well.

It is far from a perfect solution, and some special precautions must be taken. For example, if there are updates which may fail depending on the data present in a database, the update should be tested on a copy of the production database.

In contrast to Rails migrations, we do not create scripts to reverse the changes of an update. But this isn't always possible anyway, at least with respect to the data (the content of a dropped column is lost even if you recreate the column).


Version Control and your Database

The root of all things evil is making changes in the UI. SSMS is a DBA tool, not a developer one. Developers must use scripts to make any sort of change to the database model/schema. Versioning your metadata and having an upgrade script from every version N to version N+1 is the only approach that is proven to work reliably. It is the solution SQL Server itself uses to keep track of metadata changes (resource db changes).
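
A rough sketch of the version-guard pattern this describes — a one-row metadata table plus N-to-N+1 upgrade scripts that refuse to run against the wrong version. The table and column names here are assumptions for illustration, not a SQL Server convention:

```sql
-- One-time setup: a single-row table holding the current schema version.
CREATE TABLE dbo.SchemaVersion (Version INT NOT NULL);
INSERT INTO dbo.SchemaVersion (Version) VALUES (1);
GO

-- upgrade_1_to_2.sql: checks the current version before doing anything.
IF (SELECT Version FROM dbo.SchemaVersion) <> 1
BEGIN
    RAISERROR (N'Wrong schema version: expected 1.', 16, 1);
    RETURN;
END

-- ... the actual schema changes for version 2 go here ...

UPDATE dbo.SchemaVersion SET Version = 2;
GO
```

Because each script both checks and advances the version, running the scripts out of order (or twice) fails loudly instead of silently corrupting the schema.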

Comparison tools like SQL Compare, or vsdbcmd and .dbschema files from VS Database projects, are last resorts for shops that fail to take a properly versioned approach. They work in simple scenarios, but I have seen them all fail spectacularly in serious deployments. One simply does not trust a tool to make a change to a 5+ TB table if the tool tries to copy the data...


RedGate sells SQL Compare, an excellent tool to generate change scripts.

Visual Studio also has editions which support database compares. This was formerly called Database Edition.

Where I work, we abolished the Dev/Test/UAT/Prod separation long ago in favor of a very quick release cycle. If we put something broken in production, we fix it quickly. Our customers are certainly happier, but in a risk-averse corporate enterprise, it can be a hard sell.


There are several tools available to you. One, from Red-Gate, is called SQL Compare. Awesome and highly recommended. SQL Compare lets you diff the schemas of two databases and even builds the SQL change scripts for you.

Note that they have been working on a SQL Server source control product for a while now as well.

Another (if you're a Visual Studio shop) is the schema and data compare features that are part of Visual Studio (not sure which versions).


Agree that SQL Compare is an amazing tool.

However, we do not make any changes to the database structure or objects that are not scripted and saved in source control just like all other code. Then you know exactly what belongs in the version you are promoting because you have the scripts for that particular version.

It is a bad idea anyway to make structural changes through the GUI. If you have a lot of data, the GUI is far slower than using ALTER TABLE, at least in SQL Server. You only want to use tested scripts to make changes to prod as well.
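
To illustrate the difference: for some edits (such as inserting a column mid-table or changing a data type) the SSMS table designer generates a rebuild — create a new table, copy every row, drop the old table, rename — whereas a hand-written ALTER can often be a quick metadata-only change. The table name below is hypothetical:

```sql
-- Scripted change: adding a NULLable column with no default is a
-- metadata-only operation in SQL Server, fast even on large tables.
ALTER TABLE dbo.Orders ADD Notes NVARCHAR(500) NULL;

-- For the same kind of edit, the SSMS table designer may instead emit a
-- create-new-table / copy-all-rows / drop-and-rename sequence,
-- which on a big table is dramatically slower and holds locks far longer.
```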


I agree with the comments made by marapet, where each change must be scripted.
The problem that you may be experiencing, however, is creating, testing and tracking these scripts.
Have a look at the patching engine used in DBSourceTools.
http://dbsourcetools.codeplex.com

It's been specifically designed to help developers get SQL server databases under source-code control.

This tool will allow you to baseline your database at a specific point, and create a named version (v1).
Then, create a deployment target - and increment the named version to v2.
Add patch scripts to the Patches directory for any changes to schema or data.
Finally, check the database and all patches into source control to distribute to devs.

What this gives you is a repeatable process to test all patches to be applied from v1 to v2.
DBSourceTools also has functionality to help you create these scripts, i.e. schema compare or script data tools.

Once you are done, simply send all of the files in the patches directory to your DBA to upgrade from v1 to v2.

Have fun.


Another "Diff" tool for databases:

http://www.xsqlsoftware.com/Product/Sql_Data_Compare.aspx


  1. Keep the database version in a versioning table
  2. Keep the file name of each script that was successfully applied
  3. Keep the md5 sum of each SQL script that has been applied; whitespace should be ignored when calculating the sum
  4. Keep info about who applied a script and when it was applied
  5. The database should be verified on application start-up
  6. New SQL scripts should be applied automatically
  7. If the md5 sum of a script that was already applied has changed, an error should be thrown (in production mode)
  8. Once a script has been released it must not be changed; it must be immutable in a production environment
  9. Scripts should be written so they can be applied to different types of database (see Liquibase)
  10. Since most DDL statements auto-commit on most databases, it is best to have a single DDL statement per SQL script
  11. DDL statements should be written so they can be executed several times without errors. This really helps in dev mode, when you may edit a script several times: for instance, create a new table only if it does not exist, or even drop the table before creating a new one. With a script that has not yet been released, you can change it, clear its md5 sum, and rerun it
  12. Each SQL script should run in its own transaction
  13. Triggers/procedures should be dropped and recreated after each db update
  14. SQL scripts are kept in a version control system like svn
  15. The name of a script contains the date it was committed, the existing (Jira) issue id, and a short description
  16. Avoid adding rollback functionality to scripts (Liquibase allows it); it makes them more complicated to write and support. If you use exactly one DDL statement per script, and DML statements run within a transaction, even a failing script will not be a big problem to resolve
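
Points 10–12 above can be sketched in a single script: one DDL statement, written idempotently, wrapped in its own transaction. (Note that SQL Server, unlike some databases, does allow most DDL inside a transaction, which is what makes the wrapper useful here. The table and columns are illustrative.)

```sql
-- One DDL statement per script (point 10), guarded so it can be
-- re-run without error (point 11), in its own transaction (point 12).
BEGIN TRANSACTION;

IF NOT EXISTS (
    SELECT 1 FROM sys.tables
    WHERE name = N'AuditLog' AND schema_id = SCHEMA_ID(N'dbo')
)
BEGIN
    CREATE TABLE dbo.AuditLog (
        Id       INT IDENTITY(1,1) PRIMARY KEY,
        LoggedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME(),
        Message  NVARCHAR(1000) NOT NULL
    );
END

COMMIT TRANSACTION;
```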


This is the workflow we have been using successfully:

  • Development instance: SQL objects are created/updated/deleted in the DB using MSSQL Studio, and all operations are saved to scripts we include in each version of our code.
  • Moving to production: We compare schemas between the dev and prod DBs using SQL Schema Compare in Microsoft Visual Studio. We update prod using the same tool.
