Monday, 19 September 2011

T-SQL: A Week Of Continuous Integration Day 1

This is the first post of a five-part series outlining my journey towards getting SQL Server databases under source control and into a continuous integration environment. Hopefully, others will find this useful.

PowerShell & scripting objects

This all stemmed from two things. Firstly, my stated yearly goal to get my hands dirty with PowerShell, and secondly, a desire to take regular scripts of my production databases for recovery purposes (should my backups let me down). The idea being that I could re-create the database on any given server from a set of scripts stored in a central repository. So, I wrote a PowerShell script to query the production database's object model and output these scripts to the filesystem, scheduled it through the SQL Agent, and we were off and running.
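For illustration, here's a minimal sketch of that kind of script using the SQL Server Management Objects (SMO) library. The server name, database name and output path are placeholders, and the scripting options are just a sensible starting point rather than exactly what I used:

```powershell
# Load the SMO assembly (2008-era loading style; placeholder names throughout)
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null

$serverName   = "PRODSQL01"            # hypothetical instance name
$databaseName = "MyDatabase"           # hypothetical database name
$outputPath   = "C:\DatabaseScripts"   # hypothetical output folder

$server = New-Object Microsoft.SqlServer.Management.Smo.Server $serverName
$db     = $server.Databases[$databaseName]

# Configure a scripter that writes each object straight to a file
$scripter = New-Object Microsoft.SqlServer.Management.Smo.Scripter $server
$scripter.Options.ToFileOnly     = $true
$scripter.Options.IncludeHeaders = $true
$scripter.Options.Indexes        = $true
$scripter.Options.DriAll         = $true

# One .sql file per table, named schema.table.sql
foreach ($table in $db.Tables | Where-Object { -not $_.IsSystemObject }) {
    $scripter.Options.FileName = Join-Path $outputPath "$($table.Schema).$($table.Name).sql"
    $scripter.Script(@($table))
}
```

The same loop can be repeated for views, stored procedures and so on, which is also where the custom folder structure mentioned below comes in.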

The benefit of using PowerShell to do this was that you could essentially customise your own folder structure and also use it across your entire SQL Server estate. You could easily schedule the script to run on a central server, loop through a list of SQL Servers and output all the databases to a central filesystem so you have all your scripts in one place. Oh, and it's free. FREE!! And we all like free stuff - a pretty compelling case to implement such a thing.
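That multi-server loop might look something like this. The server list file, the central share and the Script-Database.ps1 wrapper (around the SMO code above) are all hypothetical:

```powershell
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null

# Hypothetical text file with one instance name per line
$servers = Get-Content "C:\Scripts\servers.txt"

foreach ($serverName in $servers) {
    $server = New-Object Microsoft.SqlServer.Management.Smo.Server $serverName

    foreach ($db in $server.Databases | Where-Object { -not $_.IsSystemObject }) {
        # One folder per server/database on a central share (placeholder path)
        $target = "\\CENTRAL\SqlScripts\$serverName\$($db.Name)"
        if (-not (Test-Path $target)) { New-Item -ItemType Directory -Path $target | Out-Null }

        # Hypothetical wrapper around the scripting code shown earlier
        & "C:\Scripts\Script-Database.ps1" -ServerName $serverName -DatabaseName $db.Name -OutputPath $target
    }
}
```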

But this got me thinking. Our production database is obviously crucial to us and is fully backed up, with various levels of redundancy and avenues for restore should a catastrophic failure occur. The main concern for me was our development database which, one could argue, is at the highest risk and possibly has the most valuable code. In this instance, there was a shared development database (more on shared vs dedicated later) to which we could apply the same principle. Even better, we would be able to extend the functionality to hook into our chosen source control provider (SVN) and make regular check-ins. OK, so the job would still be scheduled and therefore we'd still be exposed to potential data loss, but we'd be in a much better place than before.
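The check-in step itself doesn't need much. A minimal sketch, assuming the svn command-line client is installed and the output folder has already been checked out as a working copy of the repository:

```powershell
$workingCopy = "C:\DatabaseScripts"   # hypothetical working copy - the same folder the scripts are written to

# Pick up any newly scripted files, then commit everything in one go
svn add $workingCopy --force --quiet
svn commit $workingCopy -m "Scheduled schema check-in $(Get-Date -Format 'yyyy-MM-dd HH:mm')"
```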

Shared vs Dedicated development model

I've mentioned above, and in a previous post, the different database development models. There are two common approaches for database development: shared and dedicated. Dedicated means each developer works off a local copy of the database, much more akin to traditional development. Changes can be made and tested on a local instance before being checked back into your source control repository. Shared means there is a single central database which all developers connect to and make changes to. I've used this approach when dealing with large integrated systems dependent on other technologies/services, such as a website, where it's not really practical to have the entire system on your local machine. Essentially, the shared database is "the truth" and as a result, including the schema in source control can be overlooked.

So at this stage, we have looked at the following:

1) Database scripted to the filesystem
2) Scripts stored in source control

Obviously, this is great, but there is much more that can be done and I'll cover that in my next post.
