Thursday, 22 September 2011

T-SQL: A Week Of Continuous Integration Day 4

This is the fourth post in a five-part series outlining my journey to getting SQL Server databases under source control and into a continuous integration environment.

The other posts are:
Day 1 - Powershell & Scripting Objects
Day 2 - Making Use of a Build Server
Day 3 - Database Developer Toolkits


Adding Value to Continuous Integration

As I've mentioned previously, regular deployments of a database from source control have benefits such as validating source code in much the same way as the Build feature in Visual Studio. But this really is just the tip of the iceberg. We can now use this as the foundation for extra building blocks which will add real value to our database solutions. Below is a summary of the three features that I'm gaining from including my database applications in continuous integration.

Unit/Integration Testing

The subject of unit testing for databases comes up frequently and although this post won't delve into it, I will touch on a couple of points. Essentially, meaningful unit testing is notoriously difficult to implement in SQL Server, mainly due to the dependency on data. Visual Studio offers a database unit testing framework which makes it easier to develop the tests, but you are still using C# code to execute T-SQL commands; it isn't a native T-SQL framework. However, in my opinion, pure unit testing isn't a must-have, and as long as there is a suite of tests being executed then I'm happy. It may even be possible to run a full end-to-end test of your database application. The point is that you are able to execute regular tests against your continuous integration server, significantly improve the confidence in your code and hopefully minimise the number of bugs that reach production.

I ended up writing a few "pseudo" unit tests using the NUnit framework. Unfortunately, the database unit testing framework that ships with Visual Studio 2008 is not available without installing the full IDE, making it unsuitable for deploying to multiple build agents (although I'm told this is addressed in VS2010). Each test had a setup script which would deploy a set of known test data to the database, allowing the tests to run against a predictable state. A setup script might look something like the sketch below (the table names here are hypothetical, not from my actual solution):
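-- Hypothetical example tables: reset the database to a known state
-- before the tests run, so results are repeatable
DELETE FROM dbo.Orders;
DELETE FROM dbo.Customers;

INSERT INTO dbo.Customers (CustomerId, CustomerName)
VALUES (1, 'Test Customer');

INSERT INTO dbo.Orders (OrderId, CustomerId, OrderDate)
VALUES (100, 1, '20101231');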

I'm no expert in writing valuable unit tests, but to give you an idea of what it MIGHT look like, here is my example:

using System.Data.SqlClient;
using NUnit.Framework;

[TestFixture]
public class DBTests
{
    [Test]
    public void FormatDateIsNotNull()
    {
        string str = null;

        // Connect to the CI database and call the function under test
        using (var sqlConnection = new SqlConnection("Data Source=CISERVER;Initial Catalog=CIDATABASE;Integrated Security=SSPI;"))
        {
            var cmd = new SqlCommand();
            cmd.CommandText = "SELECT dbo.FormatDate('20101231')";
            cmd.Connection = sqlConnection;
            sqlConnection.Open();

            // ExecuteScalar returns the single value from the SELECT
            str = cmd.ExecuteScalar().ToString();
        }

        Assert.IsNotNull(str);
    }
}
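The dbo.FormatDate function being tested isn't shown here, but purely as an assumption about its behaviour, a function of that name might look something like this:

-- A guess at the function under test: take a yyyymmdd string (style 112)
-- and return it formatted as dd/mm/yyyy (style 103)
CREATE FUNCTION dbo.FormatDate (@date char(8))
RETURNS varchar(10)
AS
BEGIN
    RETURN CONVERT(varchar(10), CONVERT(date, @date, 112), 103);
END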


Change/Deploy Scripts

From a production and release perspective, there is huge value in automating the generation of release scripts. In the past, constructing a release script manually has been massively time-consuming and, more importantly, prone to error. It also requires the person charged with creating the release/change script to have strong SQL development skills.

Using a development product such as Visual Studio or Red-Gate Compare takes the hassle out of this: you can compare your SVN repository with a production environment and generate the script required to sync them. You could take it a step further and automatically deploy the changes, but this is something I'd be reluctant to do as there may be a whole set of steps post continuous integration that need to be completed before go-live. Still, the very fact that the script has been automatically generated means that a relatively unskilled person can take it and deploy the changes. Along with the data and schema diff capabilities, the Red-Gate tools also provide a nice diff report which allows users to review the changes without having to understand and interpret a potentially huge file of SQL code.
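To give a flavour of what these tools produce, a generated change script tends to look something like the following (an illustrative sketch, not actual tool output):

-- Illustrative only: schema changes wrapped in a transaction so a
-- failed deployment can roll back cleanly
BEGIN TRANSACTION;

ALTER TABLE dbo.Customers ADD Email varchar(255) NULL;

EXEC sp_rename 'dbo.Orders.OrderDt', 'OrderDate', 'COLUMN';

COMMIT TRANSACTION;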

Database Versioning

A nice feature you can also harness with this process is automatically managing some form of database build version. As your database develops and new features of the application are supported, it's helpful to know which version of the database objects is in use. It'd be nice if MS had a built-in function which stamped a database or object with a version but, as they don't, you can implement this with either a simple version table or extended properties. To what level you take the versioning is a personal choice, but I've chosen to have a Major/Revision stamp where the major version is manually changed in the table and the revision is taken from the internal revision number that comes out of SVN. The beauty of having your database in a CI environment is that this number is automatically maintained: as you check changes into source control, the number is incremented and the database version can be updated, removing the reliance on someone remembering to do it.
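Here is a minimal sketch of the version table approach, assuming the CI build passes the SVN revision in via a sqlcmd variable (the table and variable names are mine, purely for illustration):

-- Hypothetical single-row version table
CREATE TABLE dbo.DatabaseVersion
(
    MajorVersion    int      NOT NULL,
    RevisionVersion int      NOT NULL,
    LastUpdated     datetime NOT NULL DEFAULT (GETDATE())
);

-- Run by the CI build, e.g. sqlcmd -v SvnRevision=1234 -i UpdateVersion.sql
UPDATE dbo.DatabaseVersion
SET RevisionVersion = $(SvnRevision),
    LastUpdated     = GETDATE();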

Targeted Database Version Testing

One final benefit of having a slick continuous integration environment configured is that you can easily point your builds at a different server. This allows you to quickly prove that your database application supports different editions and versions of SQL Server. For example, a client may be running a particular Service Pack, or you may be considering upgrading your production kit to Denali. Having a standard and reliable build process will give you peace of mind that your database deploys without error and functions correctly on any given environment. As the pace of change increases and hotfixes, service packs and versions are released more frequently, your test matrix expands, and the ability to quickly test your solution becomes invaluable.
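As a simple example, a test run against each target server can record exactly what it was deployed to with a query like this:

-- Capture the edition, version and service pack level of the target server
SELECT SERVERPROPERTY('Edition')        AS Edition,
       SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ServicePackLevel;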
