Friday, 30 September 2011

Powershell: Out-File truncates text

I was looking to use Powershell to write out some data from a SQL table using a combination of Invoke-Sqlcmd and Out-File. Everything was going nicely until my output file suddenly started having data truncated. Something like:

Invoke-Sqlcmd -Query "SELECT LongText FROM MyTable" -ServerInstance $sql | Out-File -FilePath $filepath

Produced a file containing:

LongText
----------------
This was my really really long and boring piece of text that I had writ...


The thing is, it happened quite randomly. Or so I thought. It turns out that the Out-File cmdlet formats its output exactly as it would appear within the console on that machine, and my console was too narrow for the text (this is all in the documentation, to be fair). So manipulating the size of your console window (by editing its properties) results in different output lengths.


However, each Powershell install on a machine can be different so I really need my script to be able to handle this limitation. As of writing, I couldn't find anything foolproof and my best hack was to just explicitly set a width that I "knew" was big enough to cater for my text.

Invoke-Sqlcmd -Query "SELECT LongText FROM MyTable" -ServerInstance $sql | Out-File -FilePath $filepath -Width 2000
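If you only actually want the column contents rather than the console-style table, one way to sidestep the width issue altogether is to write the values out directly. A minimal sketch, assuming the query returns a LongText column:

# Bypass Out-File's console-style formatting entirely by writing the
# column values themselves rather than the formatted table.
Invoke-Sqlcmd -Query "SELECT LongText FROM MyTable" -ServerInstance $sql |
    Select-Object -ExpandProperty LongText |
    Set-Content -Path $filepath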

Friday, 23 September 2011

T-SQL: A Week Of Continuous Integration Day 5

This is the final post of a 5 part series outlining my journey to getting SQL Server databases under source control and into a continuous integration environment.

The other posts are:
Day 1 - Powershell & Scripting Objects
Day 2 - Making Use of a Build Server
Day 3 - Database Developer Toolkits
Day 4 - Adding Value to Continuous Integration

Putting it all together

My journey is at an end and I finally have a working solution which has the following components:

1) A SQL Server dedicated to Continuous Integration
2) A build server/agent with TeamCity & FinalBuilder
3) Red-Gate Developer Toolbelt installed on the build agent and the developer machine
4) SVN as my version control choice and databases under Red-Gate source control
5) Tests written with the NUnit framework

Below I have outlined the full process with a few supporting screenshots to help show how things work. This isn't intended as an explanation of all the technologies used, but merely how I have bolted the pieces together to serve my purpose. Further reading can be found by visiting the websites of the software vendors.

SQL Under Source Control



Here we have the database SubversionTest under source control using Red-Gate's SQL Source Control. Nothing too surprising there, just a set of directories with scripts.




With the database linked to SVN, adding the object dbo.NewTable to the database gives a visual aid to let you know changes need to be committed.



On browsing to the SQL Source Control tab, you get to view the DDL for the new object and, if the object has changed, the DDL that exists within SVN.




Here is the generic FinalBuilder script I use to deploy the database to the continuous integration server. It's very simple: it just drops the database if it exists, creates a new one and then uses the Red-Gate compare tools to deploy the database objects. I've wrapped these up in a batch file for maintenance but they don't do anything clever; they just use a combination of the switches that sqlcompare.exe supports.
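For a flavour of what the batch file boils down to, here is a rough Powershell equivalent. The server, database and paths are placeholders and the sqlcompare.exe switch names may vary between Red-Gate versions, so treat this as a sketch rather than my exact script:

# Sketch of the deploy steps (names/paths are placeholders; switch
# names may differ by SQL Compare version).
$ciServer = "CISERVER"
$database = "SubversionTest"
$scripts  = "C:\svn\SubversionTest"   # working copy of the scripts folder

# 1) Drop the database if it exists and create a fresh one. Fine on a
#    dedicated CI server where nothing else holds connections open.
Invoke-Sqlcmd -ServerInstance $ciServer -Query "IF DB_ID('$database') IS NOT NULL DROP DATABASE [$database]; CREATE DATABASE [$database];"

# 2) Deploy the objects from the scripts folder using SQL Compare.
& "C:\Program Files\Red Gate\SQL Compare 9\sqlcompare.exe" `
    /scripts1:$scripts /server2:$ciServer /database2:$database /synchronize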



When a "build" is run, it also runs some tests. Again, I won't delve into exactly which tests are being run but this just demonstrates the sort of screen you may see subsequent to the deploy of your database. Of course, these tests are run against the Continuous Integration server and are written using the NUnit framework - something which TeamCity supports. In this instance we only have a couple of quick tests to sanity check the two objects but on more complex apps these could take a significant amount of time.



Finally, we have the artifacts generated by the process: SQL scripts which will bring a production server (the one SQL Compare has been run against) up to the latest build we have in source control. There is also a supporting HTML file which is a graphical representation of the changes that will be made by executing the SQL script.

To ensure that development is as bug free as possible, I've hooked the CI to trigger whenever a database change is committed to SVN. This gives you confidence that the changes made have not caused any issues and, at the very least, haven't led to an invalid database build. A full run of an application may not be suitable for "on check-in" builds, so it's also possible to schedule some builds/tests to run during quiet time.

Taking it even further

If you ship a database enabled product, one of the real benefits of a solution such as this is that you can quickly and easily target different versions/editions of SQL Server. It may be that a client runs a particular hotfix of SQL Server and wants to know whether it can be supported. If you can quickly provision an instance to mimic that of a client, you can easily point your TeamCity projects at the new instance and see if your database builds and runs successfully.

I've talked before about using extended properties to version your database objects and you can use this automated process to help with that. SVN assigns a revision number to every commit, so you could easily set up a build step within TeamCity to run a T-SQL script that updates the extended properties with this number. I haven't done this yet but will definitely be experimenting with it.
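When I do, it will probably look something like this sketch. The parameter names are my own, and the revision would be fed in from TeamCity (which exposes the VCS revision to build steps via its build.vcs.number parameter):

# Hypothetical build step: stamp the database with the SVN revision
# as a database-level extended property.
param($revision, $server, $database)

$query = "
IF EXISTS (SELECT 1 FROM sys.extended_properties WHERE class = 0 AND name = 'Revision')
    EXEC sp_updateextendedproperty @name = N'Revision', @value = '$revision'
ELSE
    EXEC sp_addextendedproperty @name = N'Revision', @value = '$revision'"

Invoke-Sqlcmd -ServerInstance $server -Database $database -Query $query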

The final word

Having worked with databases out of source control for some time, I'm really excited about how this is going to improve the quality of our database development going forward. The tools that Red-Gate have provided are easy to use and their simplicity is what's so impressive. I have hardly had to change my working practices at all and know that I'm already in a better place. Time will tell whether the versioning and targeting of different database editions takes off but I know that it's not going to be a big challenge to implement.

I hope this has been useful to you and would be interested in hearing from people who may have implemented similar solutions or have any ideas on how to make this even better.

References

Red Gate SQL Source Control
Red Gate SQL Toolbelt
FinalBuilder
TeamCity
Microsoft Visual Studio
Microsoft SQL Server
Powershell
NUnit

Thursday, 22 September 2011

T-SQL: A Week Of Continuous Integration Day 4

This is the fourth post of a 5 part series outlining my journey to getting SQL Server databases under source control and into a continuous integration environment.

The other posts are:
Day 1 - Powershell & Scripting Objects
Day 2 - Making Use of a Build Server
Day 3 - Database Developer Toolkits


Adding Value to Continuous Integration

As I've mentioned previously, regular deployments of a database from source control have benefits such as validating source code in a similar way to the Build feature in Visual Studio. But this really is just the tip of the iceberg. We are now able to use this as the foundation to add extra building blocks which will add real value to our database solutions. Below is a summary of the four features that I'm gaining from including my database applications in continuous integration.

Unit/Integration Testing

The subject of unit testing for databases comes up frequently and although this post won't delve into it, I will touch on a couple of points. Essentially, meaningful unit testing is notoriously difficult to implement in SQL Server, mainly due to the dependency on data. Visual Studio offers a database unit testing framework which makes it easier to develop the tests, but you are still using C# code to execute T-SQL commands; it isn't a native T-SQL framework. However, in my opinion, pure unit testing isn't a must have and as long as there is a suite of tests being executed then I'm happy. It may even be possible to run a full end-to-end test of your database application. The point is that you are able to execute regular tests against your continuous integration server, significantly improve the confidence in your code and hopefully minimise the level of bugs that reach production.

I ended up writing a few "pseudo" unit tests using the NUnit framework. Unfortunately, it appears that the database unit test framework available with Visual Studio 2008 is not available without installing the full version/IDE, making it unsuitable for deploying to multiple build agents (although I'm told this is addressed in VS2010). The tests each had a setup script which would deploy a set of known test data to the database to allow the tests to run.

I'm no expert in writing valuable unit tests, but to give you an idea of what it MIGHT look like, here is my example:

using System.Data.SqlClient;
using NUnit.Framework;

[TestFixture]
public class DBTests
{
    [Test]
    public void FormatDateIsNotNull()
    {
        string str = null;

        // Connect to the continuous integration server and call the
        // user-defined function under test.
        using (var sqlConnection = new SqlConnection("Data Source=CISERVER;Initial Catalog=CIDATABASE;Integrated Security=SSPI;"))
        {
            var cmd = new SqlCommand();
            cmd.CommandText = "SELECT dbo.FormatDate('20101231')";
            cmd.Connection = sqlConnection;
            sqlConnection.Open();

            str = cmd.ExecuteScalar().ToString();
        }

        // The function should always return a value for a valid date.
        Assert.IsNotNull(str);
    }
}


Change/Deploy scripts

From a production and release perspective, there is huge value in automating the generation of scripts for release. In the past, constructing a release script manually has been massively time consuming and more importantly, prone to error. Similarly, there is a need for the person who is charged with creating the release/change script to have strong SQL development skills.

Using a development product such as Visual Studio or Red-Gate Compare, the hassle is taken out: you can compare your SVN repository with a production environment and generate the required script to sync them. You could take it a step further and automatically deploy the changes, but this is something I'd be reluctant to do as there may be a whole set of steps post continuous integration that need to be completed before go-live. Still, the very fact that the script has been automatically generated means that a relatively unskilled person can take it and deploy the changes. Along with the data and schema diff capabilities, the Red-Gate tools also provide a nice diff report which allows users to review the changes without having to understand and interpret a potentially huge file of SQL code.
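As a flavour of how that generation can be automated, here is a sketch of a command-line comparison producing both the sync script and an HTML change report. The paths are placeholders and the exact switch names may vary between SQL Compare versions:

# Sketch: compare the SVN working copy against production, emitting a
# sync script plus an HTML change report for review.
& "C:\Program Files\Red Gate\SQL Compare 9\sqlcompare.exe" `
    /scripts1:"C:\svn\SubversionTest" `
    /server2:PRODSERVER /database2:SubversionTest `
    /scriptfile:"C:\artifacts\upgrade.sql" `
    /report:"C:\artifacts\changes.html" /reporttype:Simple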

Database Versioning

A nice feature you can also harness with this process is automatically managing some form of database build version. As your database develops and new features of the application are supported, it's helpful to know which version of the database objects is being used. It'd be nice if MS had an inbuilt function which stamps a database or object with a version but, as they don't, you can implement this with either a simple version table or by the use of extended properties. To what level you take the versioning is a personal choice, but I've chosen to have a Major/Revision stamp where the Major version is manually changed in the table and the Revision version is taken from the internal value that comes out of SVN. The beauty of having your database in a CI environment is that this number is automatically maintained: as you check changes into source control, the number is incremented and the database version can be updated, removing the reliance on someone remembering to update it.
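For the version table flavour, a minimal sketch might look like this. The table and column names are my own, and in practice the revision would be passed in from the build server:

# Minimal sketch of the version table approach. Assumes the schema
# already contains a one-row table:
#   CREATE TABLE dbo.DatabaseVersion (Major int NOT NULL, Revision int NOT NULL)
# Major is edited by hand; Revision is overwritten on every build.
$revision = 1234   # supplied by the build server in practice

Invoke-Sqlcmd -ServerInstance "CISERVER" -Database "SubversionTest" `
    -Query "UPDATE dbo.DatabaseVersion SET Revision = $revision;"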

Targeted Database Version Testing

One final benefit of having a slick continuous integration environment configured is that you can easily point your builds at a different server. This allows you to quickly prove that your database application supports different editions and versions of SQL Server. For example, a client may be running a particular Service Pack, or you may be considering upgrading your production kit to Denali. Having a standard and reliable build process will give you peace of mind that your database deploys without error and functions correctly on any given environment. As the pace of change increases and hotfixes, service packs and versions are released more frequently, your test matrix expands, and the ability to quickly test your solution becomes invaluable.

Wednesday, 21 September 2011

T-SQL: A Week Of Continuous Integration Day 3

This is the third post of a 5 part series outlining my journey to getting SQL Server databases under source control and into a continuous integration environment.

The other posts are:
Day 1 - Powershell & Scripting Objects
Day 2 - Making Use of a Build Server

Database Developer Toolkits

Although the path to continuous integration had taken a knock-back with the limitations of my Powershell solution, I was determined to succeed. Initially, I'd wanted to cobble something together on a shoestring, harnessing existing technologies to make the solution portable and cheap. However, as time went by, the importance and benefits of CI became clearer and clearer and I felt it was time to look into other products (both 3rd party and MS) to see if there was a solution out there for us. After all, this can't be anything new and there's no need to reinvent the wheel.

There were 2 main issues that I wanted any solution to address:

1) keeping our source control up to date easily
2) deploying database solutions to a Continuous Integration server

Based upon this, I quickly found 2 potential avenues to explore.

1) Visual Studio Database Edition
2) Red-Gate Developer Toolbelt

Visual Studio Database Edition 2008

I was fortunate in that I was already in possession of a licensed copy of Visual Studio and indeed, a few of the guys had used the database projects (dbproj) for SQL development. It also has diff capabilities for both schema and data which allow you to generate a deployment script. Fundamentally though, I believe that database development should be done in SQL Server Management Studio (SSMS), and I've outlined below a few of the differences I see between the two as database development IDEs.

SQL Server Management Studio (SSMS)


  • More natural for a database developer
  • Can develop "online" against a database
  • Immediate feedback from IntelliSense
  • Easy to view dependencies in SQL
  • Edit multiple databases in one session
  • Edit multiple objects in one connection
  • No native support for source control
  • Free

Visual Studio 2008 Database Edition

  • Version dependencies - tied to a particular version
  • Have to develop against scripts and apply them
  • Solutions allow grouping of all dev project types (C#, SSRS, SSAS, SSIS, dbproj etc)
  • Builds fail if there are dependencies on other DBs
  • Doesn't support the shared development model
  • Can manually deploy to different servers
  • Integrates into source control (with plugins for some vendors)

Visual Studio Database Edition is a good product in its own right and really does check a lot of the boxes for me with development. Being able to bundle all your projects as one solution is really nice and the diff capability is something that I believe MS really should incorporate into SSMS. However, from a continuous integration perspective, the use of VS just doesn't work for me. Besides the cost and heavyweight install requirements for running it on a build server (I don't think there is a simple SDK which works), the major show stopper for me is the disconnected nature of development with SQL objects. Working with individual scripts and then applying the changes manually just isn't a viable solution for me. A database professional's natural habitat is SSMS and unless MS makes database projects available through this IDE, the case for using Visual Studio Database Edition will always struggle to stack up.

Red-Gate SQL Developer Toolbelt

The Red-Gate solution is based upon their SQL Developer Toolbelt and it solves both the issues highlighted above. The first component is the SQL Source Control product, which is essentially a plug-in to SQL Server Management Studio and allows you to manage your source control from the same environment you are developing in. You can make changes to objects from within SSMS and then right click the object and commit them. Simple but effective, it even has colours to suggest when a change needs to be sync'd to the repository, and it supports both the shared and dedicated development models. With my belief that development should happen in SSMS, this is a huge tick in the box as you don't have to jump through any hoops to keep your database in sync with source control.

SQL Source Control scripts out the database to file (including data!) in an intuitive folder structure, which leads nicely into generating a build script to deploy to a continuous integration server. Red-Gate offer a "Compare" tool which allows diffing of databases as well as generating change scripts and deploying a database from source control. Importantly, all this functionality is available from the command prompt and could therefore be scheduled on a build server. The ability to easily generate and execute a build script was the key ingredient missing from the Powershell solution, and it essentially leaves us with a solution which answers all the questions (and a few more besides!) and an intuitive end-to-end development process which fits nicely into my way of working.

We are now in a position where we have:

1) Database objects scripted and maintained in real time
2) Scripts stored in source control
3) A database server provisioned for continuous integration
4) A build server which can execute the steps
5) Software which can deploy the database from source control
Tuesday, 20 September 2011

T-SQL: A Week Of Continuous Integration Day 2

This is the second post of a 5 part series outlining my journey to getting SQL Server databases under source control and into a continuous integration environment.

The other posts are:
Day 1 - Powershell & Scripting Objects


Making Use of a Build Server

So we've seen in the last post how we can start scripting databases, but we could still do so much more. Like many others, our dev team run continuous integration on a build server with TeamCity at the hub of things, and there was genuine excitement at the prospect of pulling our database development in line with the rest of the code base. With my database scripts in SVN, we could make regular, automated "builds" of a database which would validate our schema while also giving us a basis to add a testing framework in the future. For me, this would be another real step forward and would solve a couple of issues. For example, imagine a database with hundreds of stored procedures referencing a particular column in a table. A drop or rename of this column would render all those stored procedures invalid, and a regular build of the schema would quickly identify the issue (see the sketch below). It would also encourage deprecated objects to be cleared up - good practice when keeping a lean database for maintenance and performance. All I needed was to make use of the existing build server and provision an instance of SQL Server to act as the target for deploying the database.
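On catching those invalidated procedures: sp_refreshsqlmodule raises an error for any module that no longer compiles, so a post-deploy check along these lines would surface them. This is just a sketch (and note sp_refreshsqlmodule can't be run against schema-bound modules):

# Sketch: re-validate every module after a deploy. sp_refreshsqlmodule
# raises an error for any procedure/view/function that no longer
# compiles, which fails the build step.
$query = @'
DECLARE @sql NVARCHAR(MAX);
SET @sql = N'';
SELECT @sql = @sql + N'EXEC sp_refreshsqlmodule ''' +
       QUOTENAME(SCHEMA_NAME(schema_id)) + N'.' + QUOTENAME(name) + N''';'
FROM sys.objects
WHERE type IN ('P', 'V', 'FN', 'IF', 'TF');
EXEC (@sql);
'@

Invoke-Sqlcmd -ServerInstance $ciServer -Database $database -Query $query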

So I'm sold. TeamCity can pull the scripts out of SVN and then, using software such as FinalBuilder, I was able to loop through the directories and execute the SQL scripts. Fine... in theory.

However, when you come to build the database from these scripts using a simple Powershell script (with Invoke-Sqlcmd), there is a problem: dependencies are not honoured. SQL requires that objects are deployed in a certain order (functions need their tables to be created first, tables need to be created before their relations, etc) and the only way to create a database from a scripts directory is to manually build (and maintain) a deploy script - kind of against the idea of continuous integration. So while the Powershell scripting works nicely for my production environment, I'd need a different approach to get continuous integration working.
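For illustration, this is the sort of naive loop I mean (the path and variable names are placeholders):

# The naive approach: execute every script in the working copy. The
# files run in whatever order Get-ChildItem returns them, so an object
# can easily be created before something it depends on exists.
Get-ChildItem -Path "C:\svn\SubversionTest" -Filter *.sql -Recurse |
    ForEach-Object {
        Invoke-Sqlcmd -ServerInstance $ciServer -Database $database -InputFile $_.FullName
    }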

But at this stage, the plan had developed further and we now had the following components:

1) Database objects scripted
2) Scripts stored in source control
3) A database server provisioned for continuous integration
4) A build server which could execute the steps

Monday, 19 September 2011

T-SQL: A Week Of Continuous Integration Day 1

This is the first post of a 5 part series outlining my journey to getting SQL Server databases under source control and into a continuous integration environment. Hopefully, others will find this useful.

Powershell & Scripting objects

This all stemmed from 2 things: firstly, my stated yearly goal to get my hands dirty with Powershell, and secondly, a desire to take regular scripts of my production databases for recovery purposes (should my backups let me down). The idea being that I could re-create the database from a set of scripts stored on a central repository on any given server. So, I wrote a Powershell script to query the production database model and output these scripts to the filesystem, scheduled it through the SQL Agent and we were off and running.

The benefit of using Powershell to do this was that you could essentially customise your own folder structure and also use it across your entire SQL Server estate. You could easily schedule the script to run on a central server, loop through a list of SQL servers and output all the databases to a central filesystem so you have all your scripts in one place (see the sketch below). Oh, and it's free. FREE!! And we all like free stuff - a pretty compelling case to implement such a thing.
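A stripped-down sketch of the idea using SMO - the server names and paths here are placeholders, and a real version would cover the other object types (procedures, functions, views etc) too:

# Load the SMO assembly, loop a list of servers and script each
# user table out to a central share.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null

foreach ($serverName in @("PRODSQL01", "PRODSQL02"))
{
    $server = New-Object Microsoft.SqlServer.Management.Smo.Server $serverName
    foreach ($db in $server.Databases | Where-Object { -not $_.IsSystemObject })
    {
        $dir = "\\central\scripts\$serverName\$($db.Name)\Tables"
        New-Item -ItemType Directory -Path $dir -Force | Out-Null
        foreach ($table in $db.Tables | Where-Object { -not $_.IsSystemObject })
        {
            # Script() returns the CREATE statements as a string collection
            $table.Script() | Out-File -FilePath "$dir\$($table.Name).sql"
        }
    }
}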

But this got me thinking. Our production database is obviously crucial to us and is fully backed up with various levels of redundancy and avenues for restore should a catastrophic failure occur. The main concern for me was our development database, which one could argue is at the highest risk and possibly has the most valuable code. In this instance, there was a shared development database (more on shared vs dedicated later) which we could apply the same principle to. Even better, we would be able to extend the functionality to hook into our chosen source control provider (SVN) and make regular check-ins. Ok, so the job would still be scheduled and therefore we'd be exposed to potential data loss, but we'd be in a much better place than before.

Shared vs Dedicated development model

I've mentioned above and in a previous post the different database development models. There are two common approaches for database development: shared and dedicated. Dedicated means each developer works on a local copy of the database, much more akin to traditional development. Changes can be made and tested on a local instance before checking back into your source control repository. Shared means there is a single central database which all developers connect to and make changes to. I've used this approach when dealing with large integrated systems dependent on other technologies/services, such as a website, where it's not really practical to have the entire system on your local machine. Essentially, the shared database is "the truth" and as a result, including the schema in source control can be overlooked.

So at this stage, we have looked at the following:

1) Database scripted to the filesystem
2) Scripts stored in source control

Obviously, this is great but there is much more that can be done and I'll cover more in my next post.

Thursday, 1 September 2011

Powershell: change start location of shell

I'm starting to build up a repository of Powershell scripts and I have been getting a little frustrated that every time I open up the PS shell, it defaults to my user directory:


I then have to "cd" to the location of my scripts to start working. A couple of years ago, I remember going through some steps to personalise my PS shell (although obviously that knowledge didn't stick!) and so I thought I'd find out how to change the default dir to be my PS scripts folder.

This article suggested that I just needed to edit the Microsoft.Powershell_profile.ps1 file in the WindowsPowerShell directory in My Documents. Easy.

Off I went to that directory, only to discover it wasn't there - not hidden or protected, it just didn't exist. Thinking that the location may be different on my setup, I ran the $Profile command in PS and it confirmed that it should be in C:\Documents and Settings\RichardB\My Documents\WindowsPowerShell



Then it dawned on me. Perhaps I just needed to create the directory and file myself. So I navigated to My Documents and created the WindowsPowerShell folder and a Microsoft.Powershell_profile.ps1 file. The final step was to edit the file to add the behaviour I was after:

Set-Location C:\PS
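Incidentally, the folder and file can also be created from within the shell itself. A short sketch using the built-in $Profile variable:

# Create the profile (and its parent folder) if it doesn't already
# exist, then append the Set-Location line.
if (!(Test-Path $Profile)) {
    New-Item -ItemType File -Path $Profile -Force | Out-Null
}
Add-Content -Path $Profile -Value "Set-Location C:\PS"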



A quick restart of Powershell and there we have it. A bit of time saved when using Powershell.