Automating DB and SSIS builds from source control

This week I’ve been working on a new branch of a data warehouse project. To make deployment to development environments easier I decided I’d attempt to get some sort of continuous integration/automated builds to our dev server up and running. Doing this sort of thing has never been particularly straightforward for database projects, but with the fantastic SQL Compare command line tools and SQL Source Control from Red Gate, this task is made so much easier.

I wanted something relatively simple (as always), something that didn't mean having to install a framework. I did look at TeamCity, NAnt and MSBuild, but decided against using any of these, at least in the short term. My reasoning was that something light and fairly portable, with as few dependencies as possible, would be more useful at this stage.

The intention was to set up a nightly build job that would:

  1. Get the latest code from the development branch from SVN
  2. Deploy any schema changes to the development databases
  3. Delete and re-deploy the SSIS packages for the ETL

I used the excellent Continuous Integration article from Red Gate to get me going. I had started to implement this using a batch file (yuk!), but quickly moved onto using PowerShell, my new favourite toy on Windows! The examples in each section will show the PowerShell commands used in the script.
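Tying the three steps together, the whole job boils down to a short driver script along these lines (a sketch only — the paths and server name here are placeholders, and each piece is expanded on in the sections below):

```powershell
# Hypothetical nightly-build driver; each step is covered in detail below.
# Paths and "DEVSERVER" are illustrative placeholders.
$root = "C:\NightlyBuilds\BI"

# 1. Get the latest code from the development branch
& "C:\Program Files\SlikSVN\bin\svn.exe" update "$root\SVN"

# 2. Deploy any schema changes via SQL Compare argument files
& "C:\Program Files\Red Gate\SQL Compare 9\sqlcompare.exe" "/ArgFile:$root\Configuration\EDWCompare.xml"
& "C:\Program Files\Red Gate\SQL Compare 9\sqlcompare.exe" "/ArgFile:$root\Configuration\StagingCompare.xml"

# 3. Delete and re-deploy the SSIS packages (DeployPackages is defined later)
DeployPackages "$root\SVN\ETL\SourceToStage" "EDW/SourceToStage" "DEVSERVER"
DeployPackages "$root\SVN\ETL\StageToWarehouse" "EDW/StageToWarehouse" "DEVSERVER"
```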

Setup and Configuration

Working locally initially, I created the following folder structure on the C drive:

  • NightlyBuilds
    • BI (the project name)
      • Configuration (to hold any configuration files)
      • SVN (for the SVN checkout)
      • Output (for any output or report files)

Source Control

Our development team use dedicated local databases and push changes to the centralised Subversion (SVN) repository using Red Gate SQL Source Control. The first step in the build was to check out the latest SQL schema scripts and SSIS packages. After installing the SlikSVN client, I could use the following PowerShell commands to perform the check out:

Set-Location "C:\Program Files\SlikSVN\bin"
.\svn checkout http:///repos/BI/Database "c:\NightlyBuilds\BI\SVN\Database"
.\svn checkout http:///repos/BI/ETL/branches/Phase2 "c:\NightlyBuilds\BI\SVN\ETL"
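One thing worth noting (an addition on my part, not something the checkout strictly needs): external commands like svn don't raise PowerShell errors on failure, so checking $LASTEXITCODE after each call lets the build stop early rather than deploying from a stale working copy:

```powershell
# svn returns a non-zero exit code on failure; abort the build if so
.\svn checkout http:///repos/BI/Database "c:\NightlyBuilds\BI\SVN\Database"
if ($LASTEXITCODE -ne 0) {
    throw "SVN checkout failed with exit code $LASTEXITCODE"
}
```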

Deploy Schema Changes

Next up was to use the SQL Compare command line tools to compare each of the databases on the development environment with the schema scripts.

Set-Location "c:\Program Files\Red Gate\SQL Compare 9"
Invoke-Expression ".\sqlcompare /ArgFile:C:\NightlyBuilds\BI\Configuration\EDWCompare.xml"
Invoke-Expression ".\sqlcompare /ArgFile:C:\NightlyBuilds\BI\Configuration\StagingCompare.xml"

At the moment, I’m using XML argument files for each comparison (this may well change as this progresses), these have a format along these lines (element names correspond to the SQL Compare command line switches):

<commandline>
  <server1>server name</server1>
  <database1>database name</database1>
  <scripts2>path to the checked-out schema scripts</scripts2>
  <synchronize/>
</commandline>

Deploy SSIS Packages

With the databases up to date, the next task was to deploy the SSIS packages. Having experienced a couple of issues in the past with packages not being overwritten correctly, I decided that, to be on the safe side, I’d delete an existing package before re-deploying it. The command line tool dtutil.exe lets us perform these tasks and more, but it can be incredibly frustrating to use at times, with its inconsistent syntax and problems with differences between SQL Server versions and 32- and 64-bit editions. Using the following function, the path to the SSIS packages is specified and each one is deleted on development, then redeployed.

function DeployPackages {
    param($filepath, $ssisfolder, $server)

    # Collect the packages up front, so the later change of directory
    # (to wherever dtutil.exe lives) doesn't break the file references
    $files = Get-ChildItem "$filepath\*.dtsx"

    # $dtutil holds the folder containing dtutil.exe, set earlier in the script
    Set-Location $dtutil

    foreach ($file in $files) {
        $basename = $file.BaseName
        $deletearguments = " /delete /SQL $ssisfolder/$basename /SourceServer $server"
        $deployarguments = " /file $($file.FullName) /copy SQL``;$ssisfolder/$basename /destserver $server"

        Invoke-Expression ".\dtutil $deletearguments"
        Invoke-Expression ".\dtutil $deployarguments"
    }
}

DeployPackages "C:\NightlyBuilds\BI\SVN\ETL\SourceToStage" "EDW/SourceToStage" ""
DeployPackages "C:\NightlyBuilds\BI\SVN\ETL\StageToWarehouse" "EDW/StageToWarehouse" ""


So that, in a nutshell, is the process I’m currently using. It works, which is nice, but I have quite a few improvements in mind:

  • Firstly, it seems to be very slow when running dtutil. I’m not sure if this is because I’m currently doing everything over a VPN, but I’ll be looking to improve the general performance. To be honest, it’s not a major issue, as it’s mainly designed for builds that happen overnight, but it would be nice if I could streamline it. I did try to use the /fdelete switch on dtutil to remove an entire folder, but that requires the folder to be empty.
  • Add more parameterisation so that it could be used for different projects on different environments
  • Use the SQL Data Compare command line tools to ensure any static/lookup data is up to date
  • Add an option to drop (if required) & create the databases and SSIS folders. This would allow the script to deploy to “virgin” machines and test the entire build.
  • Add job creation scripts
  • Build in SSRS report deployment. The project I’m working on does not use a great deal of SSRS reports (OBIEE is the end user tool), but given that report deployment can be a pain, it would be nice to include this too.
  • Add some reporting as to the success or failure of any component of the script.
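On that last point, one simple starting point (a sketch, not yet part of the script — the helper name and output path are my own placeholders) would be to wrap each external call in a small function that records any non-zero exit codes and writes a summary to the Output folder:

```powershell
# Hypothetical helper: run a build step and note any non-zero exit code
$script:failures = @()

function Invoke-BuildStep($name, $command) {
    Write-Output "$(Get-Date -Format s) running: $name"
    Invoke-Expression $command
    if ($LASTEXITCODE -ne 0) {
        $script:failures += "$name failed (exit code $LASTEXITCODE)"
    }
}

# Example usage, wrapping one of the SQL Compare calls
Invoke-BuildStep "Compare EDW" ".\sqlcompare /ArgFile:C:\NightlyBuilds\BI\Configuration\EDWCompare.xml"

# Write a summary report at the end of the run
if ($script:failures) {
    $script:failures | Out-File "C:\NightlyBuilds\BI\Output\failures.txt"
}
```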

Update: Looks like the VPN was indeed causing a slowdown. Running it connected locally to the network, 60+ SSIS packages were deployed in <3 minutes.


I’ve started a project on CodePlex for this, really just as a place to keep the code. Any suggestions or comments are, as always, more than welcome.
