Implementing Continuous Delivery with VSO
In another series of blog posts on this website I describe how to implement continuous delivery with TFS. I start from scratch by describing how to install and configure TFS and there's no denying that it's quite a lot of work. Once installed, TFS can require a not inconsiderable amount of care and feeding, and in bigger organisations looking after it is almost certainly going to be someone's full-time job. There is an alternative though -- Visual Studio Online, a SaaS version of TFS hosted in Azure which started life as Team Foundation Service back in 2012. The name change coincided with the release of Visual Studio 2013 in November 2013.
VSO isn't the full-blown TFS as it's missing the SSRS reporting capabilities and the SharePoint portal integration. At the time of writing a subscription can only consist of one Team Project Collection and editing of process templates isn't supported. On the other hand, VSO receives updates approximately every three weeks so it contains new application features well ahead of TFS. A post here has a nice comparison. What you may find interesting is that as per Brian Harry's blog post Microsoft are planning to gradually move their teams currently using TFS over to VSO. If that isn't putting faith in VSO then I don't know what is!
So if it's good enough for Microsoft it's surely good enough for the rest of us. But as always the key questions for some people are how to get started and can using VSO give us the nice integration with the other tools such as Microsoft Test Manager and Release Management? This blog post series will focus on answering those very questions. I'm aiming to write a soup-to-nuts guide on how to implement continuous delivery with VSO, comparing and contrasting with the TFS blog post series as we go. Do use the comments system to give me feedback!
Cheers -- Graham
Continuous Delivery with TFS: Behind the Scenes of the RM Deployment Agent
As with many aspects of technology understanding how something works behind the scenes can be a real boon when it comes to troubleshooting. In this post in my blog series on implementing continuous delivery with TFS we take a look at the Release Management for Visual Studio Deployment Agent, and specifically how it does its thing. Bear in mind that I don't have any inside knowledge about the Deployment Agent and this post is just based on my own experience and observations.
Basic Hook-Up
The first step in eliminating easy errors with the Deployment Agent is to ensure that it is installed correctly and can communicate with the RM Server. The key question is whether your servers are part of a domain. If they are then the easiest way to configure RM is to create a domain account (RMDEPLOYER for example) and add this to the Manage Users section of the RM Client as a Service User. On target nodes add this domain account to the Administrators group and then install the Deployment Agent specifying the domain RMDEPLOYER as the service account. See this post for a bit more detail. If your servers are not part of a domain you will need to use shadow accounts, which are simply local accounts that share the same name and password on every node. The only difference is that you add each node's shadow account to the Manage Users section of the RM Client as a Service User -- make sure you use the machine name as well as the account name, ie the Windows Account field should be $MyNodeName$\RMDEPLOYER.
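For example, something like this run on each node creates the shadow account and grants it local admin rights (a sketch only -- the password is a placeholder and must of course be identical on every node):

# Run on every target node: identical account name and password everywhere.
# The password below is a placeholder - substitute your own.
net user RMDEPLOYER 'Pa55w0rd-Placeholder!' /add
net localgroup Administrators RMDEPLOYER /add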
The test that all is working is to see the servers listed as Ready in the RM Client under Configure Paths > Servers. Something I have observed in my demo environment is that when deployment nodes boot up before my all-in-one TFS machine they don't seem to communicate. When that happens I use a PowerShell script to remotely restart the node or its Deployment Agent service (eg Start-AzureVM -ServiceName $cloudservicename -Name $SERVERNAME).
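If the node is already running, a minimal sketch for bouncing the agent remotely might look like this (it assumes PowerShell remoting is enabled and that the service's display name is Microsoft Deployment Agent -- verify with Get-Service on the node):

'ALMWEB01', 'ALMSQL01' | ForEach-Object {
    # Restart the Deployment Agent service on each remote node.
    Invoke-Command -ComputerName $_ -ScriptBlock {
        Restart-Service -DisplayName 'Microsoft Deployment Agent'
    }
}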
In a production environment your circumstances could be wildly different from a clean demo setup. I can't cover all the scenarios here but if you are experiencing problems and you have a complicated environment then this post could be a good troubleshooting starting point.
Package Deployment Mechanism
When the agent is installed and running it polls the RM server for packages to deploy on a schedule that can be set in the RM client under Administration > Settings > Deployer Settings > Look for packages to deploy every. On my installation it was set at 28 seconds but if time is critical you may want to shorten that.
When the agent detects that it has a package to deploy or actions to perform it copies the necessary components over to C:\Users\RMDEPLOYER\AppData\Local\Temp\RM\T\RM (where RMDEPLOYER is the name of the account the Deployment Agent is running under, which might be different in your setup). There are at least two types of folder that get created:
- Deployer Tools. This contains any tools and their dependencies that are needed to perform tasks. These could be executables, PowerShell scripts and so on. They are organised in a folder structure that relates to their Id and version number in the RM server database. For example in my database XCopy Deployer (irxcopy.cmd) has Id = 12 Version = 2 in dbo.DeployerTool and is thus deployed to C:\Users\RMDEPLOYER\AppData\Local\Temp\RM\T\RM\DeployerTools\12\2 (the sketch after this list shows one way to peek at that table).
- Action or Component. These folders correspond to the actions that will take place on that node. The names are the same as the names in the Release Management client Deployment Log (from Releases > Releases). A sub folder (whose name includes the date and time and other more mysterious numbers) contains the tool, the files it will deploy or otherwise work with and a file called IR_ProcessAutoOutput.log which is the one displayed when clicking the View Log link in the Deployment Log:

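On the subject of the Deployer Tools folder, a quick hedged way to map tools to their cache folders is to query the table directly (the database name ReleaseManagement is an assumption -- check your RM server install; Invoke-Sqlcmd needs the SQL Server PowerShell module):

# Hedged sketch: list the deployer tools with their Id and Version to
# predict the cache folder ...\DeployerTools\<Id>\<Version>.
# Database name is an assumption - check your RM server installation.
Invoke-Sqlcmd -ServerInstance 'ALMTFSADMIN' -Database 'ReleaseManagement' -Query 'SELECT * FROM dbo.DeployerTool'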
Component folders warrant a little bit more analysis. What exactly gets deployed to the timestamped sub-folder is dependent on how the component is configured under Configure Apps > Components, specifically the Build Drop Location. If this is configured simply with a backslash (\) then all of the drop folder is deployed. This can be further refined by specifying a specific folder, in which case the contents of that folder get deployed. For example the Contoso University\Deploy Web Site component specifies \_PublishedWebsites\ContosoUniversity.Web as the Build Drop Location folder which means that just the website files are deployed.
It's perhaps worth noting here that there are two mechanisms for the Deployment Agent to pull in files: UNC or HTTP(S). This is configured on a per-server basis in Configure Paths > Servers > Deployment Agent. UNC is much quicker than HTTP(S) but the latter method is the only choice if your node doesn't have access to the UNC path.
A final aspect to touch on is that over time the node would get choked with old deployments if no action were taken, and to guard against this the Deployment Agent runs a cleanup process on a schedule specified in Administration > Settings > Deployer Settings. This is something to look at if disk space is tight.
Debugging Package Deployment
Having described how package deployment works -- at least at a high level -- what are the troubleshooting options when a component is failing to deploy or run correctly? These are the logs that are available on a target node:
- IR_ProcessAutoOutput.log -- saved to the action or component folder as above.
- DeploymentAgent.log -- cumulative log saved to C:\Users\RMDEPLOYER\AppData\Local\Temp\Microsoft\ReleaseManagement\12.0\Logs.
- $GUID$DeploymentAgent.log -- instance log saved to C:\Users\RMDEPLOYER\AppData\Local\Temp\Microsoft\ReleaseManagement\12.0\Logs. Not sure of the value of these since I've never seen them contain anything.
If between them these logs don't provide sufficient information for troubleshooting your problem you can increase the level of detail -- this post has the details but don't forget to restart the Microsoft Deployment Agent service. Additionally, if you have SMTP set up and working you will also receive a Deployment Failed! notification email. This can be particularly useful because it invariably contains the command line that failed. This leads on to a useful debugging technique where you rerun the failing command yourself. For example if the command was running a PowerShell script simply open the PowerShell console, switch to the correct folder and paste in the command from the email. Chances are that you will get a much more informative error message this way.
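As an illustration only -- the real folder and command line come from your own email -- a rerun might look like this:

# Hypothetical example: switch to the component's timestamped folder and
# re-run the failing command to get the full error in the console. Both
# the folder name and the command line below are illustrative.
Set-Location 'C:\Users\RMDEPLOYER\AppData\Local\Temp\RM\T\RM\Contoso University\Script Login & User SQL Script\2015-02-14 20.30.00'
.\ManageWindowsIO.ps1 -Action move -FileFolderName 'Create login and database user.sql' -DestinationName 'C:\Temp\Create login and database user.sql'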
Final Thoughts
I know from personal experience that debugging RM components can be a frustrating experience. Typically it's a daft mistake with a parameter or something similar but sorting this type of problem out can really eat time. Do you have any tips for debugging components? Are there other error logs that I haven't mentioned? Please do share your experiences and findings using the comments.
Cheers -- Graham
Getting Started with Visual Studio Online
Not everyone wants or needs the full-blown power of an on-premises TFS installation and if that's you then Visual Studio Online (TFS in the cloud) is a great way to integrate Visual Studio with version control, backlog management, agile planning tools and many of the other great features that are available with TFS. Some things in VSO work in a similar way to TFS so some of the learning resources can do double duty -- my Getting Started with Team Foundation Server and ALM guide is here. There are also dedicated VSO learning resources and this is my recommended list:
One of the really great things about VSO is that you can get started for free (and it can stay free within very generous limits). Combined with the free Visual Studio Community 2013 it makes for a fantastic learning opportunity for anyone without an MSDN account.
Cheers -- Graham
Continuous Delivery with TFS: Automatically Versioning Assemblies as Part of The Build
In a previous post in this series on implementing continuous delivery with TFS we looked at how some simple tweaks to the build process can help with the goal of baking quality in. This post continues in the vein of making improvements to the pipeline by addressing the issue of assembly versioning. What issue is that, I hear some of you asking? It's the situation where your Visual Studio solution contains many projects (maybe dozens) and you want all the projects to have the same assembly versioning, ie the details you would traditionally set in AssemblyInfo.cs. A Google search will reveal several ways to accomplish this but most techniques involve some maintenance when a new project is added. In this post I explain how to make a publicly available low-maintenance solution work with the Release Management build process template. I should point out that this issue won't affect everyone, and if you or your business don't care about it then do feel free to skip this post. It is quite interesting though as it involves editing a build process template.
TFSVersioning on CodePlex
If assembly versioning is important to you and you use TFS there is a good chance you've seen the TFSVersioning solution available on CodePlex. It's a very nice piece of work that versions all of your solution's assemblies as part of the build process. If a new project is added it automatically gets included, so it's a low-maintenance solution. There are essentially two ways to use TFSVersioning -- with the build process template that the project provides or with your own process template. This latter technique is a little involved as it requires editing your build template, but it's the technique we need to use here since we would like to use the ReleaseTfvcTemplate.12.xaml build process template that ships with Release Management 2013.4. It turns out that editing this template is quite a job and I'm indebted to my good friend, colleague and TFS guru Bharath Sundaresan for figuring out all of the complicated details. An additional hurdle is that the project hasn't been updated for TFS 2013 but fortunately it's not a lot of work to remedy this. The TFSVersioning deployment pack is available from the Downloads page and it has some great documentation which I recommend reading before you begin.
Update TFSVersioning for TFS2013
The core component that we need to update for TFS 2013 is TfsBuild.Versioning.Activities.dll. To accomplish this follow these steps:
- Download the latest source code for TFSVersioning from the Source Code page and unzip to somewhere convenient.
- Navigate to the latest version under Prod and open BuildVersioning.sln. Remove the TfsBuild.Versioning.Activities.Tests and TfsBuild.Versioning.Activities.Tests.Stubs projects as we don't need to amend them for what we are doing here.
- Expand the References node of the TfsBuild.Versioning.Activities project and notice that the Microsoft.TeamFoundation.* references are marked as missing:

- Remove these references and replace them with the 2013 versions from C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\ReferenceAssemblies\v2.0.
- You should now be able to build a Release version of TfsBuild.Versioning.Activities.dll.
Once you have successfully updated the project for TFS 2013 it's probably a good idea to make sure that a basic installation of TFSVersioning works. Follow these steps to verify this:
- Download and unzip the latest TFSVersioning deployment pack -- currently 2.0.1. Copy VersioningBuildTemplate20.xaml from the pack to the ContosoUniversity BuildProcessTemplates folder and check in to version control.
- Under ContosoUniversity create a new folder called CustomActivityStorage and copy over the new version of TfsBuild.Versioning.Activities.dll. Check in to version control.
- From Team Explorer in Visual Studio navigate to Builds > Actions > Manage Build Controllers.
- In the Manage Build Controllers dialog choose Properties and in Manage Controller Properties set Version control path to custom assemblies to $/ContosoUniversity/CustomActivityStorage.

- Now create a test build definition, replacing the standard release template with VersioningBuildTemplate20.xaml and setting all required properties including the drop folder.
- Whilst editing the build definition set any properties under the Build Versioning and Build Versioning (Optional) sections as you wish. Refer to the documentation for TFSVersioning for details.
- Queue a manual build of the test build definition. Observe in the drops folder that the ContosoUniversity.* binaries all have the same File version (the sketch below shows one way to check).
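A quick hedged way to do that check from PowerShell (the drop path is illustrative -- use your build's actual drop location):

# List the FileVersion of each ContosoUniversity assembly in the drop.
Get-ChildItem '\\almtfsadmin\Drops\ContosoUniversity\TestBuild_20150214.1' -Recurse -Filter 'ContosoUniversity.*.dll' |
    Select-Object Name, @{ Name = 'FileVersion'; Expression = { $_.VersionInfo.FileVersion } }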
Update the ReleaseTfvcTemplate.12 Release Template with the TFSVersioning Custom Activity
This process broadly follows the Harder Installation but More Instructive section of the TfsVersioning User and Development Guide; however, modifying ReleaseTfvcTemplate.12.xaml requires several more steps. Partly this is because TfsBuild.Versioning.Activities.dll contains more functionality that isn't referred to in the documentation and partly because ReleaseTfvcTemplate.12.xaml is missing some activities that (reading between the lines) were present in the template that was used by the TFSVersioning project. In the instructions below I assume a degree of familiarity with editing release templates. If you need guidance take a look here for just one example of how to get started. You should be aware that there are two ways to edit templates: through the XAML designer and through Notepad. The former is less prone to error but slow; the latter is much faster but with the distinct possibility of a copy and paste error. The technique I describe below also sets you up for relatively easy debugging of the process template since there is a good chance of not getting everything right first time.
- Install the updated TfsBuild.Versioning.Activities.dll to the Global Assembly Cache by opening a Visual Studio command prompt (from C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\Tools\Shortcuts if your shortcuts are missing in Windows 8.1) and issuing a command similar to gacutil -i "C:\Users\Graham\Downloads\tfsversioning-103318\Prod\V 2.0.1.0\Source\TfsBuild.Versioning.Activities\bin\Release\TfsBuild.Versioning.Activities.dll".
- Copy C:\Program Files (x86)\Microsoft Visual Studio 12.0\Release Management\Client\bin\ReleaseTfvcTemplate.12.xaml to the ContosoUniversity BuildProcessTemplates folder and check in to version control. Chances are you already have a template with the same name so you'll probably want to change the name to ReleaseTfvcTemplate.12.Versioning.xaml or similar. Once checked in open this template in Visual Studio so it displays in the XAML editor.
- Set up the Visual Studio Toolbox to work with TfsBuild.Versioning.Activities.dll in a custom tab. You can reference the version in CustomActivityStorage. Note that you only need to add the VersionAssemblyInfoFiles item.
- Drag the VersionAssemblyInfoFiles activity from the toolbox to the workflow as the first item under Compile, Test and Publish. Feel free to give the activity a custom name. If you examine the properties of the activity you will see all the InArguments that need to be married up with either Variables or Arguments that do not yet exist in the process template:

- The arguments can be created as per the TfsVersioning User and Development Guide instructions using the Arguments editor but a faster way is to open the template in Notepad, copy the following values and append them to the <x:Members> section.
<x:Property Name="AssemblyCompanyPattern" Type="InArgument(x:String)" />
<x:Property Name="AssemblyConfigurationPattern" Type="InArgument(x:String)" />
<x:Property Name="AssemblyCopyrightPattern" Type="InArgument(x:String)" />
<x:Property Name="AssemblyCulturePattern" Type="InArgument(x:String)" />
<x:Property Name="AssemblyDescriptionPattern" Type="InArgument(x:String)" />
<x:Property Name="AssemblyFileVersionPattern" Type="InArgument(x:String)" />
<x:Property Name="AssemblyInfoFilePattern" Type="InArgument(x:String)" />
<x:Property Name="AssemblyInformationalVersionPattern" Type="InArgument(x:String)" />
<x:Property Name="AssemblyProductPattern" Type="InArgument(x:String)" />
<x:Property Name="AssemblyTitlePattern" Type="InArgument(x:String)" />
<x:Property Name="AssemblyTrademarkPattern" Type="InArgument(x:String)" />
<x:Property Name="AssemblyVersionPattern" Type="InArgument(x:String)" />
<x:Property Name="BuildNumberPrefix" Type="InArgument(x:Int32)" />
<x:Property Name="BuildSettings" Type="InArgument(mtbwa:BuildSettings)" />
<x:Property Name="DoCheckinAssemblyInfoFiles" Type="InArgument(x:Boolean)" />
<x:Property Name="ForceCreateVersion" Type="InArgument(x:Boolean)" />
<x:Property Name="UseVersionSeedFile" Type="InArgument(x:Boolean)" />
<x:Property Name="VersionSeedFilePath" Type="InArgument(x:String)" />
- The next step is to add the metadata items that allow each of the above arguments to be set. Again, it's possible to use the Metadata editor but the faster Notepad way is to copy the following values and append them to the <mtbw:ProcessParameterMetadataCollection> section.
<mtbw:ProcessParameterMetadata Category="Build Versioning" Description="This is the pattern used to replace the AssemblyFileVersion value." DisplayName="Assembly File Version Pattern" ParameterName="AssemblyFileVersionPattern" />
<mtbw:ProcessParameterMetadata Category="Build Versioning" Description="This is the pattern used to replace the AssemblyVersion value." DisplayName="Assembly Version Pattern" ParameterName="AssemblyVersionPattern" />
<mtbw:ProcessParameterMetadata Category="Build Versioning" Description="This is the pattern used to find the AssemblyInfo files. Generally, you shouldn't need to change this value." DisplayName="AssemblyInfo File Pattern" ParameterName="AssemblyInfoFilePattern" />
<mtbw:ProcessParameterMetadata Category="Build Versioning" Description="Indicated whether the AssemblyInfo files should be checked back into source control after they are modified." DisplayName="Perform Check-in of the AssemblyInfo Files" ParameterName="DoCheckinAssemblyInfoFiles" />
<mtbw:ProcessParameterMetadata Category="Build Versioning" Description="If true, the versioning process will create AssemblyVersion or AssemblyFileVersion values even if they do not already exist." DisplayName="Force Create Version" ParameterName="ForceCreateVersion" />
<mtbw:ProcessParameterMetadata Category="Build Versioning" Description="Indicate which values to use as the versioning patterns. If set to True, the 'seedfile.xml' file must exist in the location described by the 'Version Seed File Path' setting. Otherwise, the 'Assembly Version Pattern' and 'Assembly File Version Pattern' values will be used." DisplayName="Use Version Seed File" ParameterName="UseVersionSeedFile" />
<mtbw:ProcessParameterMetadata Category="Build Versioning" Description="Relative path location for the seed (xml) file containing the Assembly Version and Assembly File Version values." DisplayName="Version Seed File Path" ParameterName="VersionSeedFilePath" />
<mtbw:ProcessParameterMetadata Category="Build Versioning" Description="Number added to the version component that uses the 'B' symbol pattern (Build Number). This helps create a unique version for a build definition." DisplayName="Build Number Prefix" ParameterName="BuildNumberPrefix" />
<mtbw:ProcessParameterMetadata Category="Build Versioning (Optional)" Description="Assembly Title Attribute: String value specifying a friendly name for the assembly. For example, an assembly named comdlg might have the title Microsoft Common Dialog Control." DisplayName="Assembly Title Pattern" ParameterName="AssemblyTitlePattern" />
<mtbw:ProcessParameterMetadata Category="Build Versioning (Optional)" Description="Assembly Description Attribute: String value specifying a short description that summarizes the nature and purpose of the assembly." DisplayName="Assembly Description Pattern" ParameterName="AssemblyDescriptionPattern" />
<mtbw:ProcessParameterMetadata Category="Build Versioning (Optional)" Description="Assembly Configuration Attribute: String value indicating the configuration of the assembly, such as Retail or Debug. The runtime does not use this value." DisplayName="Assembly Configuration Pattern" ParameterName="AssemblyConfigurationPattern" />
<mtbw:ProcessParameterMetadata Category="Build Versioning (Optional)" Description="Assembly Company Attribute: String value specifying a company name." DisplayName="Assembly Company Pattern" ParameterName="AssemblyCompanyPattern" />
<mtbw:ProcessParameterMetadata Category="Build Versioning (Optional)" Description="Assembly Product Attribute: String value specifying product information." DisplayName="Assembly Product Pattern" ParameterName="AssemblyProductPattern" />
<mtbw:ProcessParameterMetadata Category="Build Versioning (Optional)" Description="Assembly Copyright Attribute: String value specifying copyright information." DisplayName="Assembly Copyright Pattern" ParameterName="AssemblyCopyrightPattern" />
<mtbw:ProcessParameterMetadata Category="Build Versioning (Optional)" Description="Assembly Trademark Attribute: String value specifying trademark information." DisplayName="Assembly Trademark Pattern" ParameterName="AssemblyTrademarkPattern" />
<mtbw:ProcessParameterMetadata Category="Build Versioning (Optional)" Description="Assembly Culture Attribute: Enumerated field indicating the culture that the assembly supports. An assembly can also specify culture independence, indicating that it contains the resources for the default culture." DisplayName="Assembly Culture Pattern" ParameterName="AssemblyCulturePattern" />
<mtbw:ProcessParameterMetadata Category="Build Versioning (Optional)" Description="Assembly Informational Version Attribute: String value specifying version information that is not used by the common language runtime, such as a full product version number." DisplayName="Assembly Informational Version Pattern" ParameterName="AssemblyInformationalVersionPattern" />
- Back in the XAML editor navigate to the Arguments editor and supply default values for some arguments as follows:
- AssemblyFileVersionPattern = "1.0.J.B"
- AssemblyInfoFilePattern = "AssemblyInfo.*"
- AssemblyVersionPattern = "1.0.0.0"
- BuildNumberPrefix = 0
- DoCheckinAssemblyInfoFiles = False
- ForceCreateVersion = False
- UseVersionSeedFile = False
- VersionSeedFilePath= "TfsVersion\VersionSeed.xml"
- Navigate to the Variables editor and create the following variables (you may need to Browse for Types to get some of the variable types):
- Name = BuildAgent; Variable Type = Microsoft.TeamFoundation.Build.Client.IBuildAgent; Scope = Compile, Test and Publish
- Name = BuildDetail; Variable Type = Microsoft.TeamFoundation.Build.Client.IBuildDetail; Scope = Compile, Test and Publish
- Name = BuildDirectory; Variable Type = String; Scope = Compile, Test and Publish
- Name = Workspace; Variable Type = Microsoft.TeamFoundation.VersionControl.Client.Workspace; Scope = Compile, Test and Publish
- From Toolbox > Team Foundation Build Activities add the following activities to the top of Compile, Test and Publish so they appear in the order listed:
- Activity = GetBuildAgent; Result = BuildAgent
- Activity = GetBuildDetail; Result = BuildDetail
- Activity = GetWorkspace; Name = String.Format("{0}_{1}_{2}", BuildDetail.BuildDefinition.Id, Microsoft.TeamFoundation.LinkingUtilities.DecodeUri(BuildAgent.Uri.AbsoluteUri).ToolSpecificId, BuildAgent.ServiceHost.Name); Result = Workspace
- Return to the properties of the VersionAssemblyInfoFiles activity and use the ellipsis at the end of each row to replace Enter a VB expression with the correct value. The result should be as follows:

- As a final step in this section save all the changes and check them in to version control.
Testing the Updated ReleaseTfvcTemplate.12 Release Template
At long last we are in a position to test the new template. The easiest way is to edit the test build definition created above and replace VersioningBuildTemplate20.xaml with our updated ReleaseTfvcTemplate.12.xaml version. Set any properties as required and queue a new build. With luck you will have a successful build and a set of uniformly versioned assemblies!
If you are having difficulty in implementing the steps above the debugging process is reasonably straightforward. Once the build template has been added to the test build definition you can make changes to the template, save them and then check them in to version control. Simply queue a new build to check your changes.
The final piece of the jigsaw when everything is working is to edit ContosoUniversity_Main_Nightly to use the new version of the template. And to enjoy a well-deserved drink.
Cheers -- Graham
Continuous Delivery with TFS: Save to a Folder for Stages You Can’t Yet Deploy to
In previous posts in this blog series on continuous delivery with TFS our activities have been geared towards deploying the sample application to target servers -- or nodes as they are sometimes referred to. But what happens when for some reason you have an environment that's just not ready to be a target for automated deployment? Maybe the business is just not ready to go that far right now. Maybe there is some technical hurdle you have yet to overcome. On the other hand you have already experienced how easy it is to manage the configuration of your application with Release Management and how it can bring config order where once there was chaos.
If you find yourself in this position a possible interim solution is to use Release Management to create what I call Release Ready Builds™ of your application. These are builds which have the correct configuration for the environment they are destined for but which are saved to a staging disk location rather than being actually deployed to target servers. Deployment of these builds is still likely to be a manual process but that's probably what you are doing already and at least the configuration and any other jiggery-pokery can be taken care of by Release Management.
Create a New Stage
In order to illustrate this technique I'm going to add a new stage to the deployment pipeline called PRD. This will represent the production environment that for whatever reason I'm unable to deploy to using Release Management automation. Carry out the following in Release Management:
- Add a stage type called PRD from Administration > Manage Pick Lists > Stage Type.
- Create a new Agent-based environment called Contoso University\PRD from Configure Paths > Environments. I linked my web server (ALMWEB01) for convenience but in a non-demo context you would probably want to use something more permanent such as a build server. This would of course entail installing the Deployment Agent on that server.
- Add the new environment to the Contoso University\DAT>DQA release path from Configure Paths > Agent-based Release paths. The stage should be fully automated with email notifications turned off and this is probably a good time to change the name of the release path to Contoso University\DAT>DQA>PRD.

- Whilst in the newly renamed Contoso University\DAT>DQA>PRD release path, help to speed up the debugging process by making the DQA stage fully automated, removing any Approvers from the Approval Step and turning off email notifications.
- Open the Contoso University\DAT>DQA release template from Configure Apps > Agent-based Release Templates and in Properties change the name to Contoso University\DAT>DQA>PRD.
- On the web server (ALMWEB01 in my case) create a folder called C:\ReleaseReadyBuilds.
Configure New Components
With the PRD stage added we now need to create new components that will deploy to a folder structure. For the web application it's not really all that different because we're using XCopy anyway as the deployment method so it's just a case of specifying a new location. The database side of things is more interesting because we need to create scripts rather than run actions against a database. On top of all this we need a mechanism to ensure that each release is placed in a unique folder. To achieve this carry out the following steps in Release Management Configure Apps > Components:
- Create a new component called Contoso University\Script DACPAC and configure as follows (a sketch of the resulting command appears after this list):
- Source > Builds with application (= selected) > Path to package = \
- Deployment:
- Tool = DACPAC Database Deployer
- Arguments = /Action:Script /SourceFile:"__FileName__" /TargetServerName:"__ServerName__" /TargetDatabaseName:"__DatabaseName__" /OutputPath:"__OutputPath__"
- Parameters > OutputPath > Description = The location for the DACPAC script file
- Create a new component called Contoso University\Script Login & User SQL Script and configure as follows:
- Source > Builds with application (= selected) > Path to package = \Scripts
- Deployment:
- Tool = Windows Common IO
- Arguments = -File ./ManageWindowsIO.ps1 -Action __Action__ -FileFolderName "__FileFolderName__" -DestinationName "__DestinationName__"
- Parameters > DestinationName > Description = Location to copy the file to
- Configuration Variables:
- Variable Replacement Mode = Before Installation
- File Extension Filter = *.sql
- Parameter #1 = LOGIN_OR_USER | Standard | Name of login or user to create
- Parameter #2 = DB_NAME | Standard | Database to set security for
- Create a new component called Contoso University\RRB and configure as follows:
- Source > Builds with application (= selected) > Path to package = \
- Deployment:
- Tool = Windows Common IO
- Arguments = -File ./ManageWindowsIO.ps1 -Action __Action__ -FileFolderName "__FileFolderName__" -DestinationName "__DestinationName__"
- Parameters > DestinationName > Description = Destination name
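As a sanity check on the Contoso University\Script DACPAC component, here's a hedged sketch of the SqlPackage.exe call the DACPAC Database Deployer tool effectively makes once the __parameter__ placeholders are substituted. The SqlPackage.exe path varies by SSDT/SQL Server version, and the values are the ones we'll configure in the PRD stage below:

# Illustrative only: /SourceFile is __FileName__, /TargetServerName is
# __ServerName__, /TargetDatabaseName is __DatabaseName__ and
# /OutputPath is __OutputPath__ after parameter substitution.
& 'C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe' /Action:Script `
    '/SourceFile:ContosoUniversity.Database.dacpac' '/TargetServerName:ALMSQL01' `
    '/TargetDatabaseName:CU-PRD' '/OutputPath:C:\ReleaseReadyBuilds\PRD\Database\DACPAC.sql'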
Configure the PRD Stage
Next we need to configure the PRD stage. To achieve this carry out the following steps in the Contoso University\DAT>DQA>PRD agent-based release template:
- In order to speed up the debugging process visit the DAT and DQA stages and click the top left hand icon of every action or component to toggle the skip state:

- In the PRD stage drag the ALMWEB01 server to the Deployment Sequence.
- Drag the Contoso University\Deploy Web Site to ALMWEB01 and set the parameters as follows:
- Installation Path = C:\ReleaseReadyBuilds\PRD\Web
- DATA_SOURCE = ALMSQL01
- INITIAL_CATALOG = CU-PRD
- Navigate to Toolbox > Windows OS and drag a Create Folder action below Contoso University\Deploy Web Site and set the FolderName to C:\ReleaseReadyBuilds\PRD\Database.
- Navigate to Toolbox > Components and right-click the Components node. Add Contoso University\Script DACPAC.
- Drag the Contoso University\Script DACPAC component below the Create Folder action and set the parameters as follows:
- FileName = ContosoUniversity.Database.dacpac
- ServerName = ALMSQL01
- DatabaseName = CU-PRD
- OutputPath = C:\ReleaseReadyBuilds\PRD\Database\DACPAC.sql
- Navigate to Toolbox > Components and right-click the Components node. Add Contoso University\Script Login & User SQL Script.
- Drag the Contoso University\Script Login & User SQL Script component below the Contoso University\Script DACPAC component and set the parameters as follows:
- Action = move
- FileFolderName = Create login and database user.sql
- DestinationName = C:\ReleaseReadyBuilds\PRD\Database\Create login and database user.sql
- LOGIN_OR_USER = ALM\CU-PRD
- DB_NAME = CU-PRD
- Navigate to Toolbox > Components and right-click the Components node. Add Contoso University\RRB.
- Drag the Contoso University\RRB component below the Contoso University\Script Login & User SQL Script component and set the parameters as follows:
- Action = create
- FileFolderName = C:\ReleaseReadyBuilds\$(BuildNumber)
- Drag a second Contoso University\RRB component below the first one and set the parameters as follows:
- Action = move
- FileFolderName = C:\ReleaseReadyBuilds\PRD
- DestinationName = C:\ReleaseReadyBuilds\$(BuildNumber)\PRD
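The net effect of those two RRB steps is easier to see as plain PowerShell (a sketch only -- the real work is done by ManageWindowsIO.ps1, and RM substitutes $(BuildNumber) at deploy time):

# RM supplies the real $(BuildNumber); this value is illustrative.
$buildNumber = 'ContosoUniversity_Main_Nightly_20150214.8'
New-Item -ItemType Directory -Path "C:\ReleaseReadyBuilds\$buildNumber"         # Action = create
Move-Item 'C:\ReleaseReadyBuilds\PRD' "C:\ReleaseReadyBuilds\$buildNumber\PRD"  # Action = move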
Check the Stage Works
Navigate to Releases > Releases and manually create a release based on the latest build. Because the previous stages are in skip mode the PRD stage should finish almost immediately and you should very quickly be able to verify everything is working. Essentially you should be checking for a folder similar to C:\ReleaseReadyBuilds\ContosoUniversity_Main_Nightly_20150214.8 that contains a PRD folder with Web and Database folders containing our deployable artefacts. The magic that achieves this is the $(BuildNumber) configuration variable, one of several variables that are only available in components. If that's all working then you should revisit the DAT and DQA stages and toggle all the components and actions so that they are not skipped, but leave DQA in automated mode. Now run a couple of complete builds from the nightly build definition in Visual Studio, confirming that each PRD build is placed in its own separate folder. Finally, when that's working you can revisit the DQA stage and return it to its manual status.
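A quick way to eyeball the result from the web server (the build number is illustrative):

# Expect to see PRD, PRD\Web and PRD\Database under the build folder.
Get-ChildItem 'C:\ReleaseReadyBuilds\ContosoUniversity_Main_Nightly_20150214.8' -Recurse -Directory |
    Select-Object FullName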
By way of finishing off I want to stress that you should be doing everything possible to use the full automation capabilities of Release Management to deploy your application along the whole delivery pipeline. But when you really do have a problem with a particular stage hopefully the technique I've illustrated here will tide you over.
Cheers -- Graham
Continuous Delivery with TFS: Promoting a Release to the DQA Stage
In a previous post in this series on implementing continuous delivery with TFS we arrived at the point of being able to run automated web tests on a build of the application deployed to the DAT environment. Any build that passes this stage is a candidate to be promoted to the next stage in the pipeline, and in my demo environment this is DQA (development quality assurance). The scenario I had in mind when building the demo environment was that it would be used by a Scrum team delivering features from a backlog of Product Backlog Items. Part of this development effort would include some manual testing, either of features that are impossible or impractical to test automatically ("text is a certain colour" for example), or for features where the effort of writing an automated test wouldn't make sense. In my demo environment the DQA stage is where this manual testing activity would take place.
Typically, not every build that passes the DAT stage will be promoted to the DQA stage as there may be insufficient change to warrant manual inspection. Additionally, if someone is in the middle of running some manual tests in DQA they probably don't want to find that the build they are testing has suddenly been replaced by a new one. Consequently the deployment of builds that do need to be promoted to DQA will need to be triggered manually. This leads on to questions such as who should trigger a deployment to DQA, how can they decide which build to promote if there are choices to be made and which tools should be used? (Note that I'm assuming here that any build that passes the DAT stage is a candidate for promotion to DQA. It's perfectly possible though to build in an approval step at the end of the DAT stage to require someone to explicitly indicate that a build can be promoted to DQA.)
Revisiting the Release Management Approvals Workflow
In order to understand the options that are available for answering the questions I have just posed a good starting point is to revisit the Release Management approvals workflow for the DQA stage by opening Release Management and navigating to Configure Paths > Agent-based Release Paths and opening the Contoso University\DAT>DQA release path. In a previous post we configured the DQA stage so that each step is manual and all activity is governed by members of the Quality Assurance security group:

I am well aware that The Scrum Guide "recognizes no titles for Development Team members other than Developer" and I'm not going to go into the ins and outs of which team members should be allowed to govern the deployment of releases. Suffice it to say though that Release Management provides plenty of options for whatever scenario you are working with, and each step can be either a named individual or a security group consisting of one or more individuals.
Looking at the DQA stage image in closer detail there are some subtle details (highlighted in red) that are worth mentioning. Firstly, it's possible to toggle email notifications for each step by clicking the email icon, which changes to a brighter image when notifications are turned on. Secondly, it's possible to have multiple Approvers for the Approval Step. Note that this means that all Approvers need to approve before the build can become a candidate for the next stage. (A further stage hasn't been defined in my demo environment so the Approval Step concept doesn't make complete sense here, but you get the idea.)
Managing the Governance of the Release Process
Having looked at the options for setting up who should be allowed to govern the release to DQA process we can now turn to the actual process of managing it. Although the Release Management client allows for the management of the approvals process (Releases > My Approval Requests) there is another tool called Release Explorer (actually a web page) which is dedicated to this purpose. This simplifies things somewhat since you don't need to install the client on lots of PCs and you don't need lots of users getting to grips with the client user interface. The Release Explorer URL will be something like http://almtfsadmin:1000/ReleaseManagement -- obviously if your server name is different then so will the URL be. Opening Release Explorer for the first time can be a bit of a shock -- it's ugly and of limited functionality to say the least. The Release Management User Guide has instructions on how to use what functionality is present.
With DQA configured as above and with all email notifications turned on triggering a build of ContosoUniversity_Main_Nightly will result in the following process once the DAT stage is complete:
- Anyone in the Acceptance Step > Approver role receives a Deployment Acceptance Required! email.
- Release Explorer lists the release as waiting for Acceptance. If you have multiple releases listed and you have a reasonably descriptive release template name chances are that you'll need to hover over the release name to show the date and time as a popup (highlighted in red below):

- Still in Release Explorer the release can be selected and clicking the Approve button displays a dialog for confirming the actual deployment to DQA with the ability to add a comment.
- At the start of the deployment anyone in the Deployment Step > Owner role receives a Deployment in Progress email.
- At the end of the deployment anyone in the Validation Step > Validator role receives a Deployment Validation Required! email.
- Release Explorer lists the release as waiting for Validation. A person in the Validator role would presumably navigate to the application and make sure it is working as expected or in a more sophisticated setup start a series of automated validation smoke tests.
- Still in Release Explorer the release can be selected and clicking the Approve button displays a dialog for confirming the successful deployment to DQA with the ability to add a comment.
- Anyone in the Approval Step > Approver role receives a Deployment Approval Required! email.
- Release Explorer lists the release as waiting for Approval. All the people in the Approver role now need to assure themselves by whatever means that, in the words of the email, the "deployed application meets the minimum quality requirements needed to complete this stage".
- Still in Release Explorer the release can be selected and clicking the Approve button displays a dialog for confirming the successful release to DQA with the ability to add a comment.
- In Release Management under Releases > Releases the release now has the status Released.
The full process can involve a lot of emails and a lot of individual steps. It might be overkill for every stage of your deployment but it's good to know that there is a comprehensive set of options should you need them.
Which Release to Promote to DQA?
Unfortunately the workflow around determining which release to actually promote to DQA is somewhat clunky and I can only hope that this will be improved in a future version of Release Management. Of course if you just want the latest and greatest then it's easy to work that out from Release Explorer as it will be the first release in the list. Another scenario though is where the development team are coding features against PBI Task work items and fixing bugs against Bug work items and checking code in against those work items as appropriate. In this case someone who is responsible for manual testing may want to choose a specific release that contains certain features or fixed bugs. Each build report has a list of Associated work items and either Visual Studio or Team Web Access can be used to inspect build reports but unfortunately there is nothing in Release Explorer. Of course each build report only lists what has been associated since the last build so there may be some effort required to find the build that contains all the desired work items.
Regrettably once the desired build has been identified in Visual Studio or Team Web Access there doesn't appear to be an exact way to match it with a release listed in Release Explorer since Release Explorer identifies a release by date and time and the build report uses date and build number for that day. Of course if only one release can become a candidate for promotion each day (because you run DAT overnight for example) then disambiguation will be easy but in a more fast-paced environment that may not be the case.
That completes our examination of the workflow for promoting a release to the DQA environment. If you have any ideas for fine tuning it do let me know via the comments.
Cheers -- Graham
Continuous Delivery with TFS: Making Sense of the DSC Feature in Release Management
When I first started listing the draft titles of blog posts for my series on implementing continuous delivery with TFS naturally the vNext / Agent-less / PowerShell DSC feature of Release Management that shipped with 2013.3 was on my list. And why not? Surely this was the successor to the agent-based way of doing things? Out with the old and in with the new...
Naturally I'd looked into PowerShell DSC and knew that it was touted as a make-it-so technology for configuring Windows servers: rather than using an imperative script to install and configure components one uses a declarative approach that describes what a server should look like and PowerShell DSC goes off and does one's bidding, so to speak. It wasn't immediately obvious how the new vNext features of Release Management would relate to the delivery pipeline I was building in Azure but I trusted that time would tell. Well time has now told and as far as my research is concerned I can't see that the vNext features have any part to play. Deploying a DACPAC? Running automated tests via Microsoft Test Manager? The vNext features appear to be irrelevant.
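For anyone who hasn't seen the declarative style, here's a minimal taste (a sketch: WindowsFeature is a resource that ships with PowerShell 4.0, and the node name is from my demo environment):

# Describe the desired state; DSC makes it so (and keeps it so).
Configuration WebServer {
    Node 'ALMWEB01' {
        WindowsFeature IIS {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }
    }
}
WebServer -OutputPath 'C:\Dsc'                        # compile to a MOF file
Start-DscConfiguration -Path 'C:\Dsc' -Wait -Verbose  # apply the configuration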
Interestingly I'm not the only one who has come to the conclusion that vNext is not a must-do replacement for agent-based deployments. Both Colin Dembovsky and Donovan Brown have recently blogged on similar lines. So what is the point of the vNext features in Release Management? Clearly if you want to ensure that your environment is configured correctly before you deploy your components then a vNext release template might be the way to go. But most organisations are probably (or should be) thinking about automating the configuration of their servers at a higher, more global level, not just when it comes to triggering an actual deployment. Certainly at the time of writing this post I think I'm right in saying that if you want to use Release Management with Visual Studio Online you have to use a vNext release template, but this just feels like Microsoft haven't yet implemented support for agent-based release templates there.
Although I'm planning to cover PowerShell DSC in a different blog post series as far as this series is concerned I'm not going to complicate things by covering the vNext way of implementing releases as it feels like it won't add much value and will be entering a world of unnecessary rework and pain. Disagree? Sound off in the comments...
Cheers -- Graham
Getting Started with Windows PowerShell
If you are just getting started with Windows PowerShell or haven't done much with it yet you may be thinking that it is just another scripting language. Nothing could be further from the truth because although PowerShell is a scripting language it's also a huge amount more than that. A Wikipedia page here has a nice overview of the history of PowerShell and of the different features that became available with each version, and gives the reader a good idea about the breadth of functionality. A key concept to understand is that PowerShell is involved in almost every area of automation on the Windows and Azure platforms and knowing, learning and using PowerShell is increasingly going to be essential for anyone working with Windows or Azure. Here are my top learning resources for getting started with PowerShell:
PowerShell is huge and in terms of resources this is just the tip of the iceberg. In my view the two Jump Start series of videos on the Microsoft Virtual Academy are unmissable. What's great about them is that Jason Helmick is a superb presenter and an extremely funny guy, and Jeffrey Snover is not only an excellent presenter but also the inventor of PowerShell. This all adds up to an immensely enjoyable series of videos where you learn about the history of PowerShell as well as how to use it. Also well worth watching are the two videos from TechEd North America 2014 -- lots of value for the time it takes to watch them. I've listed two courses from Pluralsight that are useful if you are just getting going with PowerShell but there are plenty more for anyone wanting to dig deeper.
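And if you want to type along while you watch, PowerShell's discoverability commands are the natural first session:

Get-Command -Noun Service          # find cmdlets that work with services
Get-Help Get-Service -Examples     # worked examples for a cmdlet
Get-Service | Get-Member           # inspect the objects a cmdlet emits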
Cheers -- Graham
Continuous Delivery with TFS: Running Automated Web Tests with MTM
In this instalment of my series on implementing continuous delivery with TFS we pick up where we left off in the previous post and add the automated web tests we created to Microsoft Test Manager. We then look at how to schedule these tests for automatic execution through the deployment pipeline. Exciting stuff so let's get started...
Configure a Test Controller
Our starting point is to configure our TFS admin machine as a test controller. (You could create a dedicated machine of course but in our demo environment it's an unnecessary overhead.)
- Create a new domain account called TFSTEST and add to the Administrators group on your ALMTFSADMIN machine.
- Download the Agents for Microsoft Visual Studio 2013. See here for 2013.4 releases.
- Mount the Agents for Microsoft Visual Studio 2013 iso and from the TestController folder install the test controller on ALMTFSADMIN configuring it to run with the TFSTEST account and for the appropriate Team Project Collection:

- Click Apply settings to ensure that the configuration steps were successful.
Configure a Web Client Test Machine
The big picture here is that we will deploy the Contoso University application to the DAT (Development Automated Test) environment and then run the automated tests against DAT. We won't be using our development machine and instead will make use of a dedicated web client test machine which should be configured as follows:
- Create a new Azure VM from a Windows 8.1 Enterprise (x64) image called ALMCLIENTWIN81. (Note that Windows desktop editions are only available to MSDN subscribers.)
- Add the RMDEPLOYER and TFSTEST accounts to the Administrators group.
- Install the Release Management Deployment Agent and ensure it is in the Ready status in the Release Management client. (See here for details.)
- Install Microsoft Test Manager 2013.
- Mount the Agents for Microsoft Visual Studio 2013 iso (see above) and from the TestAgent folder install the test agent configuring it to run with the TFSTEST account and to register with the test controller installed above:

- Click Apply settings to ensure that the configuration steps were successful.
- Install Firefox.
Configure Microsoft Test Manager
As usual I'm assuming you have a degree of familiarity with the tooling, in this case Microsoft Test Manager 2013. If not then see my getting started with Microsoft Test Manager blog post here. You will need to have installed MTM on to your development machine and successfully connected to your ContosoUniversity Team Project. Now carry out the following steps.
- Open MTM and connect to Lab Center. Navigate to Controllers and ensure that ALMTFSADMIN is registered as the Test Controller and that the Test Agent on ALMCLIENTWIN81 is in the Ready status:

- Navigate to Lab and create a new Standard environment called ALMCLIENTWIN81. From the Machines page add ALMCLIENTWIN81 and give it the Web Client role and specify credentials which are in the Administrators group on ALMCLIENTWIN81. From the Verification page click Verify to initiate the verification procedure. If all verifications pass you should end up with a new environment in the ready state:

- Navigate to Test Settings and create a new entry called Automated Test Run and choose Automated for the What type of tests do you want to run? question. From the Roles page choose the Web Client role which should match the environment created above. From the Data and Diagnostics page choose everything except Screen and Voice Recorder and feel free to configure each option as you see fit. Click on Finish to save the settings and close MTM.
- Open MTM and connect to Testing Center. If this is the first time you have connected you will need to Add a Test Plan:

- Add a test plan called Regression and then choose Select plan to open it at the Contents tab. Add a new suite called Department by right-clicking the Regression node and choosing New suite:

- Now over in the right-hand pane create two new test cases called Can_Navigate_To_Departments and Can_Create_Department:

- Note the IDs of these new test cases and switch to Visual Studio, opening the ContosoUniversity solution if it isn't already open. From Team Explorer > Work Items search for the two new test cases and open each in turn to carry out the next step for both.
- Click on the Associated Automation tab and then choose the ellipsis at the right of the Automated test name field. This brings up the Choose Test dialog from where you can select the required test:

- Make sure to save the test case work items after associating the automated tests.
- Back in Microsoft Test Manager navigate to the Run Settings tab of the test plan. Under Automated runs set Test settings = Automated Test Run and Test environment = ALMCLIENTWIN81 -- the items we created above. Make sure you Save and Close.
Configure Release Management
With our automated web tests incorporated in to an MTM Test Plan the next step is to configure Release Management to run them as part of the DAT stage of the deployment pipeline. Carry out the following steps from within Release Management:
- From Configure Paths > Servers add ALMCLIENTWIN81 and ensure it is in the Ready status.
- From Configure Paths > Environments add ALMCLIENTWIN81 via the Link Existing button.
- From Configure Apps > Components create a new component called Contoso University\Run MTM Auto Tests and configure as follows:
- Source > Builds with application (= selected) > Path to package = \
- Deployment > Tool = MTM Automated Tests Manager
- Configuration Variables > Variable Replacement Mode = Only in Command
- From Configure Apps > Agent-based Release Templates open Contoso University\DAT>DQA and edit the Deployment Sequence for DAT as follows:
- From Toolbox > Servers drag ALMCLIENTWIN81 to the Deployment Sequence below ALMSQL01.
- From Toolbox > Components right click the node and add Contoso University\Run MTM Auto Tests.
- Now drag Contoso University\Run MTM Auto Tests to ALMCLIENTWIN81 and configure the component as follows (the sketch after this list shows roughly what these settings drive):
- TestRun Title = Department Tests
- PlanId = 28 (or whatever your Plan ID is as it is likely to be different)
- SuiteId = 29 (or whatever your Suite ID is as it is likely to be different)
- ConfigId = 1 (or whatever your Configuration ID is as it is likely to be different)
- TfsCollection = http://almtfsadmin:8080/tfs/PrmCollection (or whatever your Team Collection URL is as it is likely to be different)
- TeamProject = ContosoUniversity
- TestEnvironment = ALMCLIENTWIN81
- Build Directory = $(PackageLocation)
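For the curious, the MTM Automated Tests Manager tool is, as far as I can tell, broadly equivalent to creating a test run with tcm.exe. Here's a hedged sketch using the values above -- the exact invocation is the tool's own business, and the build directory is illustrative because RM passes $(PackageLocation) at deploy time:

# Hedged sketch only: plan/suite/config IDs and URLs are the ones
# configured above; the build directory is illustrative.
$tcm = "${env:ProgramFiles(x86)}\Microsoft Visual Studio 12.0\Common7\IDE\tcm.exe"
& $tcm run /create /title:"Department Tests" /planid:28 /suiteid:29 /configid:1 `
    /collection:"http://almtfsadmin:8080/tfs/PrmCollection" /teamproject:"ContosoUniversity" `
    /testenvironment:"ALMCLIENTWIN81" /builddir:"C:\Drops\ContosoUniversity_Main_Nightly_20150214.8"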
Some Final Configuration
In order for automated tests to run I've found that as a minimum the Deployment Agent account (RMDEPLOYER) needs to be in the Project Collection Administrators group. It may be possible to fine tune this requirement but it's a level of trial and error I haven't had time to perform. This permission can be granted on your TFS admin machine from Team Foundation Server Administration Console > $(TfsAdminMachineName) > Application Tier > Team Project Collections > $(TeamProjectName) > General > Group Membership > [$(TeamProjectName)]\Project Collection Administrators > Properties.
If your Visual Studio solution contains a mix of unit test and automated test projects and the build definition is configured to run unit tests you are going to want to ensure the automated tests are excluded as they will fail and cause the build to report that it only partially succeeded. There are various ways to do this but the technique I've used is to modify all build definitions to change Process > Build Process Parameters > 3. Test > 1. Automated tests > 1. Test source > Test sources spec to unittest where it was previously just test. This obviously ties in with my project naming convention of ContosoUniversity.Web.UnitTests and ContosoUniversity.Web.AutoTests.
Start a Deployment
From Visual Studio manually queue a new build from ContosoUniversity_Main_Nightly. If everything is in place the build should succeed and you can open Microsoft Test Manager to check the results. Navigate to Testing Center > Test > Analyze Test Runs. You should see your test run listed and double-clicking it will open up the fruits of all our endeavours so far, the test run results:

Wrap-Up
If getting to this stage felt like it required a huge amount of detailed configuration you are probably right. There are a great many moving parts to get working correctly and any future simplification of the process by Microsoft will be welcome.
Although we've reached a major milestone on this continuous delivery journey there is still plenty more to talk about so do watch out for the next instalment.
Cheers -- Graham
Getting Started with Microsoft Test Manager
Microsoft Test Manager is the part of the TFS ecosystem that helps you test your application. Test Cases are organised into Test Suites which are all contained in a Test Plan. There's a wealth of automated capability for taking some of the drudgery out of manual testing and all manner of features for analysing and managing test results. That's only about 5% of what it does so be sure to check out these resources to learn about the full breadth of what MTM can do:
One thing to be aware of is that Team Web Access (the browser component of TFS) has a growing number of test features available directly in the browser. In some situations that may be all you need. See here for just one article that has more details.
Cheers -- Graham