Continuous Delivery with TFS: Building the Deployment Pipeline using an Agent-based Release Template

Posted by Graham Smith on January 13, 2015

In this instalment of my series on implementing continuous delivery with TFS we finally get to build the deployment pipeline with Release Management. This won't be a tutorial on how to use Release Management, so if you need to get up to speed with it I have a getting started post here. One point to note is that there are two ways to implement a deployment pipeline with Release Management: using Agent-based Release Templates, which use Release Management Actions, Tools and Components, and using vNext Release Templates, which leverage PowerShell DSC. This post focusses on Agent-based Release Templates; a future post will look at vNext Release Templates.

Administration Settings

We are going to be deploying to DAT and DQA environments so our starting point in Release Management is to navigate to Administration > Manage Pick Lists and add entries for DAT and DQA. Now add two new groups from Administration > Manage Groups called Development and Quality Assurance. There is a lot that can be done here to lock down each group to certain stages and activities but for our demo environment it's probably overkill, although feel free to configure away if you would like. The only configuration needed is to add yourself to each group.

Configure Paths Settings -- Servers and Environments

Next we need to confirm that the two servers (ALMWEB01 and ALMSQL01) that we will be deploying to are registered and Ready via Configure Paths > Servers. Still in the Configure Paths tab, move to the Environments page and configure two environments called Contoso University\DAT and Contoso University\DQA. For each environment link the ALMWEB01 and ALMSQL01 servers.

Configure Paths Settings -- Agent-based Release Paths

Now move to Configure Paths > Agent-based Release Paths and create a new path called Contoso University\DAT>DQA. Add two stages to it, one for DAT and another for DQA, and configure with the appropriate environments. Set all the approvals for the DAT stage to the Development group and make all the steps automated. Set all the approvals for the DQA stage to the Quality Assurance group and make all the steps manual. Additionally add Quality Assurance as an Approval Step. The result should be as follows:

[Screenshot: agent-based release path with approver step]

This will have the effect of making the DAT stage completely automated and, if the DAT stage is successful, leaving anyone in the Quality Assurance group able to manually accept a build into DQA. (We want this to be manual because if Quality Assurance is part way through some manual testing in DQA they probably don't want a new build automatically overwriting the one they are working with.)

Configure Apps Settings -- Components

Next up we need to create three Components: one to deploy the web site, one to deploy the DACPAC and one to run an SQL script that creates environment-specific logins and associated database users. (As an aside, we need to create a component any time the build location is required; a component is based on -- or inherits, if you like -- an existing tool. We also need to perform other tasks such as deleting files, and we do this using Actions. Actions are usually based on a tool but, since they typically take a simple parameter, they can be used directly.)

The first component we are going to make will deploy the web site. From Configure Apps > Components create a new component called Contoso University\Deploy Web Site and configure as follows:

  1. Source > Builds with application (= selected) > Path to package = \_PublishedWebsites\ContosoUniversity.Web
  2. Deployment > Tool = XCopy Deployer
  3. Configuration Variables:
    1. Variable Replacement Mode = After Installation
    2. File Extension Filter = *.config
    3. Parameter #1 = DATA_SOURCE | Standard | Connection String: Data Source
    4. Parameter #2 = INITIAL_CATALOG | Standard | Connection String: Initial Catalog
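
To make the variable replacement concrete, here is a sketch of what a tokenised connection string in Web.config might look like before deployment. (The connection string name SchoolContext is illustrative rather than taken from the solution; the important part is the double-underscore token format that the deployment agent's variable replacement looks for in *.config files.)

```xml
<!-- Illustrative Web.config fragment. After the files are copied, Release
     Management replaces the __DATA_SOURCE__ and __INITIAL_CATALOG__ tokens
     with the values of the matching configuration variables. -->
<connectionStrings>
  <add name="SchoolContext"
       connectionString="Data Source=__DATA_SOURCE__;Initial Catalog=__INITIAL_CATALOG__;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```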

The second component we need will deploy the DACPAC. From Configure Apps > Components create a new component called Contoso University\Deploy DACPAC and configure as follows:

  1. Source > Builds with application (= selected) > Path to package = \
  2. Deployment > Tool = DACPAC Database Deployer
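
If you ever want to check a DACPAC deployment by hand outside Release Management, the equivalent operation is a SqlPackage.exe publish along these lines (a sketch; the path to SqlPackage.exe varies with the SQL Server / DAC framework version installed):

```powershell
# Sketch: publish the DACPAC to the CU-DAT database on ALMSQL01 by hand.
& 'C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe' `
    /Action:Publish `
    /SourceFile:'ContosoUniversity.Database.dacpac' `
    /TargetServerName:'ALMSQL01' `
    /TargetDatabaseName:'CU-DAT'
```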

The third component will run an SQL script. From Configure Apps > Components create a new component called Contoso University\Run Login & User SQL Script (not an ideal name but length is limited) and configure as follows:

  1. Source > Builds with application (= selected) > Path to package = \Scripts
  2. Deployment > Tool = Database Deployer -- Execute Script
  3. Configuration Variables:
    1. Variable Replacement Mode = Before Installation
    2. File Extension Filter = *.sql
    3. Parameter #1 = LOGIN_OR_USER | Standard | Name of login or user to create
    4. Parameter #2 = DB_NAME | Standard | Database to set security for

Configure Apps Settings -- Agent-based Release Templates

We now use these three components in a release template. Create a new template called Contoso University\DAT>DQA from Configure Apps > Agent-based Release Templates and set the Release Path to Contoso University\DAT>DQA. Now edit the Build Definition, selecting ContosoUniversity as the Team Project and ContosoUniversity_Main_Nightly as the Build Definition. Lastly, before closing this dialog, check Can Trigger a Release from a Build?.

We are now presented with the ability to edit the Deployment Sequence for DAT as follows:

  1. Expand the Servers node of the Toolbox and drag ALMWEB01 to the Deployment Sequence area of the workflow designer.
  2. Expand the Windows OS node of the Toolbox and drag a Delete File(s) or Folder Action over to ALMWEB01, double-click it and set the FileFolderName parameter to C:\inetpub\wwwroot\CU-DAT\*.*. Use the breadcrumb trail to navigate back to ALMWEB01.
  3. Right-click the Components node of the Toolbox, link the components we created earlier, and drag Contoso University\Deploy Web Site to ALMWEB01 so it follows the delete action. Double-click it and set the parameters as follows:
    1. Installation Path = C:\inetpub\wwwroot\CU-DAT
    2. DATA_SOURCE = ALMSQL01
    3. INITIAL_CATALOG = CU-DAT
  4. Use the breadcrumb trail to navigate back to Deployment Sequence and drag ALMSQL01 to the workflow designer so it follows ALMWEB01.
  5. Drag the Contoso University\Deploy DACPAC component to ALMSQL01, double-click and set the parameters as follows:
    1. FileName = ContosoUniversity.Database.dacpac
    2. ServerName = ALMSQL01
    3. DatabaseName = CU-DAT
  6. Use the breadcrumb trail to navigate back to ALMSQL01 and drag the Contoso University\Run Login & User SQL Script component so it follows the Deploy DACPAC component, double-click and set the parameters as follows:
    1. ServerName = ALMSQL01
    2. ScriptName = Create login and database user.sql
    3. LOGIN_OR_USER = ALM\CU-DAT
    4. DB_NAME = CU-DAT
  7. Next create the Create login and database user.sql script as follows:
    1. In the ContosoUniversity solution navigate to the ContosoUniversity.Database project and add a folder called Scripts.
    2. Add a script file called Create login and database user.sql and set Copy to Output Directory = Copy always in the File Properties.
    3. Add tokenised T-SQL along the lines of the sketch shown after this list.
    4. Check this new file into version control.
  8. With the DAT stage configured we now want to configure the DQA stage. There is a great shortcut for this -- right-click the DAT tab and choose Copy Deployment Sequence.
    [Screenshot: Copy Deployment Sequence]
  9. Now right-click the DQA tab and choose Paste Deployment Sequence. Accept the confirmation dialog and the sequence appears in DQA. Each part of the sequence now needs to be changed to replace all instances of DAT with DQA. Notice how the Configuration Variables link can be clicked to open a viewer to check the configuration of different components for different stages.
    [Screenshot: Configuration Variables viewer]
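
Here is a minimal sketch of the sort of tokenised T-SQL that does the job for the Create login and database user.sql script. The double-underscore tokens match the LOGIN_OR_USER and DB_NAME configuration variables we defined for the component, and are replaced before the script runs because we set Variable Replacement Mode = Before Installation. (Granting db_owner is a convenient choice for a demo environment; you may well want something narrower.)

```sql
-- Sketch only: create a Windows login if it doesn't exist, then a matching
-- database user in the target database, and grant that user db_owner.
-- __LOGIN_OR_USER__ and __DB_NAME__ are replaced by Release Management
-- before the script is executed.
USE [master];
IF NOT EXISTS (SELECT 1 FROM sys.server_principals WHERE name = N'__LOGIN_OR_USER__')
    CREATE LOGIN [__LOGIN_OR_USER__] FROM WINDOWS;
GO

USE [__DB_NAME__];
IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = N'__LOGIN_OR_USER__')
    CREATE USER [__LOGIN_OR_USER__] FOR LOGIN [__LOGIN_OR_USER__];
ALTER ROLE [db_owner] ADD MEMBER [__LOGIN_OR_USER__];
GO
```
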
Ensuring Correct Permissions

With the DAT and DQA stages configured it's almost time to test the deployment but before we do we'll need to make sure that several permissions are in place:

  1. The first things to discuss are the deployment targets, i.e. ALMWEB01 and ALMSQL01. They both have the deployment agent service account (ALM\RMDEPLOYER) in the local administrators group, so performing operations in the file system and so on is all taken care of. However this won't be enough for SQL Server, where (for recent versions of SQL Server at least) local administrators do not automatically get rights in SQL Server. You'll need to add ALM\RMDEPLOYER as a login to SQL Server and then think very carefully about what permissions you grant this login. If a database doesn't exist then the dbcreator role will fix that, but on its own that role won't let the login do anything at a higher level; in this case adding the securityadmin role does the trick (see the sketch after this list), but if we were to add further functionality to the stage the permissions might need to change again. Clearly in a non-demo situation you will need to get your DBA involved right from the start of the continuous delivery adoption process so they fully understand what you are trying to achieve and can help find the best way to manage the database permissions side of things.
  2. The second area for discussion is the accounts that need to be added to Release Management to make everything work. The Release Management server is running under the ALM\TFSSERVICE account, so that will already be there as a Service User. This account does need the Make requests on behalf of others permission for the Project Collection -- details on how to set this are here. The two accounts that you will need to add are ALM\TFSBUILD (in the Service User role), as it needs to communicate with the Release Management server to start the deployment after the build completes, and ALM\RMDEPLOYER (also in the Service User role), as it too needs to communicate with the Release Management server.
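
For reference, registering the deployment account and granting it the two server roles discussed above boils down to something like this (a sketch using SQL Server 2012-and-later syntax; review the grants with your DBA before using anything similar outside a demo environment):

```sql
-- Sketch: add the deployment agent account as a login and grant the
-- dbcreator and securityadmin server roles discussed above.
USE [master];
CREATE LOGIN [ALM\RMDEPLOYER] FROM WINDOWS;
ALTER SERVER ROLE [dbcreator] ADD MEMBER [ALM\RMDEPLOYER];
ALTER SERVER ROLE [securityadmin] ADD MEMBER [ALM\RMDEPLOYER];
```
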
Testing the Deployment

At long last it's time to test the deployment. In Visual Studio navigate to Team Explorer > Builds and queue a build of ContosoUniversity_Main_Nightly. If everything has been configured correctly you should be able to observe the deployment progressing in the Release Management client from Releases > Releases. If all the components deploy successfully (only to DAT at the moment, since that is the only automated stage) you will want to check that the CU-DAT website works. From any server in the demo environment you should be able to browse to http://almweb01/CU-DAT and confirm that Contoso University is fully working.
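
As a quick alternative to a browser you can smoke-test the site from PowerShell (a sketch; it assumes the CU-DAT application responds on the default port):

```powershell
# A 200 status code suggests the site deployed and is serving pages.
Invoke-WebRequest -Uri 'http://almweb01/CU-DAT' -UseBasicParsing |
    Select-Object -ExpandProperty StatusCode
```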

Testing the Approvals Workflow

The final piece of the jigsaw as far as this post is concerned is the approvals workflow for the DQA stage. In the Release Management client navigate to Releases > My Approval Requests and notice that your successful release is at the DQA stage waiting to be approved. Click the Approve button and work your way past the confirmation dialog; the deployment to the DQA environment then starts. Back in Releases > My Approval Requests you will then see two more approval requests -- one to validate the deployment and one to approve it. Finally, back in Releases > Releases we see that the release is Released!

It's been quite a journey but that's it for this post. Watch out for future posts where we add extra functionality and features to the pipeline.

Cheers -- Graham

Continuous Delivery with TFS: Creating an All-in-One TFS Server

Posted by Graham Smith on December 7, 2014

In this third instalment of my series about creating a continuous delivery pipeline using TFS it's time to actually install TFS. In a production environment you will more than likely -- but not always -- split the installation of TFS across multiple machines, however for demo purposes it's perfectly possible, and actually preferable from a management perspective, to install everything onto one machine. Do bear in mind that by installing everything on one server you will likely encounter fewer issues (permissions come to mind) than if you were installing across multiple machines. The message here is that the speed of configuring a demo rig in Azure will bear little resemblance to the time it takes to install an on-premises production environment.

Ben Day maintains a great guide to installing TFS and you can find more details here. My recommendation is that you follow Ben's guide and as such I'm not planning to go through the whole process here. Rather, I will document any aspects that are different. As well as reading Ben's guide I also recommend reading the Microsoft documentation on installing TFS, particularly if you will ultimately perform an on-premises installation. See one of my previous posts for more information. One of the problems of writing installation instructions is that they date quite quickly. I've referred to the latest versions of products at the time of writing below, but feel free to use whatever is the latest when you come to do it.

  • Start by downloading all the bits of software from MSDN or the free versions if you are going down that (untried by me) route. At this stage you will need TFS2013.4 and VS2013.4. I tend to store all the software I use in Azure on a share on my domain controller for ease of access across multiple machines.
  • If you are following my recommendation and using a domain controller, the first step is to create the service accounts that you will use in the TFS installation. There is comprehensive guidance here but at a minimum you will need TFSREPORTS, TFSSERVICE and TFSBUILD; the PowerShell sketch after this list shows one way to create them. These are sample names and you can choose whatever you like of course.
  • The second step is to create your VM with reference to my Azure foundations post. Mine is called ALMTFSADMIN. This is going to be an all-in-one installation if you are following my recommendation in order to keep things simple, so a basic A4 size is probably about right.
  • Ben's guide refers to Windows Server 2012 because of SharePoint Foundation 2013's lack of support for Windows Server 2012 R2. This was fixed with SharePoint 2013 SP1, so you can safely create your VM as Windows Server 2012 R2 Datacenter. Having said that, you don't actually need SharePoint for what we're doing here so feel free to leave it out -- that probably makes sense, as SharePoint is a beast and might slow your VM down. Ben's guide is for an on-premises rather than Azure installation of Windows so some parts are not relevant and can obviously be skipped.
  • Early versions of TFS 2013 didn't support SQL Server 2014, and Ben's guide covers installing both SQL Server 2012 and 2014. You might as well go for the 2014 version unless you have reason to stick with 2012.
  • The TFS installation part of Ben's guide starts on page 99 and refers to TFS2013.2. The latest version as of this post is TFS2013.4 and you should go for that. As above my recommendation is to skip SharePoint.
  • Go ahead and install the build service. On a production installation you would almost never install the build service on the TFS application tier but it's fine in this case.
  • The build service (or more correctly the build agents) will need to build Visual Studio applications. The easiest way to enable this is to install Visual Studio itself -- VS2013.4 (whatever is the best SKU you are entitled to use) without the Windows 8 Phone components will do very nicely.
  • You can leave the test controller installation for the time being -- we will look at that in detail in a future post.
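
As promised above, here is a sketch of creating the three service accounts with the Active Directory PowerShell module on the domain controller (the account names are the samples used in this series; adjust the details for your own domain):

```powershell
# Sketch: create the TFSREPORTS, TFSSERVICE and TFSBUILD service accounts.
Import-Module ActiveDirectory

$password = Read-Host -AsSecureString -Prompt 'Password for the service accounts'
foreach ($name in 'TFSREPORTS', 'TFSSERVICE', 'TFSBUILD') {
    New-ADUser -Name $name `
               -SamAccountName $name `
               -AccountPassword $password `
               -PasswordNeverExpires $true `
               -Enabled $true
}
```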

When the installations are complete and the VM has been restarted you should be able to access the TFS Administration Console and check that all is in order. Congratulations -- your TFS admin box is up-and-running! Watch out for the next post where we create a Visual Studio development environment.

Cheers -- Graham

Getting started with Team Foundation Server and ALM

Posted by Graham Smith on December 5, 2014

Team Foundation Server is Microsoft's ecosystem (my term) that allows organisations to implement software application lifecycle management and continuous delivery. TFS consists of the core application, on top of which run various clients that perform specialised roles. There are many articles that explain the capabilities of TFS and why you would want to use it in preference to other tools (or not, as the case may be), and the purpose of this post isn't to go over that well-trodden ground. Rather, I'm assuming that TFS is your way forward and you are looking for ways to get started with it. I'll cover a couple of scenarios for the core product as follows:

You haven't yet implemented TFS and are looking for guidance on installing and configuring it:

TFS is installed and you want to learn how to use it to best effect:

Whilst some of the training courses I link to above are free from the Microsoft Virtual Academy, I make no apology for linking to Pluralsight courses, for which one needs a subscription. For Microsoft .NET -- and increasingly other -- developers a Pluralsight subscription is in my view an indispensable tool and excellent value for money.

Enjoy getting to grips with TFS -- it's a great piece of kit. Watch out for future Getting Started posts on the other applications that form part of the ecosystem.

Cheers -- Graham