Continuous Delivery with VSO: Configuring Release Management
In this post in my blog series on continuous delivery with VSO we look at configuring Release Management for Visual Studio. RM is part of the TFS ecosystem and is used to deploy our code to the different environments that constitute the delivery pipeline. It was originally built to work with TFS; however, the 2013.4 version released in November 2014 now works with VSO as well. Inevitably, of course, I'm going to be comparing how RM with VSO stacks up against RM with TFS.
Setting the Scene
From now on in this series of blog posts I'm going to assume that you are working in Azure and have a setup that resembles the one I created for my Continuous Delivery with TFS series of posts. If you are starting from scratch and need to catch up then these are the posts that can help:
- Getting Started with Microsoft Azure
- Continuous Delivery with TFS: Laying the Azure Foundations
- Continuous Delivery with TFS: Creating a Domain Controller
- Continuous Delivery with TFS: Provisioning a Visual Studio Development Machine
- Continuous Delivery with TFS: Pausing to Consider the Big Picture
- Continuous Delivery with TFS: Our Sample Application
One of the big advantages of RM-VSO is that there is no need to run a TFS instance. Additionally there is no need to run an RM server instance or Deployment Agents on target nodes, since this is all taken care of -- either behind the scenes in the case of the RM server or by using a different technique in the case of deploying to target nodes. Whilst the RM-VSO offering reduces the number of moving parts (which is good) it also imposes restrictions. As an example, RM-TFS allows us to reuse deployment VMs in different environments. In contrast RM-VSO doesn't allow this, and consequently a multi-tenant model (e.g. one IIS machine hosting multiple websites) isn't possible, at least not without a substantial amount of jiggery-pokery. Does this matter? It depends... For a demo environment fewer VMs is preferable if you need to preserve your Azure credits, but in a production setup you probably want separate VMs anyway. There is an easy -- if inelegant -- workaround for those who want to preserve Azure credits and I describe this below.
Configuring Azure to Work with RM
Our initial pipeline will consist of two environments: DAT (Development Automated Test) and DQA (Development Quality Assurance). Our Contoso University sample application has a web component and a database component so we'll need the services of IIS and SQL Server. With RM-TFS these can be dedicated web and database VMs that host multiple websites and databases but, as mentioned above, this isn't possible out of the box with RM-VSO. An additional requirement is a one-to-one mapping between RM-VSO environments and Azure cloud services. To work around all this we'll use VMs that each host both IIS and SQL Server. A bit hacky for a demo setup but what to do? The procedure for setting all this up is as follows:
- In the Azure portal create two new cloud services to host VMs for each RM-VSO environment. I called mine datcloudservice.cloudapp.net and dqacloudservice.cloudapp.net -- you'll need to choose unique names for your services.
- Now create two new VMs -- one in each cloud service. I called mine ALMWEBDB01 and ALMWEBDB02. The good news is that despite being in different cloud services these servers can be in the same virtual network, affinity group and storage account. This keeps everything neat and tidy and also means the servers can be part of your domain if you have set one up.
- Both of these servers need to have IIS and SQL Server installed. This is fairly standard stuff so I won't be covering it here. One note of caution: to preserve Azure credits, be sure to install SQL Server from scratch rather than use a gallery image with SQL Server pre-installed, as the latter technique is much more costly.
- These servers also need an account added to the local administrators group that will be used in the deployment process. I used the RMDEPLOYER domain account that was set up for Deployment Agents to use in agent-based deployments. In addition RMDEPLOYER will need a login for SQL Server and appropriate permissions. The easy path in a demo environment is to grant sysadmin, but clearly that may be unwise in production. A scripted sketch of these provisioning steps appears after this list.
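If you would rather script the Azure side of this than click through the portal, the following sketch shows the general shape using the classic (ASM) Azure PowerShell module that was current at the time of writing. All of the names -- subscription, storage account, affinity group, virtual network, subnet, credentials and domain -- are placeholders, so substitute your own and treat this as a starting point rather than a finished script.

```powershell
# Sketch only: classic (ASM) Azure PowerShell module. All names below are placeholders.
Import-AzurePublishSettingsFile "C:\Temp\MySubscription.publishsettings"
Select-AzureSubscription -SubscriptionName "MySubscription"
Set-AzureSubscription -SubscriptionName "MySubscription" -CurrentStorageAccountName "mystorageaccount"

# Pick the latest plain Windows Server 2012 R2 gallery image (no SQL Server pre-installed)
$image = Get-AzureVMImage |
    Where-Object { $_.ImageFamily -eq "Windows Server 2012 R2 Datacenter" } |
    Sort-Object PublishedDate -Descending |
    Select-Object -First 1

# One cloud service and one IIS + SQL Server VM per RM-VSO environment
$nodes = @(
    @{ Service = "datcloudservice"; Vm = "ALMWEBDB01" },
    @{ Service = "dqacloudservice"; Vm = "ALMWEBDB02" }
)

foreach ($node in $nodes) {
    New-AzureService -ServiceName $node.Service -AffinityGroup "MyAffinityGroup"

    $vmConfig = New-AzureVMConfig -Name $node.Vm -InstanceSize Small -ImageName $image.ImageName |
        Add-AzureProvisioningConfig -Windows -AdminUsername "localadmin" -Password "P@ssw0rd!" |
        Set-AzureSubnet -SubnetNames "Subnet-1"

    New-AzureVM -ServiceName $node.Service -VMs $vmConfig -VNetName "MyVirtualNetwork"
}
```

Once SQL Server is on each box, IIS and the RMDEPLOYER permissions can be sorted out from an RDP session on the VM itself (sysadmin is fine for a demo but think twice before doing that in production):

```powershell
# Run on each VM (ALMWEBDB01 / ALMWEBDB02) after SQL Server has been installed
Install-WindowsFeature Web-Server -IncludeManagementTools
net localgroup Administrators "YOURDOMAIN\RMDEPLOYER" /add
sqlcmd -S . -Q "CREATE LOGIN [YOURDOMAIN\RMDEPLOYER] FROM WINDOWS; ALTER SERVER ROLE [sysadmin] ADD MEMBER [YOURDOMAIN\RMDEPLOYER];"
```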
The other VM that is core to all this is your developer workstation running Visual Studio, Release Management and Microsoft Test Manager. See above for the link to getting this machine configured if necessary.
Connect Release Management to VSO
I'm making the assumption here that you already have the RM client connected to TFS and want to connect it to VSO; if you have a new install of the RM client the steps will be similar. You'll need to start an already configured RM client with your TFS instance up and running, otherwise it just chokes. To switch over from TFS to VSO navigate to Administration > Settings > System Settings and click on the Edit link at the end of the Release Management Server URL setting:
In the Configure Services dialog that appears add the URL of your VSO account, i.e. https://myaccount.visualstudio.com. You'll probably be prompted to enter credentials, after which you'll be prompted to allow the client to restart. When it does you have an instance of the client 're-branded' for VSO, by which I mean there are some changes to the user interface to reflect the difference between the features supported by TFS and VSO. One immediately obvious difference is that there is no place to specify SMTP settings, as VSO handles all that.
Connect Release Management to Azure and Configure an Environment
One key difference between RM-VSO and RM-TFS is that RM-VSO can only deploy to Azure VMs. In order to allow this you must configure RM with your Azure subscription:
- Download a text file containing your Azure subscription settings from here.
- From Administration > Manage Azure click on New and fill in the Name, Subscription ID and Management Certificate Key from the text file. Pay particular attention if you have more than one Azure subscription, and note that for the Management Certificate Key you want everything between the quotes (the snippet after this list can help pick out the right values). Get the appropriate Storage Account Name from here. Consider deleting the Azure subscription settings file when you are finished with it, for security purposes.
- Create DAT and DQA stages from Administration > Manage Pick Lists. See here for my TFS equivalent post.
- From Configure Paths > Environments click on New vNext: Azure to create a new environment and click Link Azure Environment to bring up the Azure Environments dialog. Select your Azure subscription and then use the Link button to link the DAT cloud service.
- With the environment created click on Link Azure Servers to link the VM hosted in the DAT cloud service:
- Note that you can't change the name of the environment -- it is fixed as the name of the cloud service.
- Now repeat the process for the DQA cloud service, after which you should have two environments:
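As an aside, if copying the Management Certificate Key out of the subscription settings file by hand feels error-prone, a few lines of PowerShell can list the values RM asks for. This is just a sketch: it assumes the newer publish settings schema where the Id and ManagementCertificate attributes live on each Subscription element, and the file path is a placeholder.

```powershell
# Sketch: list the Subscription ID and Management Certificate Key values from the
# downloaded .publishsettings file (assumes the schema where these are attributes
# of each Subscription element). Delete the file when you're done with it.
[xml]$publishSettings = Get-Content "C:\Temp\MySubscription.publishsettings"

foreach ($subscription in $publishSettings.PublishData.PublishProfile.Subscription) {
    [pscustomobject]@{
        Name                     = $subscription.Name
        SubscriptionId           = $subscription.Id
        ManagementCertificateKey = $subscription.ManagementCertificate
    }
}
```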
Configure a vNext Release Path
With the environments created we can create a release path. Navigate to Configure Paths > vNext Release Paths and create a new path called Contoso University\DAT>DQA. Add two stages to it (one for DAT and another for DQA) and configure them with the respective environments. You will need to add yourself or another user to the approvals workflow, as the concept of groups isn't available in RM-VSO. Additionally the DAT workflow should be automated. You should end up with something similar to this:
Again there are differences between the VSO version and the TFS version: for some reason the toggle email notification icons are missing from the VSO version. Other than that, creating a release path with RM-VSO is very similar to RM-TFS.
Until Next Time
That's as far as we are going in this post. Next time we'll configure the actual release template and get to grips with using PowerShell scripts to deploy our components.
Cheers -- Graham