Archives by Graham Smith

Continuous Delivery with TFS / VSTS – A Lap Around the Contoso University Sample Application

Posted by Graham Smith on January 27, 2016 | 4 Comments

In the previous post in this series on Continuous Delivery with TFS / VSTS we configured a sample application for Git in Visual Studio 2015. This post continues with that configuration and also examines some of the more interesting parts of Contoso University before firing it up and taking it for a spin.

Build and Update NuGets

Picking up from where we left off last time in Visual Studio, Team Explorer -- Home should be displaying ContosoUniversity.sln in the Solutions section. Double-click this to open Contoso University and navigate to Solution Explorer. The first thing to do is to hit F6 to build the solution. NuGet packages should be restored and the five projects should build. It's worth checking Team Explorer -- Changes at this point to make sure .gitignore is doing its job properly -- if it is there should be no changes as far as version control is concerned.

For non-production applications I usually like to keep bang up-to-date with the latest NuGets so in Solution Explorer right click the solution and choose Manage NuGet Packages for Solution. Click on the Updates link and allow the manager to do its thing. If any updates are found click to Select all packages and then Update:

visual-studio-manage-packages-for-solution

Commit (and sync if you like) these changes (see the previous post for a reminder of the workflow for this) and check the solution still builds.

Test the Unit Tests

The solution contains a couple of unit tests as well as some UI tests. To establish that the unit tests work at this stage, open Test Explorer (Test > Windows > Test Explorer) and then click on the down arrow next to Playlist : All Tests. Choose Open Playlist File, open ...\ContosoUniversity.Web.UnitTests\UnitTests.playlist and run the tests, which should pass, albeit with very poor coverage (Tests > Analyze Code Coverage > All Tests).

Code Analysis

Code analysis (a feature formerly known as FxCop) is enabled for every project and reports on any rule violations. I've set projects to use the Microsoft Managed Recommended Rules (note though that rule sets don't exist for SQL Server Database Projects and in the case of ContosoUniversity.Database everything is selected), however the code as it exists in GitHub doesn't violate any of those rules. To see this feature in action right-click the ContosoUniversity.Web project and choose Properties. On the Code Analysis tab change the rule set to Microsoft All Rules and build the solution. The Error List window should appear and there should be quite a few warnings -- 85 at the time of writing this post. Reverse the change and the solution should once again build without warnings.

SQL Server Data Tools -- A Better Way to Manage Database Changes

There are several ways to manage the schema changes for an application's database and whilst every technique undoubtedly has its place the one I recommend for most scenarios is the declarative approach. The simple explanation is that code files declare how you want the database to look and then a separate ‘engine' has the task of figuring out a script which will make a blank or existing database match the declaration. It's not a magic bullet since it doesn't cope with all scenarios out-of-the-box however there is usually a way to code around the trickier situations. I explained how I converted Contoso University from an Entity Framework Code First Migrations way of managing database changes to a declarative one using SQL Server Data Tools here. If you are new to SSDT and want to learn about it I have a Getting Started post here.

As far as Visual Studio solutions are concerned, the declarative approach using SSDT is delivered by way of SQL Server Database Projects. In Contoso University this is the ContosoUniversity.Database project. This contains CREATE scripts for tables and stored procedures as well as for permissions and security. Not only does it contain scripts that represent the database schema, it also contains scripts that can insert reference data in to a newly created database. The significance of this becomes apparent when you realise that a database project can be used to create a database in LocalDB, which can be made fully functioning by inserting reference data. In many cases this could remove the need for a ‘dev' environment, as the local workstation is the dev environment, database and all. This eliminates the problem of config files containing connection strings specific to developers (and the check-in problems that this can cause) because the connection string for LocalDB is generic. This way of working has the potential to wave goodbye to situations where getting an application working on a development machine requires a lengthy setup script and usually some magic.

To see this in action, within the ContosoUniversity.Database project right-click CU-DEV.publish.xml and choose Publish. In the Publish Database CU-DEV.publish.xml dialog click on Publish:

visual-studio-publish-database-to-localdb

Now make sure that the ContosoUniversity.Web project is set as the startup project (if it isn't shown in bold, right-click it and choose Set as StartUp Project) and hit F5 to run the application. A fully functioning ContosoUniversity should spring to life. Navigating to (for example) Departments should allow you to perform all CRUD activities. How's that for simplicity?

Tour Stops Here

That's it for our tour of Contoso University. Although there are automated UI tests in the solution I'm purposefully not covering them here as they are very much an advanced topic and I'm also planning some changes. That said, if you have Firefox installed on your development machine you can probably make them run. See my post in a previous blog series here.

In the next post we look at getting a basic CI build up-and-running. Happy coding until then!

Cheers -- Graham

Continuous Delivery with TFS / VSTS – Configuring a Sample Application for Git in Visual Studio 2015

Posted by Graham Smith on January 21, 2016 | No Comments

In this instalment of my blog post series on Continuous Delivery with TFS / VSTS we configure a sample application to work with Git in Visual Studio 2015 and perform our first commit. This post assumes that nothing has changed since the last post in the series.

Configure Git in Visual Studio 2015

In order to start working with Git in Visual Studio there are some essential and non-essential settings to configure. From the Team Explorer -- Home page choose Settings > Git > Global Settings. Visual Studio will have filled in as many settings as possible and the panel will look similar to this:

visual-studio-git-settings

Complete the form as you see fit. The one change I make is to trim Repos from the Default Repository Location (to try and reduce max filepath issues) and then from Windows Explorer create the Source folder with child folders named GitHub, TFS and VSTS. This way I can keep repositories from the different Git hosts separate and avoid name clashes if I have repositories with the same name in different hosts.

Clone the Repository

In the previous post we created ContosoUniversity (the name of the sample application we'll be working with) repositories in both TFS and VSTS. Before we go any further we need to clone the ContosoUniversity repository to our development workstation.

If you are working with TFS we finished-up last time with the Team Explorer -- Home page advising that we must clone the repository. However, at the time of writing Visual Studio 2015.1 seemed to have a bug (TFS only) where it was still remembering the PRM repository that we deleted and wasn't allowing the ContosoUniversity repository to be cloned. Frustratingly, after going down several blind avenues the only way I found to fix this was to temporarily recreate the PRM repository from the Web Portal. With this done clicking on the Manage Connections icon on the Team Explorer toolbar now shows the ContosoUniversity repository:

visual-studio-display-repository-for-cloning

Double-click the repository which will return you to the Team Explorer home page with links to clone the repository. Clicking one of these links now shows the Clone Repository details with the correct remote settings:

visual-studio-clone-repository-correct-settings

Make sure you change the local settings to the correct folder structure (if you are following my convention this will be ...\Source\TFS\ContosoUniversity) and click on Clone. Finally, return to the Web Portal and delete the PRM repository.

The procedure for cloning the ContosoUniversity repository if you are working with VSTS is similar but without the bug. However in the previous post we hadn't got as far as connecting VSTS to Visual Studio. The procedure for doing this is the same as for TFS however when you get to the Add Team Foundation Server dialog you need to supply the URI for your VSTS subscription rather than the TFS instance name:

visual-studio-add-team-foundation-server-for-vsts

With your VSTS subscription added you will eventually end up back at the Team Explorer -- Home page where you can clone the repository as for TFS but with a local path of ...\Source\VSTS\ContosoUniversity.

Add the Contoso University Sample Application

The Contoso University sample application I use is an ASP.NET MVC application backed by a SQL Server database. Its origins are the MVC Getting Started area of Microsoft's www.asp.net site, however I've made a few changes, in particular to how the database side of things is managed. I'll be explaining a bit more about all this in a later post but for now you can download the source code from my GitHub repository here using the Download ZIP button. Once downloaded unblock the file and extract the contents.

One of the key things about version control -- and Git in particular -- is that you want to avoid polluting it with files that are not actually part of the application (such as Visual Studio settings files) or files that can be recreated from the source code (such as binaries). There can be lots of this stuff but handily Git provides a solution by way of the .gitignore file and Microsoft adds icing to the cake by providing a version specifically tailored to Visual Studio. Although the sample code from my GitHub repository contains a .gitignore I've observed odd behaviours if this file isn't part of the repository before an existing Visual Studio solution is added so my technique is to add it first. From Team Explorer -- Home navigate to Settings > Git > Repository Settings. From there click on the Add links to add an ignore file and an attributes file:

visual-studio-ignore-and-attributes-files

Switch back to Team Explorer -- Home and click on Changes. These two files should appear under Included Changes. Supply a commit message and from the Commit dropdown choose Commit and Sync:

visual-studio-sync-ignore-and-attributes-files

Now switch back to Windows Explorer and the extracted Contoso University, and drill down to the folder that contains ContosoUniversity.sln. Delete .gitattributes and .gitignore and then copy the contents of this folder to ...\Source\TFS\ContosoUniversity or ...\Source\VSTS\ContosoUniversity depending on which system you are using. You should end up with ContosoUniversity.sln in the same folder as .gitattributes and .gitignore.

Back over to Visual Studio and the Team Explorer -- Changes panel and you should see that there are Untracked Files:

visual-studio-changes-untracked-files

Click on Add All and then Commit and Sync these changes as per the procedure described above.

That's it for this post. Next time we open up the Contoso University solution and get it configured to run.

Cheers -- Graham

Continuous Delivery with TFS / VSTS – Creating a Team Project and Git Repository

Posted by Graham Smith on January 12, 2016 | No Comments

In the previous post in this blog series on Continuous Delivery with TFS / VSTS we looked at how to commission either TFS or VSTS. In this seventh post in the series we continue from where we left off by creating a Team Project and then within that team project the Git repository that will hold the source code of our sample application. Before we do all that though we'll want to commission a developer workstation in Azure.

Commission a Developer Workstation

Although it's possible to connect an on premises developer workstation to either VSTS (straightforward) or our TFS instance (a bit trickier), to keep things realistic I prefer to create a Windows 10 VM in Azure that is part of the domain. Use the following guide to set up such a workstation:

  • If you are manually creating a VM in the portal it's not wholly clear where to find Windows 10 images. Just search for 10 though and they will appear at the top of the list. I chose Windows 10 Enterprise (x64) -- make sure you select to create in Resource Manager. (Although you can choose an image with Visual Studio already installed I prefer not to so I can have complete control.)
  • I created my VM as a Standard DS3 on premium storage in my PRM-CORE resource group, called PRM-CORE-DEV. Make sure you choose the domain's virtual network and the premium storage account.
  • With the VM created log in and join the domain in the normal way. Install any Windows Updates but note that since this is an Enterprise edition of Windows you will be stuck with the Windows version that the image was created with. For me this was version 10.0 (build 10240) even though the latest version is 1511 (build 10586) (run winver at a command prompt to get your version). I've burned quite a few hours trying to upgrade 10.0 to 1511 (using the 1511 ISO) with no joy -- I think this is because Azure doesn't support in-place upgrades.
  • Install the best version of Visual Studio 2015 available to you (for me it's the Enterprise edition) and configure as required. I prefer to register with a licence key as well as logging in with my MSDN account from Help > Register Product.
  • Install any updates from Tools > Extensions and Updates.
Create a Team Project

The procedure for creating a Team Project differs depending on whether you are using VSTS or TFS. If you are using VSTS simply navigate to the Web Portal home page of your subscription and select New:

vsts-new-team-project

This will bring up the Create New Team Project window where you can specify your chosen settings (I'm creating a project called PRM using the Scrum process template and using Git version control):

vsts-create-new-team-project

Click on Create project which will do as it suggests. That button then turns into a Navigate to project button.

If on the other hand you are using TFS you need to create a team project from within Visual Studio since a TFS project may have other setup work to perform for SSRS and SharePoint that doesn't apply to VSTS. The first step is to connect to TFS by navigating to Team Explorer and choosing the green Manage Connections button on the toolbar. This will display a Manage Connections link which in turn will bring up a popup menu allowing you to Connect to Team Project:

visual-studio-manage-connections

This brings up the Connect to Team Foundation Server dialog where you need to click the Servers button:

visual-studio-connect-to-team-foundation-server

This will bring up the Add/Remove Team Foundation Server dialog where you need to click the Add button:

visual-studio-add-remove-team-foundation-server

Finally we get to the Add Team Foundation Server dialog where we can add details of our TFS instance:

visual-studio-add-team-foundation-server

As long as you are using the standard path, port number and protocol you can simply type the server name and the dialog will build and display the URI in the Preview box. Click OK and work your way back down to the Connect to Team Foundation Server dialog where you can now connect to your TFS instance:

visual-studio-connect-to-team-foundation-server-instance

Ignore the message about Configure your workspace mappings (I chose Don't prompt again) and instead click on Home > Projects and My Teams > New Team Project:

visual-studio-new-team-project

This brings up a wizard to create a new team project. As with VSTS, I created a project called PRM using the Scrum process template and using Git version control. The finishing point of this process is a Team Explorer view that advises that You must clone the repository to open solutions for this project. Don't be tempted since there is another step we need to take!

Create a Repository with a Name of our Choosing

As things stand both VSTS and TFS have initialised the newly minted PRM team projects with a Git repository named PRM. In some cases this may be what you want (where the team project name is the same as the application for example) however that's not the case here since our application is called Contoso University. We'll be taking a closer look at Contoso University in the next post in this series but for now we just need to create a repository with that name. (In later posts we'll be creating other repositories in the PRM team project to hold different Visual Studio solutions.)

The procedure is the same for both VSTS and TFS. Start by navigating to the Web Portal home page for either VSTS or TFS and choose Browse under Recent projects & teams. Navigate to your team project and click on the Code link:

vsts-navigate-to-code-tab

This takes you to the Code tab which will entice you to Add some code!. Resist and instead click on the down arrow next to the PRM repository and then click on New repository:

vsts-create-new-repository

In the window that appears create a new Git repository called ContosoUniversity (no space).

In the interests of keeping things tidy we should delete the PRM repository so click the down arrow next to (the now selected) ContosoUniversity repository and choose Manage repositories. This takes you to the Version Control tab of the DefaultCollection's Control Panel. In the Repositories panel click on the down arrow next to the PRM repository and choose Delete repository:

vsts-delete-repository

And that's it! We're all set up for the next post in this series where we start working with the Contoso University sample application.

Cheers -- Graham

Version Control PowerShell Scripts with Visual Studio and Visual Studio Team Services

Posted by Graham Smith on January 6, 2016 | 3 Comments

It's a new year and whilst I'm not a big fan of New Year's resolutions I do try and use this time of year to have a bit of a tidy-up of my working environments and adopt some better ways of working. Despite being a developer who's used version control for years to manage application source code one thing I've been guilty of for some time now is not version controlling my PowerShell code scripts. Shock, horror I know -- but I'm pretty sure I'm not alone. In this post I'll be sharing how I solved this, but first let's take a quick look at the problem.

The Problem

Unless you are a heavyweight PowerShell user and have adopted a specialist editing tool, chances are that you are using the PowerShell ISE (Integrated Scripting Environment) that comes with Windows for editing and running your scripts. Whilst it's a reasonably capable editor it doesn't have integration points to any version control technologies. Consequently, if you want version control and you want to continue using the ISE you'll need to manage version control from outside the ISE -- which in my book isn't the seamless experience I'm used to. No matter, if you can live without a seamless experience the next question in this scenario is what version control technologies can be used? Probably quite a few, but since Git is the hot topic of the day how about GitHub -- the hosted version of Git that's free for public repositories? Ideal since it's hosted for you and there's the rather nice GitHub Desktop to make things slightly more seamless. Hang on though -- if you are like me you probably have all sorts of stuff in your PowerShell scripts that you don't want being publicly available on GitHub. Not passwords or anything like that, just inner workings that I'd rather keep private. Okay, so not GitHub. How about running your own version of Git server? Nah...

A Solution

If you are a Visual Studio developer then tools you are already likely using offer one solution to this problem. And if you aren't a Visual Studio developer then the same tools can still be used -- very possibly for free. As you've probably already guessed from the blog title, the tools I'm suggesting are Visual Studio (2015, for the script editing experience) and Visual Studio Team Services (VSTS, for version control). Whoa -- Visual Studio supports PowerShell as a language? Since when did that happen? Since Adam Driscoll created the PowerShell Tools for Visual Studio extension, that's since when.

The aim of this post is to explain how to use Visual Studio and VSTS to version control PowerShell scripts rather than understand how to start using those tools, so if you need a primer then good starting points are here for Visual Studio and here for VSTS. The great thing is that both these tools are free for small teams. If you want to learn about PowerShell Tools for Visual Studio I have a Getting Started blog post with a collection of useful links here. In my implementation below I'm using Git as the version control technology, so please amend accordingly if you are using TFVC.

Implementing the Solution

Now we know that our PowerShell scripts are going to be version controlled by VSTS the next thing to decide is where in VSTS you want them to reside. VSTS is based around team projects, and the key decision is whether you want your scripts located together in one team project or whether you want scripts to live in different team projects -- perhaps because they naturally belong there. It's horses for courses so I'll show both ways.

If you want your scripts to live in associated team projects then you'll want to create a dedicated Git repository to hold the Visual Studio solution. Navigate to the team project in VSTS and then to the Code tab. Click on the down arrow next to the currently selected repository and in the popup that appears click on New repository:

vsts-create-new-git-repo

A Create a new repository dialogue will appear -- I created a new Git repository called PowerShellScripts. You can ignore the Add some code! call to action as we'll address this from Visual Studio.

Alternatively, if you want to go down the route of having all your scripts in one team project then you can simply create a new team project based on Git -- called PowerShellScripts for example. The newly created project will contain a repository of the same name putting you in the same position as above.

The next step is to switch to Visual Studio and ensure you have the PowerShell Tools for Visual Studio 2015 extension installed. It's possible you do since it can be installed as part of the Visual Studio installation routine, although I can't remember whether it's selected by default. To check if it's installed navigate to Tools > Extensions and Updates > Installed > All and scroll down:

visual-studio-extensions-and-updates-powershell-tools

If you don't see the extension you can install it from Online > Visual Studio Gallery.

With that done it's time to connect to the team project. Still within Visual Studio, from Team Explorer choose the green Plug icon on the menu bar and then Manage Connections, and then Connect to Team Project:

visual-studio-manage-connections

This brings up the Connect to Team Foundation Server dialog which (via the Servers button) allows you to register your VSTS subscription as a ‘server' (the format is https://yoursubscriptionname.visualstudio.com). Once connected you will be able to select your Team Project.

Next up is cloning the repository that will hold the Visual Studio solution. If you are using a dedicated team project with just one Git repository you can just click the Home icon on the Team Explorer menu bar to get the cloning link on the Home panel:

visual-studio-clone-this-repository

If you have created an additional repository in an existing team project you will need to expand the list of repositories and double-click the one you previously created:

visual-studio-select-repository

This will take you directly to the cloning link on the Home panel -- no need to click the Home icon. Whichever way you get there, clicking the link opens up the settings to clone the repository to your local machine. If you are happy with the settings click Clone and you're done.

Solutions, Projects and Files

At the moment we are connected to a blank local repository, and the almost final push is to get our PowerShell scripts added. These will be contained in Visual Studio Projects that in turn are contained in a Visual Studio Solution. I'm a bit fussy about how I organise my projects and solutions -- I'll show you my way but feel free to do whatever makes you happy.

At the bottom of the Home tab click the New link, which brings up the New Project dialog. Navigate to Installed > Templates > Other Project Types > Visual Studio Solutions. I want to create a Blank Solution that is the same name as the repository, but I don't want a folder of the same name to be created which Visual Studio gives me no choice about. A sneaky trick is to provide the Name but delete the folder (of the same name) from the Location text box:

visual-studio-create-blank-solution

Take that Visual Studio! PowerShellScripts.sln now appears in the Solutions list of the Home tab and I can double-click it to open it, although you will need to manually switch to the Solution Explorer window to see the opened solution:

visual-studio-solution-explorer

The solution has no projects so right-click it and choose Add > New Project from the popup menu. This is the same dialog as above and you need to navigate to Installed > Templates > Other Languages > PowerShell and select PowerShell Script Project. At this point it's worth having a think about how you want to organise things. You could have all your scripts in one project, but since a solution can contain many projects you'll probably want to group related scripts in to their own project. I have a few scripts that deal with authorisation to Azure so I gave my first project the name Authorisation.Azure. Additional projects I might need are things like DSC.Azure and ARM.Azure. It's up to you and it can all be changed later of course.

The new project is created with a blank Script.ps1 file -- I usually delete this. There are several ways to get your scripts in -- probably the easiest is to move your existing ps1 scripts in to the project's folder in Windows Explorer, make sure they have the file names you want and then back in Visual Studio right-click the project and choose Add > Existing Item. You should see your script files and be able to select them all for inclusion in the project.

Don't Forget about Version Control!

We're now at the point where we can start to version control our PowerShell scripts. This is a whole topic in itself however you can get much of what you need to know from my Git with Visual Studio 2015 and TFS 2015 blog post and if you want to know more about Git I have a Getting Started post here. For now though the next steps are as follows:

  • In Team Explorer click on the home button then click Changes. Everything we added should be listed under Included Changes, plus a couple of Git helper files.
  • Add a commit comment and then from the Commit dropdown choose Commit and Sync:
    visual-studio-team-explorer-changes
  • This has the effect of committing your changes to the local repository and then syncing them with VSTS. You can confirm that from VSTS by navigating to the Code tab and selecting the repository. You should see the newly added files!

Broadly speaking the previous steps are the ones you'll use to check in any new changes that you make, either newly added files or amendments to scripts. Of course the beauty of Git is that if for whatever reason you don't have access to VSTS you can continue to work locally, committing your changes just to the local repository as frequently as makes sense. When you next have access to VSTS you can sync all the changes as a batch.

Finally, don't lose sight of the fact that as well as providing version control capabilities Visual Studio allows you to run and debug your scripts courtesy of the PowerShell Tools for Visual Studio 2015 extension. Do be sure to check out my blog post that contains links to help you get working with this great tool.

Cheers -- Graham

Getting Started with PowerShell Tools for Visual Studio

Posted by Graham Smith on January 5, 2016 | No Comments

If you are still using the PowerShell ISE to edit your PowerShell scripts then there may be a better way, particularly if you are already a Visual Studio user. Adam Driscoll's PowerShell Tools for Visual Studio extension has been around for some time now and is even better in Visual Studio 2015. Here is my pick of the best links to help you get started with this great tool:

Don't forget, if you don't already have access to Visual Studio you can download the community edition from here.

Cheers -- Graham

Continuous Delivery with TFS / VSTS – Commissioning TFS or VSTS

Posted by Graham Smith on December 13, 2015 | No Comments

[Please note that I've edited this post since first publishing it to reflect new information and / or better ways of doing things. See revision list at the end of the post.]

It's taken a few posts to get here but finally we have arrived at the point in my blog series on Continuous Delivery with TFS / VSTS where we actually get to start using TFS or VSTS -- depending on the route you choose to go down. The main focus of this post is on getting up-and-running with these tools but it's worth talking a bit first about the merits of each platform.

TFS is Microsoft's on premises ALM platform. It needs planning, provisioning of servers, and care and feeding when it's in production. With TFS you are in complete control and -- disasters notwithstanding -- all data stays in your network. VSTS on the other hand is Microsoft's hosted ALM platform. Pretty much everything is taken care of and you can be using VSTS within minutes of signing up. However apart from being able to choose from a couple of datacentres you don't have the same control over your data as you do with TFS. Of course VSTS is secure but if the nature of the business you are in puts restrictions on where your data can live then VSTS may not be an option.

In terms of features it's a mixed story. VSTS gets new features about every three weeks and TFS eventually catches up a few months later -- as long as you upgrade of course. However VSTS is missing some components that are present in TFS, notably reports (as in SQL Server Reporting Services, although Power BI looks like it will fill this gap) and SharePoint integration.

If there is no clear factor that helps you decide something else to bear in mind is that there is increasing adoption of VSTS inside Microsoft itself. So if it's good enough for them...

Getting Started with VSTS

It's trivially easy to get started with VSTS and the first five users are free, so there is almost no reason to not give it a try. The process is described here and within minutes you'll have your first project created. I'll be covering connecting to and working with projects in the next post in this series but if you can't wait I'll tell you now that I'll be using the Scrum process template and choosing Git for version control.

Getting Started with TFS

In contrast to VSTS getting started with TFS is quite a lengthy process. For an on premises installation you'll need to do some planning however for a POC environment in the cloud we can simplify things somewhat by using one VM to host everything. For some time now Ben Day has produced an excellent guide to installing TFS and you can find the 2015 version here. The guide pretty much covers everything you need to know in great detail and there's no point me repeating it. A high-level view of the process is as follows:

  • Create a VM in Azure. I opted for a Standard DS4 VM running Windows Server 2012 R2 Datacentre which I created in my PRM-CORE resource group as PRM-CORE-TFS using the previously created storage account and virtual network.
  • Configure a few Windows settings. The first part of Ben's guide covers installing Windows but you can skip that as it's taken care of for us by Azure. The starting point for us is the bit where Ben describes configuring a new server. One part you can skip is the Remote Desktop configuration as that's already set up for Azure VMs.
  • Install pre-requisites. Nothing much to comment on here -- Ben describes what's required with excellent screenshots.
  • Install SQL Server. Choose the latest supported version. For me this was SQL Server 2014 with SP1.
  • Install TFS 2015. Choose the latest version which for me was TFS 2015 with SP1. Be sure to create (and use!) the three domain accounts for the different parts of TFS. For me this was PRM\TFSSERVICE, PRM\TFSREPORTS and PRM\TFSBUILD. Make sure to check the User cannot change password and Password never expires settings when these accounts are created in Active Directory Users and Computers. Go with Ben's advice and skip setting up SharePoint. It's not critical and is an extra complication that's not needed.
    • One point to be aware of when installing TFS on an Azure VM is the installation wizard might suggest the D volume as the place for the file cache (D:\TfsData\ApplicationTier\_fileCache). This is a very bad thing on Azure Windows VMs as the D volume is strictly for temporary storage and you shouldn't put anything on this volume you aren't willing to lose. Make sure you change to use the C volume or some other attached disk if that's the route you've gone down.
  • Install Visual Studio 2015. Ben's guide doesn't cover this, but the easiest way to ensure all the components are available to build .NET projects is to install VS 2015 -- obviously the latest version.
    • I'm not sure if it's strictly necessary on build servers but I always licence VS installations (Help > Register Product > Unlock with a Product Key) just in case bad things happen after the 30-day trial period expires.
    • Another post-installation task I carry out is to update all the extensions from Tools > Extensions and Updates.
  • Install TFS Build. TFS Build isn't enabled as part of the core setup, which makes sense for scaled-out installations where other servers will take the build strain. (Don't forget that build has completely changed for TFS 2015, and whilst XAML builds are still supported the new way of doing things is the way forward.) For a POC installation you can get away with running build on the application tier, and installation is initiated from the Team Foundation Server Administration Console > $ServerName$ > Build:
    tfs-admin-console-build
    In reality this just opens the Team Portal Control Panel at the Agent pools tab. From here you need to deploy a Windows build agent. The details are described here and it's pretty straightforward. In essence you download a zip (that contains the agent plus a settings file) and unzip to a permanent location eg C:\Agent. Run ConfigureAgent.cmd from an admin command prompt and respond to the prompts (TFS URL will be similar to http://prm-core-tfs:8080/tfs and you want to install as a Windows Service using the build credentials previously created) and you are done. If you chose default settings you should end up with a new agent in the Default pool.

    • Since you'll almost certainly be needing the services of Nuget.exe it's a good idea to make sure it's at the latest version. Open an admin command prompt at C:\Program Files\Microsoft Team Foundation Server 14.0\Tools and run Nuget.exe update -self.

Compared with VSTS, setting up TFS was quite a bit of work -- several hours including all the installations and updates. Much of this activity will of course need to be repeated at every update. Not everyone will be able to go down the VSTS hosted route but if you can it could end up being quite a time saver.

Cheers -- Graham


Revisions:

9/1/2016 -- Updated with type of VM to reflect adoption of premium storage and expanded installing TFS build instructions.

Continuous Delivery with TFS / VSTS – Penny Pinching with Azure Automation

Posted by Graham Smith on December 3, 2015 | No Comments

Chances are the day will come when you inadvertently leave your VMs running overnight and you wake up the next day to find a) the complimentary monthly Azure credits associated with your MSDN subscription have been completely used up or b) your MSDN Dev/Test Pay-As-You-Go Azure subscription is facing whopping charges and you have some explaining to do. In this fifth instalment of my blog series on Continuous Delivery with TFS / VSTS we look at how to guard against these situations.

Turn on Billing Alerts

The first line of defence is to activate billing alerts and set up appropriate alerts (there is a maximum of five) for your subscription. My Visual Studio Enterprise with MSDN subscription gets £95 worth of credits a month so I have alerts as follows:

azure-subscriptions-billing-alerts

Each alert allows for two email addresses to be contacted when the appropriate threshold is triggered. The thresholds can either be Monetary Credits or Billing Total and you'll need to choose according to whether your subscription has complimentary credits or whether you are accumulating a debt.

The billing alerts feature isn't foolproof though -- at least not unless you have your phone by your bedside with audible notifications enabled. And it won't help you if you can't get to a computer to shut down your VMs. Enter stage left -- Azure Automation!

The Big Picture

Azure Automation allows you to do, er, automation in Azure. The net effect is a bit like Windows Task Scheduler, in that it allows you to create jobs that can be run on a schedule. The resemblance ends there though as setting up Azure Automation is nothing like setting up Task Scheduler. There are quite a few moving parts to configure to get automation up-and-running and at the time of writing it is further complicated because ARM cmdlets are not recognised in the out-of-the-box automation configuration. We'll be covering the steps needed to run a PowerShell script that will loop through all VMs and stop any that are not a domain controller, and then make a second pass to stop any domain controllers. In case you were wondering there's no magic in the domain controller recognition -- just that I name any domain controllers with a DC suffix and don't use DC for anything else.

Create an Organisational Account

Azure Automation doesn't work with Microsoft accounts (typically this is an account that uses your work email address and the one you use to log in to Azure and MSDN). Instead you need to create an organisational account in Azure Active Directory. The instructions for this are here and at the time of writing this functionality is only available in the classic portal. I created an account called Automation.

Create an Automation Account

This is a container to group automation resources. Assuming you are authenticated to Azure via PowerShell a new account is created as follows:
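A minimal sketch using the AzureRM.Automation cmdlets; the account name here is just an example, and the resource group and location follow the PRM-COMMON conventions from earlier posts:

    New-AzureRmAutomationAccount -ResourceGroupName "PRM-COMMON" -Name "PRMAutomation" -Location "West Europe"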

Obviously change the values of the parameters to suit your requirements.

Associate the Organisational Account with the Automation Account

As it stands our automation account doesn't have any permissions to interact with resources such as our VMs. This situation is remedied by associating the organisational account with the automation account. Your code should look similar to this:
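A sketch along these lines; the credential asset name (Automation, matching the organisational account created earlier), the automation account name and the password are placeholders to adjust for your own setup:

    # Store the organisational account as a credential asset in the automation account
    $password = ConvertTo-SecureString "ReplaceWithTheRealPassword" -AsPlainText -Force
    $orgAccount = New-Object System.Management.Automation.PSCredential ("automation@yourtenant.onmicrosoft.com", $password)
    New-AzureRmAutomationCredential -ResourceGroupName "PRM-COMMON" -AutomationAccountName "PRMAutomation" -Name "Automation" -Value $orgAccount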

Obviously I've changed a couple of things above for security reasons but you get the idea. If you want to observe the effect of the code above head over to the new portal and navigate to Automation Accounts > $YourAutomationAccount$ > Assets > Credentials. You should see that the organisational account has been added and (as we'll see below) can be referenced from a PowerShell script.

Add AzureRM Modules

At the time of writing a newly created automation account doesn't know anything about ARM, although by the time you read this that may have been fixed. By way of further explanation, in the new portal navigate to Automation Accounts > $YourAutomationAccount$ > Assets > Modules. What you see here are the modules that contain PowerShell cmdlets that you can call from the automation account. If this list doesn't have what you need you need to manually add whatever module is missing. In our case we need the AzureRM, AzureRM.Compute and AzureRM.profile modules. We can conveniently get these from the PowerShell Gallery.

At the gallery, search for and then navigate to the AzureRM page, and notice the handy Deploy to Azure Automation button:

powershell-gallery-deploy-to-azure-automation

Clicking this button sends you back to the Azure portal and sets you up with the beginnings of a Custom deployment. This is the technique of deploying to ARM using JSON templates, a technique I'll be using later in this series to create VMs for the continuous deployment pipeline. The template contains a number of parameters and other details that are supplied by working your way through the sections of the custom deployment panel:

azure-portal-custom-deployment

The screenshot above shows the Parameters section being edited. Be sure to save the section by clicking OK before moving to a new section. When all sections are complete clicking Create will deploy the module to the automation account. There isn't any real magic here -- if you look at the JSON template it is just uploading the module from its location on https://devopsgallerystorage.blob.core.windows.net/azureautomationpackages.

You'll need to repeat this process for AzureRM.Compute and AzureRM.profile. At the time of writing there was no button for AzureRM.Compute. No matter, you can achieve the same effect by pasting a URI similar to this in to your browser: https://www.powershellgallery.com/packages/AzureRM.Compute/1.1.0/DeployModuleToAzureAutomation. The preceding URI is specific to version 1.1.0 of AzureRM.Compute -- obviously you may need to tweak for a newer version.

Create a Runbook

At long last we get to the point of being able to do something useful, which in this case is to run a PowerShell script to stop all virtual machines -- member servers first and then any domain controllers. PowerShell scripts live inside a runbook which is created with the following code:
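Something like this, assuming the same resource group and automation account names as in the earlier snippets (the runbook name matches the one referred to below):

    New-AzureRmAutomationRunbook -ResourceGroupName "PRM-COMMON" -AutomationAccountName "PRMAutomation" -Name "StopAllVirtualMachines" -Type PowerShell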

With the runbook created the easiest way to test scripts is in the portal. Navigate to Automation Accounts > $YourAutomationAccount$ > Runbooks > StopAllVirtualMachines and click on the Edit button. This will take you to an editor that will allow you to write, test and finally publish your PowerShell script. The code I'm using is:
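Broadly the following; this is a sketch rather than the exact original -- the credential asset name and the DC suffix convention come from earlier in this post, the rest is assumed:

    # Authenticate inside the runbook using the credential asset added earlier
    $credential = Get-AutomationPSCredential -Name "Automation"
    Login-AzureRmAccount -Credential $credential

    # First pass: stop every VM that isn't a domain controller (DC suffix naming convention)
    Get-AzureRmVM | Where-Object { $_.Name -notlike "*DC" } | ForEach-Object {
        Stop-AzureRmVM -Name $_.Name -ResourceGroupName $_.ResourceGroupName -Force
    }

    # Second pass: stop the domain controllers
    Get-AzureRmVM | Where-Object { $_.Name -like "*DC" } | ForEach-Object {
        Stop-AzureRmVM -Name $_.Name -ResourceGroupName $_.ResourceGroupName -Force
    }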

Paste your version of the above in to the editor and click on Test pane. From here you can run the code and note any errors. It's then back to Edit to make any changes, which you should save before going back to the test pane. When all the kinks are ironed out click on Publish to make the runbook live.

[As an aside, I'm in two minds as to whether the above code is optimal because it doesn't check to see if a VM is already stopped (as my original version of this did) because of difficulties in determining the status when using the Get-AzureRmVM | ForEach-Object construct. However the Azure cmdlets are now idempotent so they don't really care and given this is running as a job in Azure rather than on my workstation I'm not too concerned about how long it takes.]

Create and Configure a Schedule

We want our runbook to run every day in case we accidentally forget to stop our VMs and for this we set up a Schedule using code similar to this:
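Something along these lines; the schedule name is an assumption and the other names follow the earlier snippets:

    New-AzureRmAutomationSchedule -ResourceGroupName "PRM-COMMON" -AutomationAccountName "PRMAutomation" -Name "StopAllVirtualMachinesDaily" -StartTime (Get-Date "01:00").AddDays(1) -DayInterval 1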

The preceding code sets up a schedule to start at 01:00 the next day. You'll need to remove .AddDays(1) if you are working just past midnight.

At the moment the schedule will run but it won't trigger anything. We need to associate the runbook with the schedule using the following code:
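A sketch, using the runbook and schedule names assumed above:

    Register-AzureRmAutomationScheduledRunbook -ResourceGroupName "PRM-COMMON" -AutomationAccountName "PRMAutomation" -RunbookName "StopAllVirtualMachines" -ScheduleName "StopAllVirtualMachinesDaily"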

As a final test it's probably a good idea to create a scratch A0 VM and leave it running to confirm next day that everything is working as expected.

One Final Tip

A final thought to finish on is concerned with how you will stop your VMs after a session working with them. It might be tempting to use a local version of the code in the runbook however even if you could justify the code duplication this isn't always a great idea when you have a large number of VMs to stop as you'll wait for ages. Instead you can invoke the runbook from PowerShell:
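Something like this (same assumed names as above); it queues the runbook as a job in Azure and returns almost immediately:

    Start-AzureRmAutomationRunbook -ResourceGroupName "PRM-COMMON" -AutomationAccountName "PRMAutomation" -Name "StopAllVirtualMachines"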

The beauty of this technique is that execution only takes seconds so you don't have to wait for the ISE (or whatever you are using) to finish its business. Of course if you don't have access to PowerShell you can always use the portal.

Cheers -- Graham

Continuous Delivery with TFS / VSTS – Installing a Domain Controller

Posted by Graham Smith on November 25, 2015 | 3 Comments

[Please note that I've edited this post since first publishing it to reflect new information and / or better ways of doing things. See revision list at the end of the post.]

This fourth blog post in my series on Continuous Delivery with TFS / VSTS picks up from the previous post where we created some common infrastructure and moves on to installing a domain controller. If that seems a little over-the-top bear in mind that one of the aims of this series of blog posts is to help organisations with a traditional on premises TFS installation implement continuous delivery. Typically a domain controller running Active Directory is going to be part of the mix.

Create the PRM-CORE Resource Group

I'm planning to create my enduring servers in a resource group called PRM-CORE so the first step is to create the group:
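A one-liner; the location is just an example (West Europe is what I use elsewhere in this series), so adjust to wherever your other resources live:

    New-AzureRmResourceGroup -Name "PRM-CORE" -Location "West Europe"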

Create the Domain Controller

My plan to create the domain controller in PowerShell came to an abrupt halt when the code bombed with an error that I couldn't fix. When I first started writing this post there was a bug in the new-style Azure PowerShell cmdlets that stops a VM from being created where the storage account already exists in a different group from the one the VM will be created in. With a newer version of the cmdlets this has now changed to a warning: WARNING: Storage account, prmstorageaccount, is not found. The OS disk may be in a different storage group. As far as I can tell, despite the message VMs are now created correctly. Anyway, all this was too late for me as I had already created the VM (called PRM-CORE-DC) via the portal. If you go down this route do make sure you set the DNS name label of the public IP address to the name of the VM. (See my post here for more details about why you should do this.) Other than that gotcha creating a VM in the portal is pretty straightforward but don't forget to specify the already-created premium storage account (if you have decided to go down the premium route as I have), virtual network and the resource group created above. I created my DC as a Standard DS2 (since I'm planning for it to be doing quite a lot of work) running Windows 2012 R2 Datacenter. Previously my DC would be configured as a Standard A0 and I would leave it turned on (which costs pennies per day) but the DS2 burns through credits much faster so I'll be turning it off. This will all be scripted so the DC can be shut down last (and started up first) and I'll also be showing how to automate this in case a lapse of memory leaves your VMs turned on.

Preparing for the Domain Controller Role

Probably the first thing to know about creating a domain controller in Azure is that it always needs to have the same internal IP address. If you never turn it off then that will work but the recommendation is to set a static internal IP address -- the first available one for the virtual network we are using is 10.0.0.4. You can do this with the following PowerShell, assuming the VM is turned off and the target IP address isn't already in use:
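A sketch; the network interface name is an assumption (check the actual NIC name in the PRM-CORE resource group first):

    # Set the DC's network interface to a static private IP of 10.0.0.4
    $nic = Get-AzureRmNetworkInterface -Name "prm-core-dc-nic" -ResourceGroupName "PRM-CORE"
    $nic.IpConfigurations[0].PrivateIpAllocationMethod = "Static"
    $nic.IpConfigurations[0].PrivateIpAddress = "10.0.0.4"
    Set-AzureRmNetworkInterface -NetworkInterface $nic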

You can also do this via the new portal. With your VM shut down navigate to its network interface then to Settings > IP addresses. From there you can make the IP address static and set it to 10.0.0.4:

azure portal static ip address

The second thing to know about setting up a domain controller in Azure is that if the AD DS database, logs, and SYSVOL are not stored on a non-OS disk there is a risk of losing data. For a lightly used POC environment I'm happy to take the risk but if you are not you'll need to add a data disk to your VM and specify this as the location for the AD DS database, logs, and SYSVOL during the installation of the AD role.

Installing Active Directory

This isn't a whole lot different in Azure from an on-premises installation, although there is one crucial step (see below) particular to Azure to ensure success. Of course if you are a developer you may not be installing AD on a regular basis so my previous statement may be less than helpful. Fear not as you can find a complete rundown of most of what you need to know here. In essence though the steps are as follows:

  • Before installing AD in Azure you need to temporarily set the virtual network to a Custom DNS of 127.0.0.1 for the Primary DNS server setting. See this post for more details. It's crucial to do this before you start installing AD.
  • Install the Active Directory Domain Services role via Server Manager > Add roles and features.
  • The wizard steps are mostly straightforward but if you are unfamiliar with the process it may not be obvious that since we are starting from scratch you need to select Add a new forest on the Deployment Configuration step of the wizard.
  • You'll also need to specify a Root domain name. I chose prm.local.
  • With your VM restarted make sure you complete the Reset the DNS server for the Azure virtual network instructions in the documentation. Essentially this is replacing the temporary 127.0.0.1 primary DNS server setting with the one for the DC, ie 10.0.0.4.
  • With AD up-and-running you'll probably want to navigate to Server Manager > Tools > Active Directory Users and Computers and create a user which you'll use to log on to servers when they have been added to the domain. It's not a best practice but you might find it useful if this user was in the Domain Admins group.

That's it for this time. Use your Domain Admin powers wisely!

Cheers -- Graham


Revisions:

12/12/2015 -- Replaced adding a DNS forwarder pointing to Google's DNS server with setting the virtual network's Secondary DNS Server to 168.63.129.16 to allow access to the Internet.

2/1/2016 -- Updated to reflect my adoption of premium storage and also to remove the change above and replace with a crucial technique for ensuring that the DNS roots hint list on the DC is populated correctly.

 

Continuous Delivery with TFS / VSTS – Laying Foundations in Azure

Posted by Graham Smith on November 19, 2015 | No Comments

[Please note that I've edited this post since first publishing it to reflect new information and / or better ways of doing things. See revision list at the end of the post.]

In the previous post in this series on Continuous Delivery with TFS / VSTS we started working with the new-style Azure PowerShell cmdlets. In this post we use the cmdlets to lay some foundational building blocks in Azure.

Resource Groups

One of the new concepts in Azure Resource Manager (ARM) is that all resources live in Resource Groups which are really just containers. Over time best practices for resource groups will undoubtedly emerge but for the moment I'm planning to use resource groups as follows:

  • PRM-COMMON -- this resource group will be the container for shared infrastructure such as a storage account and virtual network and is the focus of this post.
  • PRM-CORE -- this resource group will be the container for enduring servers such as the domain controller and the TFS server.
  • PRM-$ENV$ -- these resource groups will be containers for servers that together form a specific environment. The plan is that these environments can be killed-off and completely recreated through ARM's JSON templates feature and PowerShell DSC.

I'm using PRM as my container prefix but this could be anything that suits you.

Creating PRM-COMMON

Once you have logged in to Azure PowerShell, creating PRM-COMMON is straightforward:
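A one-liner, assuming West Europe as the location (adjust to taste):

    New-AzureRmResourceGroup -Name "PRM-COMMON" -Location "West Europe"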

Note that you need to specify a location for the resource group -- something to consider as you'll probably want all your groups close together. If you want to visually verify the creation of the group head over to the Azure Portal and navigate to Resource groups.

Create a Storage Account

In order to keep things simple I prefer just one storage account for all of the virtual hard disks for all the VMs I create, although this isn't necessarily a best practice for some production workloads such as SQL Server. One thing that is worth investigating is premium storage, which uses SSDs rather than magnetic disks. I'd originally discounted this as it looked like it would be expensive against my Azure credits however after experiencing how VS 2015 runs on standard storage (slow on a Standard A4 VM) I investigated and found the extra cost to be marginal and the performance benefit huge. I recommend you run your own tests before committing to premium storage in case you are on a subscription where the costs may not outweigh the performance gains, however I'm sold and most of my VMs from now on will be on premium storage. To take advantage of premium storage you need a dedicated premium storage account:
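Something like the following; the account name is an example (lowercase and globally unique), and depending on your version of the AzureRM module the SKU parameter may be -Type or -SkuName:

    New-AzureRmStorageAccount -ResourceGroupName "PRM-COMMON" -Name "prmstorageaccountp" -Location "West Europe" -Type "Premium_LRS"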

Even though the resource group is in West Europe you still need to supply the location to the New-AzureRmStorageAccount cmdlet, and note that I've added a ‘p' on the end of the name to indicate the account is a premium one as I may well create a standard account for other purposes.

Create a Virtual Network

We also need to create a virtual network. Slightly more complicated but only just:
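A sketch with an assumed name and address space; the single subnet leaves 10.0.0.4 free for the domain controller created later in the series:

    $subnet = New-AzureRmVirtualNetworkSubnetConfig -Name "Subnet-1" -AddressPrefix "10.0.0.0/24"
    New-AzureRmVirtualNetwork -Name "prmvirtualnetwork" -ResourceGroupName "PRM-COMMON" -Location "West Europe" -AddressPrefix "10.0.0.0/16" -Subnet $subnet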

Create a SendGrid Account for Email Services

In later posts we'll want to configure some systems to send email notifications. The simplest way to achieve this in Azure is to sign up for a free SendGrid email service account from the Azure Marketplace. At the time of writing the documentation is still for the classic Azure portal which doesn't allow the SendGrid account to be created in a resource group. The new portal does though so that's the one to go for. Navigate to Marketplace > Web + Mobile and search for sendgrid to display SendGrid Email Delivery, which is what you want. Creating an account is straightforward -- I called mine PRMSMTP and created it in the PRM-COMMON resource group. Make sure you choose the free option.

Using Azure Resource Explorer

As an alternative to using the portal to examine the infrastructure we have just created we can also use the browser-based Azure Resource Explorer. This very nifty tool allows you to drill down in to your Azure subscriptions and see all the resources you have created and their properties. Give it a go!

Cheers -- Graham


Revisions:

31/12/15 -- Recommendation to investigate using premium storage.

 

Continuous Delivery with TFS / VSTS – A New Way of Working with Azure Resource Manager

Posted by Graham Smith on November 12, 2015 | 10 Comments

In this second post in my Continuous Delivery with TFS / VSTS series it's time to make a fresh start in Microsoft Azure. Whaddaya mean a fresh start? Well for a little while now there has been a new way to interact with Azure, namely through a feature known as Azure Resource Manager or ARM. This is in contrast to the ‘old' way of doing things which is now referred to as Azure Service Management or ASM. As I mention in a previous blog post ARM is the way of the future and this new series of blog posts is going to be entirely based on using ARM using the new portal (codename Ibiza) where portal interaction is necessary. I have lots of VMs created in ASM but the plan is to clear those down and start again.

However, I'm a big fan of using PowerShell to work with Azure at every possible opportunity and a further reason for making a fresh start is that there is a new set of Azure PowerShell cmdlets for working with ARM (to avoid naming clashes with ASM cmdlets). To me it makes sense to start a new series of posts based on this new functionality.

As usual, the aim of this post isn't to teach you foundational concepts and if you need an introduction to ARM I have a Getting Started blog post with a collection of links to help you get going. Be aware that most of the resources pre-date the arrival of the new Azure PowerShell cmdlets. In the rest of this post we'll look at how to get up-and-running with the new cmdlets.

Install the new Azure PowerShell Cmdlets

First things first you'll need to install the new-style Azure PowerShell cmdlets. At the time of writing these were in preview, and the important thing to note is that the new version (1.0 or later) introduces breaking changes so do consider which machine you are installing them on. In the fullness of time we will be able to perform the installation from Web Platform Installer but initially at least it's a manual process. Details are available from the announcement blog here and Petri has a nice set of instructions as well.

Logging in has Completely Changed

If you have been used to logging in to Azure using the publish settings file method then you need to be aware that this method simply will not work with ARM since certificate-based authentication isn't supported. Instead you use the Login-AzureRmAccount cmdlet, which causes the Sign in to Microsoft Azure PowerShell dialog to display. What happens next depends on the type of account you attempt to log in with. If you use a Microsoft Account (typically this is the account your MSDN subscription is associated with) the dialog will recognise this and redirect you to a Sign in to your Microsoft account dialog. If you log in with an Azure AD account you are logged straight in -- assuming authentication is successful of course.

After a successful login you'll see the current ‘settings' for the login session. If you only have one Azure subscription associated with your login you are good to go. If you have more than one you may need to change to the subscription you want to use. The Get-AzureRMSubscription cmdlet will list your subscriptions and then there are a couple of options for changing subscription:
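Either by name or by ID, along these lines (the subscription name is mine and the GUID is a placeholder):

    Select-AzureRmSubscription -SubscriptionName "Visual Studio Enterprise with MSDN"
    Select-AzureRmSubscription -SubscriptionId "11111111-2222-3333-4444-555555555555"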

If using the second version obviously replace with your GUID. In case you were wondering the one above is made up...

The ASM version of Select-AzureRmSubscription takes a -Default parameter to set the default subscription but this seems to be missing in the ARM version -- hopefully only a temporary thing.

But I Don't Want to Type my Password Every Time I use Azure

When you log in using Login-AzureRmAccount it seems a token is set which expires after a period of time -- about 12 hours according to this source. This means that you are going to be logging in manually quite frequently which can get to be a chore and in any case is of little use in automated scripts. There is an answer although it doesn't feel as elegant as the publish settings file method.

The technique involves firstly saving your password to disk in encrypted format (a one-time operation) and then using your login and the encrypted password to create a pscredential object that can be used with the -Credential parameter of Login-AzureRmAccount. All the details you need are explained here however do note that this technique only works with an Azure AD account and also be aware that the PowerShell is pre new-style cmdlets. The resulting new-style code ends up something like this:
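A sketch; the file path and account name are placeholders, and the subscription ID is the made-up one from the earlier snippet:

    # One-time: save the password to disk in encrypted form (uses DPAPI, so it only decrypts for this user on this machine)
    Read-Host "Enter password" -AsSecureString | ConvertFrom-SecureString | Out-File "C:\Secure\azure-password.txt"

    # Every subsequent session: rebuild the credential and log in without being prompted
    $password = Get-Content "C:\Secure\azure-password.txt" | ConvertTo-SecureString
    $credential = New-Object System.Management.Automation.PSCredential ("automation@yourtenant.onmicrosoft.com", $password)
    Login-AzureRmAccount -Credential $credential -SubscriptionId "11111111-2222-3333-4444-555555555555"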

If you only have one Azure subscription you can of course simplify the above snippet by removing the subscription details. Is it a good idea to store even an encrypted password on disk? It doesn't feel good to me but it seems for the moment that this is what we need to use. The smart money is probably on using an Azure AD account with very limited privileges and then adding permissions to the account as required. Do let me know in the comments if a better technique emerges!

Cheers -- Graham