Continuous Delivery with Containers – Say Goodbye to IIS Express and LocalDB, with Visual Studio 2017, Docker and Windows Containers

Posted by Graham Smith on May 10, 2017

A view I've heard expressed a few times recently, and which I completely agree with, is that we need to be discovering problems with our applications as far to the left as possible, since it's much cheaper to fix problems there than further down the line towards, or even in, production. So with this in mind, is it just me who feels slightly uneasy that in the Visual Studio world the development and debugging of applications destined for Windows servers tends to be done on Windows desktop machines, using lightweight counterparts of server applications such as IIS Express to host ASP.NET websites and LocalDB to host SQL Server databases? With this setup it seems like we could be storing up trouble for later in the pipeline...

Whether my unease is justified or not, I need feel troubled no more because the world of containers offers us a solution! Now that Docker for Windows supports Windows Containers and Visual Studio 2017 has Docker support built in, we can develop server applications on Windows 10 and run and debug them on the exact same operating systems they will run on in production.

In this post I take my version of Contoso University that I've been using for several years now and amend it so that, in the developer inner loop phase (ie everything that happens before code is checked in to the build server), the website runs in a Windows Server 2016 container running IIS (rather than IIS Express) and the database defined by the SQL Server Database Project is published to SQL Server 2016 running in a container (rather than LocalDB).

Development Environment

The world of containers is evolving rapidly and the tooling might have changed by the time you read this. At the time of writing my environment is as follows:

  • Windows 10 Professional version 1703 (OS Build 15063.250)
  • Visual Studio Enterprise 2017 version 15.1 (26403.7) with the ASP.NET and web development workload
  • Docker for Windows 17.03.1-ce running Windows containers (I recommend the stable channel as at the time of writing the edge version had a bug that caused a problem for Docker support in Visual Studio)

Depending on the speed of your internet connection you might want to docker pull the following images if you are planning on following along:
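Going by the images used later in this post, the two to pull are most likely the following (no tags are specified here, so treat this as a sketch rather than a definitive list):

```powershell
# Pull the base images ahead of time so the first docker-compose build isn't held up
# by multi-gigabyte downloads (image names inferred from the rest of this post).
docker pull microsoft/aspnet
docker pull microsoft/mssql-server-windows-developer
```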

It's perhaps worth saying here that I'm using these images for convenience because they are available on Docker Hub. In a production scenario you probably wouldn't want to rely on an image as fully formed as microsoft/aspnet and you would probably start with microsoft/windowsservercore or microsoft/nanoserver and have full control of what is installed. You definitely wouldn't start with microsoft/mssql-server-windows-developer of course.

The Contoso University sample application is essentially the same as Microsoft's version except I've changed the database from Entity Framework Code First to a SQL Server Database Project. I've also changed the application to work with SQL Server authentication (rather than Windows authentication) thus removing the need for a domain controller to supply a domain account. You can get the starting point code from here and the final code here.

Adding Initial Docker Support

The first step towards Dockerizing Contoso University is to add initial Docker support for the ASP.NET web application (out-of-the-box support for SQL Server Database Projects isn't available). This is as simple as right-clicking the ContosoUniversity.Web project and choosing Add > Docker Support. This has three main visible effects:

  • A new docker-compose ‘project' is added at Solution level and is made the Startup Project. This project contains several .yml files.
  • A Dockerfile file and a (nested) .dockerignore file are added to ContosoUniversity.Web.
  • The toolbar button that normally launches a browser has now switched to launching Docker.

The Dockerfile added to ContosoUniversity.Web is based on the microsoft/aspnet image so at this point you should now be able to run the application using the Docker toolbar button and have the website run in a Windows Container based on that image. The database side of things isn't working at this stage of course—Web.config is pointing to LocalDB and the container running the website can't see LocalDB.

To understand what has been created, open a PowerShell session and run docker images followed by docker ps. You should see that an image called contosouniversity.web has been created with a dev tag, and that this image has been used to create a container called something like dockercompose362878786_contosouniversity.web_1.
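As a quick sanity check, the output can be narrowed to just the objects of interest (the filter values below assume the image and tag names described above):

```powershell
# Show only the image Visual Studio built for the web project (expect the dev tag)
docker images contosouniversity.web

# Show only the running container created from that image
docker ps --filter "ancestor=contosouniversity.web:dev"
```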

Adding Docker Support for the SQL Server Database Project

Adding Docker support for the SQL Server Database Project requires the following steps:

  1. Manually add a Dockerfile file and .dockerignore file to the root of ContosoUniversity.Database. Given that these files don't have file extensions and that database projects are quite prescriptive about what they think you should be adding it's easier to add them outside of Visual Studio and then add them in as existing items. (Note that if you are using Windows Explorer you'll need to create .dockerignore as .dockerignore.—Windows will drop the trailing period).
  2. Optionally, close Visual Studio and reopen the solution folder in a text editor such as Visual Studio Code. Open ContosoUniversity.Database.sqlproj and search for the Dockerfile and .dockerignore entries. Change them to achieve the nested file effect in Visual Studio (see the first snippet in the sketch after this list).
  3. .dockerignore just needs to contain an asterisk, meaning everything should be ignored.
  4. Dockerfile needs a FROM instruction for a SQL Server image plus the environment variables SQL Server expects at start-up (see the second snippet in the sketch after this list).
  5. Switching to the docker-compose ‘project', docker-compose.yml should be amended so that it declares a service for the database image alongside the web one (see the third snippet in the sketch after this list).
  6. A change is also needed to docker-compose.vs.debug.yml so that the database service is included when Visual Studio starts a debugging session.
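As a rough sketch of what those files might end up containing (the sa password, compose file version and build context paths below are my assumptions rather than anything prescriptive), the nesting entries in the .sqlproj, the database Dockerfile and docker-compose.yml could look something like this:

```xml
<!-- ContosoUniversity.Database.sqlproj: nest .dockerignore under Dockerfile in Solution Explorer -->
<None Include="Dockerfile" />
<None Include=".dockerignore">
  <DependentUpon>Dockerfile</DependentUpon>
</None>
```

```dockerfile
# ContosoUniversity.Database\Dockerfile
# The base image already has SQL Server Developer Edition installed; we only need to
# supply the settings its start-up script expects. The password is a placeholder.
FROM microsoft/mssql-server-windows-developer
ENV ACCEPT_EULA=Y
ENV sa_password=P@ssw0rd123
```

```yaml
# docker-compose.yml: one service per image; the dev tag seen in docker images is
# applied by the Visual Studio tooling rather than here.
version: '2.1'

services:
  contosouniversity.web:
    image: contosouniversity.web
    build:
      context: .\ContosoUniversity.Web
      dockerfile: Dockerfile

  contosouniversity.database:
    image: contosouniversity.database
    build:
      context: .\ContosoUniversity.Database
      dockerfile: Dockerfile
```

The important point is that docker-compose now knows how to build and run a second container from the database project's Dockerfile, and the service name (contosouniversity.database) becomes the host name the website uses to reach SQL Server later on.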

At this point you should be able to run the application using the Docker toolbar button and again see the website running—in a Windows container. However this time a second image (contosouniversity.database, tagged with dev) and corresponding container (named something like dockercompose362878786_contosouniversity.database_1) will have been created, with the container now running SQL Server. This is a newly minted instance of SQL Server and doesn't have a database for our website to connect to, which is the next issue to address.

Connecting the Contoso University Website to its Database

These next steps assume you are following on from the previous section, ie that the website is open in a browser and that Visual Studio is still debugging.

  1. Leave the browser open but stop debugging in Visual Studio.
  2. In ContosoUniversity.Web edit Web.config so that the connection string Data Source points to contosouniversity.database (see the sketch after this list).
  3. In a PowerShell session, find the IP address of the container running SQL Server using docker inspect, passing in enough of the container's ID to make it unique (see the sketch after this list).
  4. In ContosoUniversity.Database edit ContosoUniversity.publish.xml so that the Target database connection points to the IP address of the SQL Server container and change the authentication to SQL Server Authentication. The User Name should be sa (yes, I know) and the password should be the same as the one specified in the Dockerfile used to build the database image. Save the profile and then click Publish.
  5. Back in the web browser running the Contoso University website, click on one of the menu bar links (eg Departments) that causes a database query. If everything has worked you should now have a fully functioning application.
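For reference, a sketch of the two pieces mentioned in steps 2 and 3 (the catalog name and sa password are placeholders; use whatever your database project and Dockerfile actually define):

```xml
<!-- Web.config: point the SchoolContext connection string at the database container by service name -->
<connectionStrings>
  <add name="SchoolContext"
       connectionString="Data Source=contosouniversity.database;Initial Catalog=ContosoUniversity;User ID=sa;Password=P@ssw0rd123;MultipleActiveResultSets=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

```powershell
# docker ps lists the running containers; copy enough of the SQL Server container's ID to be unique
docker ps

# Pull just the container's IP address out of the inspect output (replace <id-prefix> accordingly)
docker inspect --format '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' <id-prefix>
```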

Understanding the Developer Inner Loop Workflow

At this point we have achieved our aim of running and debugging both the website and database components of Contoso University in containers running operating systems that are the same as would be used in production. Once the images and containers have been created they will—as far as my testing is concerned—continue to be used as long as nothing changes. This is the case even if Visual Studio, Docker or even the workstation are restarted. The great thing is that any changes made to the containers—for example updating the database schema—will be preserved. Of course, if something changes in one of the Dockerfile files the images and containers will be rebuilt and in the case of the database the publish file will need to be updated with a new IP address and the database will need to be published again from scratch. Also, if the solution is cleaned (ie Build > Clean Solution) the containers are removed and rebuilt, again necessitating publishing the database from scratch. Overall though, the developer inner loop workflow feels quite slick.

Next Steps

As things stand the compose and Dockerfile files are not ready to be used in a continuous delivery pipeline. The website Dockerfile for example has Contoso University being deployed as the Default Web Site rather than a ContosoUniversity website and the database Dockerfile doesn't cater for any persistent storage. There is also the problem of checking in the database project's publish profile with an IP address specific to one developer's workstation—a real pain for other developers. I'll address these issues as part of getting Contoso University working in a Docker-based continuous delivery pipeline in the next post in this series.

Cheers -- Graham

Continuous Delivery with TFS / VSTS – A Lap Around the Contoso University Sample Application

Posted by Graham Smith on January 27, 2016

In the previous post in this series on Continuous Delivery with TFS / VSTS we configured a sample application for Git in Visual Studio 2015. This post continues with that configuration and also examines some of the more interesting parts of Contoso University before firing it up and taking it for a spin.

Build and Update NuGets

Picking up from where we left off last time in Visual Studio, Team Explorer -- Home should be displaying ContosoUniversity.sln in the Solutions section. Double-click this to open Contoso University and navigate to Solution Explorer. The first thing to do is to hit F6 to build the solution. NuGet packages should be restored and the five projects should build. It's worth checking Team Explorer -- Changes at this point to make sure .gitignore is doing its job properly -- if it is there should be no changes as far as version control is concerned.

For non-production applications I usually like to keep bang up-to-date with the latest NuGets so in Solution Explorer right click the solution and choose Manage NuGet Packages for Solution. Click on the Updates link and allow the manager to do its thing. If any updates are found click to Select all packages and then Update:

[Screenshot: Visual Studio Manage NuGet Packages for Solution]

Commit (and sync if you like) these changes (see the previous post for a reminder of the workflow for this) and check the solution still builds.

Test the Unit Tests

The solution contains a couple of unit tests as well as some UI tests. In order to just establish that the unit tests work at this stage open Test Explorer (Test > Windows > Test Explorer) and then click on the down arrow next to Playlist : All Tests. Choose Open Playlist File, open ...\ContosoUniversity.Web.UnitTests\UnitTests.playlist and run the tests which should pass, albeit with very poor coverage (Tests > Analyze Code Coverage > All Tests).

Code Analysis

Code analysis (a feature formerly known as FxCop) is enabled for every project and reports on any rule violations. I've set projects to use the Microsoft Managed Recommended Rules (note though that rule sets don't exist for SQL Server Database Projects and in the case of ContosoUniversity.Database everything is selected), however the code as it exists in GitHub doesn't violate any of those rules. To see this feature in action right-click the ContosoUniversity.Web project and choose Properties. On the Code Analysis tab change the rule set to Microsoft All Rules and build the solution. The Error List window should appear and there should be quite a few warnings -- 85 at the time of writing this post. Reverse the change and the solution should once again build without warnings.

SQL Server Data Tools -- A Better Way to Manage Database Changes

There are several ways to manage the schema changes for an application's database and whilst every technique undoubtedly has its place the one I recommend for most scenarios is the declarative approach. The simple explanation is that code files declare how you want the database to look and then a separate ‘engine' has the task of figuring out a script which will make a blank or existing database match the declaration. It's not a magic bullet since it doesn't cope with all scenarios out-of-the-box, however there is usually a way to code around the trickier situations. I explained how I converted Contoso University from an Entity Framework Code First Migrations way of managing database changes to a declarative one using SQL Server Data Tools here. If you are new to SSDT and want to learn about it I have a Getting Started post here.

As far as Visual Studio solutions are concerned, the declarative approach using SSDT is delivered by way of SQL Server Database Projects. In Contoso University this is the ContosoUniversity.Database project. This contains CREATE scripts for tables and stored procedures as well as for permissions and security. Not only does it contain scripts that represent the database schema, it also contains scripts that can insert reference data into a newly created database. The significance of this becomes apparent when you realise that a database project can be used to create a database in LocalDB, which can be made fully functioning by inserting reference data. In many cases this could remove the need for a ‘dev' environment, as the local workstation is the dev environment, database and all. This eliminates the problem of config files containing connection strings specific to developers (and the check-in problems that this can cause) because the connection string for LocalDB is generic. This way of working has the potential to wave goodbye to situations where getting an application working on a development machine requires a lengthy setup script and usually some magic.
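To make the declarative idea concrete, each table in the database project is just a file containing the CREATE statement describing how that table should look; the publish engine works out whatever script is needed to get a target database there. A simplified, illustrative example (the exact columns in the real project may differ) looks like this:

```sql
-- ContosoUniversity.Database\dbo\Tables\Department.sql
-- Declares the desired shape of the table; publishing generates whatever script is
-- needed to make a blank or existing database match this definition.
CREATE TABLE [dbo].[Department]
(
    [DepartmentID] INT           IDENTITY (1, 1) NOT NULL,
    [Name]         NVARCHAR (50) NOT NULL,
    [Budget]       MONEY         NOT NULL,
    [StartDate]    DATETIME      NOT NULL,
    [InstructorID] INT           NULL,
    CONSTRAINT [PK_Department] PRIMARY KEY CLUSTERED ([DepartmentID] ASC)
);
```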

To see this in action, within the ContosoUniversity.Database project right-click CU-DEV.publish.xml and choose Publish. In the Publish Database CU-DEV.publish.xml dialog click on Publish:

[Screenshot: Publish Database dialog publishing CU-DEV to LocalDB]

Now make sure that the ContosoUniversity.Web project is set as the startup project (if it isn't shown in bold, right-click it and choose Set as StartUp Project) and hit F5 to run the application. A fully functioning Contoso University should spring to life. Navigating to (for example) Departments should allow you to perform all CRUD activities. How's that for simplicity?

Tour Stops Here

That's it for our tour of Contoso University. Although there are automated UI tests in the solution I'm purposefully not covering them here as they are very much an advanced topic and I'm also planning some changes. That said, if you have Firefox installed on your development machine you can probably make them run. See my post in a previous blog series here.

In the next post we look at getting a basic CI build up and running. Happy coding until then!

Cheers -- Graham

Continuous Delivery with TFS: Our Sample Application

Posted by Graham Smith on December 27, 2014

In this post that is part of my series on implementing continuous delivery with TFS we look at the sample application that will be used to illustrate various aspects of the deployment pipeline. I've chosen Microsoft's fictional Contoso University ASP.NET MVC application as it comprises components that need to be deployed to a web server and a database server and it lends itself to (reasonably) easily demonstrating automated acceptance testing. You can download the completed application here and read more about its construction here.

Out of the box Contoso University uses Entity Framework Code First Migrations for database development, however this isn't what I would use for enterprise-level software development. Instead I recommend using a tool such as Microsoft's SQL Server Data Tools (SSDT) and more specifically the SQL Server Database Project component of SSDT that can be added to a Visual Studio solution. The main focus of this post is on converting Contoso University to use a SQL Server Database Project and if you are not already up to speed with this technology I have a Getting Started post here. Please note that I don't describe every mouse-click below so some familiarity will be essential. I'm using the version of LocalDb that corresponds to SQL Server 2012 below as this is what Contoso University has been developed against. If you want to use the LocalDb that corresponds to SQL Server 2014 ((localdb)\ProjectsV12) then it will probably work but watch out for glitches. So, there is a little bit of work to do to get Contoso University converted, and this post will take us to the point of readying it for configuration with TFS.

Getting Started

  1. Download the Contoso University application using the link above and unblock and then extract the zip to a convenient temporary location.
  2. Navigate to ContosoUniversity.sln and open the solution. Build the solution which should cause NuGet packages to be restored using the Automatic Package Restore method.
  3. From Package Manager Console issue an Update-Database command (you may have to close down and restart Visual Studio for the command to become available). This should cause a ContosoUniversity2 database (including data) to be created in LocalDb. (You can verify this by opening the SQL Server Object Explorer window and expanding the (LocalDb)\v11.0 node. ContosoUniversity2 should be visible in the Databases folder. Check that data has been added to the tables as we're going to need it.)

Remove EF Code First Migrations

  1. Delete SchoolInitializer.cs from the DAL folder.
  2. Delete the DatabaseInitializer configuration from Web.config (this will probably be commented out but I'm removing it for completeness' sake); the sketch after this list shows the kind of element to look for.
  3. Remove the Migrations folder and all its contents.
  4. Expand the ContosoUniversity2 database from the SQL Server Object Explorer window and delete dbo.__MigrationHistory from the Tables folder.
  5. Run the solution to check that it still builds and data can be edited.
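The element in question is the EF initializer registration inside the entityFramework section; the type names below are a guess at how Contoso University is laid out rather than a copy of the real file:

```xml
<!-- Web.config: the initializer registration to delete (or that is already commented out) -->
<entityFramework>
  <contexts>
    <context type="ContosoUniversity.DAL.SchoolContext, ContosoUniversity">
      <databaseInitializer type="ContosoUniversity.DAL.SchoolInitializer, ContosoUniversity" />
    </context>
  </contexts>
  <!-- other children of entityFramework (providers and so on) stay as they are -->
</entityFramework>
```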

Configure the solution to work with a SQL Server Database Project (SSDP)

  1. Add an SSDP called ContosoUniversity.Database to the solution.
  2. Import the ContosoUniversity2 database to the new project using default values.
  3. In the ContosoUniversity.Database properties enable Code Analysis in the Code Analysis tab.
  4. Create and save a publish profile called CU-DEV.publish.xml to publish to a database called CU-DEV on (LocalDb)\v11.0.
  5. In Web.config change the SchoolContext connection string to point to CU-DEV (see the sketch after this list).
  6. Build the solution to check there are no errors.
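Something along these lines (a typical LocalDB connection string rather than the exact one from the project):

```xml
<!-- Web.config: SchoolContext now targets the CU-DEV database published to LocalDB -->
<connectionStrings>
  <add name="SchoolContext"
       connectionString="Data Source=(LocalDb)\v11.0;Initial Catalog=CU-DEV;Integrated Security=True;MultipleActiveResultSets=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```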

Add Dummy Data

The next step is to provide the facility to add dummy data to a newly published version of the database. There are a couple of techniques for doing this depending on requirements -- the one I'm demonstrating only adds the dummy data if a table contains no rows, thus ensuring that a live database can't get polluted. I'll be extracting the data from ContosoUniversity2 and I'll want to maintain existing referential integrity, so I'll be using SET IDENTITY_INSERT ON | OFF on some tables to insert values to primary key columns that have the identity property set. Firstly create a new folder in the SSDP called ReferenceData (or whatever pleases you) and then add a post deployment script file (Script.PostDeployment.sql) to the root of the ContosoUniversity.Database project (note there can only be one of these). Then follow this general procedure for each table:

  1. In the SQL Server Object Explorer window expand the tree to display the ContosoUniversity2 database tables.
  2. Right click a table and choose View Data. From the table's toolbar click the Script icon to create the T-SQL to insert the data (SET IDENTITY_INSERT ON | OFF should be added by the scripting engine where required).
  3. Amend the script with an IF statement so that the insert will only take place if the table is empty. The resulting script should look similar to the first sketch after this list.
  4. Save the file in the ReferenceData folder in the format TableName.data.sql and add it to the solution as an existing item.
  5. Use the SQLCMD syntax to call the file from the post deployment script file. (The order in which the table inserts are executed will need to cater for referential integrity. Person, Department, Course, CourseInstructor, Enrollment and OfficeAssignment should work.) When editing Script.PostDeployment.sql, turning on the SQLCMD Mode toolbar button will switch off Transact-SQL IntelliSense and stop the SQLCMD syntax being flagged as errors.
  6. When all the ReferenceData files have been processed, Script.PostDeployment.sql should look something like the second sketch after this list.

You should now be able to use CU-DEV.publish.xml to actually publish a database called CU-DEV to LocalDB that contains both schema and data and which works in the same way as the database created by EF Code First Migrations.
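As illustrative sketches of the two files referred to in steps 3 and 6 above (table names and data values are examples only; the real content comes from the scripting step described earlier):

```sql
-- ReferenceData\Department.data.sql
-- Only insert if the table is empty, so a live database can never be polluted.
IF NOT EXISTS (SELECT 1 FROM [dbo].[Department])
BEGIN
    SET IDENTITY_INSERT [dbo].[Department] ON;

    INSERT INTO [dbo].[Department] ([DepartmentID], [Name], [Budget], [StartDate], [InstructorID])
    VALUES (1, N'Engineering', 350000, '2007-09-01', 1),
           (2, N'English',     350000, '2007-09-01', 2);

    SET IDENTITY_INSERT [dbo].[Department] OFF;
END
```

```sql
-- Script.PostDeployment.sql
-- SQLCMD syntax; the order respects referential integrity between the tables.
:r .\ReferenceData\Person.data.sql
:r .\ReferenceData\Department.data.sql
:r .\ReferenceData\Course.data.sql
:r .\ReferenceData\CourseInstructor.data.sql
:r .\ReferenceData\Enrollment.data.sql
:r .\ReferenceData\OfficeAssignment.data.sql
```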

Finishing Touches

For the truly fussy among us (that's me) who like neat and tidy project names in VS solutions there is an optional set of configuration steps that can be performed:

  1. Remove the ContosoUniversity ASP.NET MVC project from the solution and rename it to ContosoUniversity.Web. In the file system rename the containing folder to ContosoUniversity.Web.
  2. Add the renamed project back in to the solution and from the Application tab of the project's Properties change the Assembly name and Default namespace to ContosoUniversity.Web.
  3. Perform the following search and replace actions:
    namespace ContosoUniversity > namespace ContosoUniversity.Web
    using ContosoUniversity > using ContosoUniversity.Web
    ContosoUniversity.ViewModels > ContosoUniversity.Web.ViewModels
    ContosoUniversity.Models > ContosoUniversity.Web.Models
  4. You may need to close the solution and reopen it before checking that nothing is broken and the application runs without errors.

That's it for the moment. In the next post in this series I'll explain how to get the solution under version control in TFS and how to implement continuous integration.

Cheers -- Graham

Getting Started with SQL Server Database Projects

Posted by Graham Smith on December 13, 2014

By now hopefully all developers understand the importance of keeping their source code under version control and are actually practising this for any non-throwaway code. That's all fine and dandy for your application, but what about your database? In my experience it's pretty rare for databases to be under version control, probably because in the past the tooling has been inadequate or simply off developer radars. There are a number of tools that can help with database version control but one of the most readily accessible for Visual Studio developers is the SQL Server Database Project that can be added to a Visual Studio solution. SQL Server Database Projects are part of the Microsoft SQL Server Data Tools (SSDT) package, which is obviously aimed at developing against SQL Server. You can start with a blank database but most likely you will already have an existing database in which case the database project has the ability to reverse engineer the schema. The result of this process is a series of files containing CREATE statements for the objects that comprise your database (tables, stored procedures and so on), with the files themselves (usually one per object) organised in a folder structure. Since these are essentially text files just like any other code file you can check them in to version control and have any changes recorded just like you would with, for example, a C# file.

In addition to facilitating version control database projects offer a wealth of extra functionality. A declarative approach is used with database projects, ie you state how you want your database to be via CREATE statements and then another process is responsible for making the schema of one or more target databases the same as the schema of your database project. You can also publish your schema to a new database -- ideal if you need to create a LocalDB version on a new development workstation for example. This is really only the tip of the iceberg and I encourage you to use the resources below as a starting point for learning about database projects and SSDT:

Since SSDT is built in to Visual Studio 2013 the barrier to getting started is very low. Be sure to check for any updates from within Visual Studio (Tools > Extensions and Updates) before you begin. Finally, anyone who has spotted that the SQL Server installation wizard has an option to install SQL Server Data Tools has every right to be confused, since at one point in time this was also the new name for what was once BIDS (Business Intelligence Development Studio). If you want to know more then this post and also this one will help clarify. Maybe.

Cheers -- Graham