What was I hoping to accomplish by doing this?
The goal was to create a build server that could be used to build and test things developed for Dynamics 365 CE. That means being able to build and test .NET-based plug-ins / workflows, build JavaScript / TypeScript, run EasyRepro / Selenium UI tests, and be able to deploy as needed. All that, plus be faster, because I’m impatient.
Containers at a high level
In my eyes a container falls nicely in between a Function and a VM. Using Azure to run the container you’ll end up paying for storage space for the images, which will certainly be more than a Function but probably not more than a VM. A Function and a VM both bill based on compute time. The big difference is that when a Function isn’t actually processing something it’s shut off and not adding to the bill. A VM on the other hand accrues compute time as long as it’s turned on, whether it’s doing work or not. The pricing model for a container is closer to that of a VM, but the rates appear to be cheaper and costs are calculated per second as opposed to per hour. Turning things on and off to reduce costs is also better suited to containers, as they can often be up and running in a few seconds while a VM could easily take a minute or more to fully start up and get itself into a state where applications can run.
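Since stopping and starting is the main cost lever here, it’s worth noting it scripts easily. Here’s a minimal sketch using the Azure CLI; the resource group and container group names are placeholders, not the actual ones from this setup:

```
# Stop the container instance when no builds are expected,
# so it stops accruing compute time (names below are placeholders)
az container stop --resource-group d365-build-rg --name d365-build-agent

# Start it back up when a build is coming; it's typically ready in seconds
az container start --resource-group d365-build-rg --name d365-build-agent
```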
To give an idea of the costs, here’s what this setup costs to run:
- Azure Container Registry (Standard) – ~3 GB of storage used = $0.67 / day
- Azure Container Instance (Windows, 2 vCPU & 2 GB RAM) – $3.11 / day when left running all day
Management is easier with a container than with a VM. On the VM there is the worry about patching and all the possible ways someone could hack in because of the various services running, open ports, etc. Windows-based containers don’t run a full-blown copy of the OS but rather a scaled-down version (Nano Server or Windows Server Core) based on a specific build of the full OS. Fewer features, less chance for someone to exploit something. The other point is that these operating systems aren’t meant to be patched in the traditional sense of running Windows Update. When it’s time to update you’re basically installing the components again from scratch on top of a new version of the OS image. Sounds painful, but it really isn’t once you’ve got the scripting in place (up until that point it is very painful).
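To make “installing again from scratch on a new OS image” concrete: once the Dockerfile and install scripts exist, the update boils down to a few Docker commands. A rough sketch, where the base image tag and registry name are illustrative rather than the actual ones used here:

```
# Pull the updated Windows Server Core base image (tag is just an example)
docker pull mcr.microsoft.com/windows/servercore:ltsc2019

# Rebuild the agent image so the install scripts run against the new base,
# then push the result to the Azure Container Registry (registry name is a placeholder)
docker build -t myregistry.azurecr.io/build-agent:latest .
docker push myregistry.azurecr.io/build-agent:latest
```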
For more on containers: https://www.docker.com/resources/what-container
Plug-in compile & unit test build time comparison
Hosted VS2017 Agent

| Task | Time |
| --- | --- |
| Queue time | 1s |
| Prepare job | <1s |
| Initialize agent | <1s |
| Initialize job | 7s |
| Checkout | 14s |
| NuGet Restore | 1m 7s |
| MSBuild - Build Solution | 54s |
| Visual Studio Test Platform Installer | 8s |
| VsTest – Execute Unit Tests | 35s |
| Publish – Test Results | 5s |
| Post-job: Checkout | <1s |
| Report build status | <1s |
| Total | 3m 14s |
Private Agent (Azure Container Instance)

| Task | Time |
| --- | --- |
| Queue time | 1s |
| Prepare job | <1s |
| Initialize agent | N/A |
| Initialize job | <1s |
| Checkout | 3s |
| Command Line Script - NuGet Restore | 4s |
| MSBuild - Build Solution | 8s |
| Visual Studio Test Platform Installer | 2s |
| VsTest – Execute Unit Tests | 14s |
| Publish – Test Results | 4s |
| Post-job: Checkout | <1s |
| Report build status | <1s |
| Total | 38s |
So what are the differences?
Queue Time

Both were 1 second when only running 1 build at a time. Each agent can only run 1 job at a time by default, without getting into parallel builds, multiple agents, etc. When you start lining up multiple builds back-to-back, the queue times on the hosted agent are going to be considerably longer.
Initialize Agent
Not applicable for privately hosted agents.
NuGet Restore
These packages needed to be restored for the test I ran:
- FakeItEasy
- FakeXrmEasy.9 (@jordimontana)
- Microsoft.CrmSdk.CoreAssemblies
- Microsoft.CrmSdk.Deployment
- Microsoft.CrmSdk.Workflow
- Microsoft.CrmSdk.XrmTooling.CoreAssembly
- Microsoft.IdentityModel.Clients.ActiveDirectory
- MSTest.TestAdapter
- MSTest.TestFramework
On the container I pre-installed NuGet.exe, so instead of using the NuGet build task I used a Command Line Script task and executed something like:

"C:\Program Files\NuGet\nuget.exe" restore $(Build.SourcesDirectory)\TestPlugins.sln -Verbosity Detailed -NonInteractive
After the first run of this build, all of those packages were cached locally, so the restore took only 4 seconds.
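If you want to verify what’s being cached between builds, the NuGet CLI can list its local cache locations. This is standard NuGet behavior, nothing specific to this image (PowerShell syntax shown):

```
# List NuGet's local cache folders (http-cache, global-packages, etc.)
& "C:\Program Files\NuGet\nuget.exe" locals all -list
```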
MSBuild - Build Solution
I couldn’t find anything documenting the hardware specifications of the hosted agents. The Azure Container Instance had 2 vCPUs and 2 GB of memory; I suspect that’s more than what gets assigned to the hosted agents, which would explain the considerably faster build time.
Visual Studio Test Platform Installer
This is an out-of-the-box build task which installs vstest.console.exe, the runner needed to execute .NET unit tests. In hindsight this step probably wasn’t needed on the hosted agent since it’s already installed there by default.
I spent a fair amount of time trying to get this installed on the container image without success. Again in hindsight, it would have been easier to install a full copy of Visual Studio 2017 (which would have included this) instead of trying to install the bare minimum number of components I thought I’d need for a capable D365 build & test server. The flip side is that the container image becomes larger, more costly, and more cumbersome to deal with. The bright side is that once it’s installed it’s available for future use without re-downloading and re-installing; the build task is smart like that and first checks if it’s there before blindly installing. That 2 seconds on the container was just the check. The bigger reason I wanted it pre-installed was to simplify and reduce the number of steps a person would need to go through to create a build. It’s just one more thing for someone new coming in to forget and then waste time on because the tests won’t run.
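If I were to take another run at it, the angle I’d try (untested here, so treat it as a sketch): the installer task acquires the test platform as the Microsoft.TestPlatform package from NuGet, so pre-seeding it on the image should amount to something like the following, where C:\Tools is an arbitrary placeholder location:

```
# Sketch: pre-seed the test platform by pulling the same NuGet package
# the installer task uses; C:\Tools is a placeholder output location
& "C:\Program Files\NuGet\nuget.exe" install Microsoft.TestPlatform -OutputDirectory C:\Tools -NonInteractive

# vstest.console.exe ends up inside the versioned package folder under C:\Tools
```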
VsTest – Execute Unit Tests
I again attribute the difference to the virtual hardware specs likely being better.
Part 2 will cover what went into creating the container.