I am working on a project that features a large website with many people working on it: developers, quality engineers, deployment engineers and content editors. The site has multiple environments, and each environment is distributed across several servers.

The main advantage of continuous integration for such a project is that with each check-in, the code gets pushed to all development servers. From there, with a single click, the code goes to staging and afterwards to production. If something is wrong with the code, you can roll back to a previous build, again with a single click. The only drawback I can think of is that you will regularly have to clean up old deployments, which start to take up a lot of disk space after some time.

This post is intended to guide you through setting up Umbraco for continuous integration and deployment.

Project Prerequisites

Your website is suitable for continuous integration if it has the following features:

  • A single large website project with lots of custom code, source control, nightly builds and tests.
  • Multiple environments – Development, Staging, Production, etc.
  • Relatively large team of developers, quality engineers, deployment engineers and content editors.

The Setup

The following setup is a bit expensive to achieve and requires a team working on it, but once done, it is extremely easy to push code around and do “magic” with one click. If you are going to work on the project for more than a couple of months or a year, it is worth the effort to set up proper deployment.

Source Setup

In your Visual Studio project, add only the Umbraco Core package and not the whole CMS. The project will also contain your custom code, views, scripts and styles. This way your functionality gets published on top of a fresh Umbraco install with each deploy.
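
At the time of writing (the Umbraco 7 era), the core-only package was available on NuGet as UmbracoCms.Core; assuming that is the package you target, adding it from the Package Manager Console would look something like this (check the exact package name and version for your Umbraco release):

    Install-Package UmbracoCms.Core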

Deployment Tool

We are using Octopus Deploy and it does the job fairly well. Setting up the tool and the deployment scripts is probably the hardest task. The tool deploys the website to a new folder named after the build number and sets that folder as the site root in IIS. For example:

- mysite
  - wwwroot
    - build1
    - build2

With this site structure, you are able to switch build versions very quickly without any downtime. On every deploy, the tool performs the following steps on the new folder:

  1. Extract a fresh copy of Umbraco (a zip downloaded from Umbraco).
  2. Extract predefined Umbraco packages (a regular Umbraco plugin package, which is extracted by a custom tool we developed).
  3. Extract the custom code, views and assets, which are dropped by the build.
  4. Run the Umbraco permissions script. For security reasons it is best to run the site under the default app pool identity rather than NETWORK SERVICE, which is why the script has to be run on each deploy.
  5. Set the new folder as site root in IIS (see the sketch after this list).
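
We are not publishing the actual deployment scripts here, but a minimal C# sketch of what the final step could look like, using the Microsoft.Web.Administration API to repoint the IIS site at the new build folder (the site name, path and build number below are made-up examples; in the real setup they would come from the deployment tool's variables):

    using System;
    using Microsoft.Web.Administration;

    class SwitchSiteRoot
    {
        static void Main()
        {
            // Hypothetical values - supplied by the deployment tool in practice.
            var siteName = "mysite";
            var newBuildPath = @"D:\mysite\wwwroot\build2";

            // Requires administrative rights on the web server.
            using (var manager = new ServerManager())
            {
                var site = manager.Sites[siteName];

                // Repoint the root virtual directory of the root application
                // at the freshly prepared build folder.
                site.Applications["/"].VirtualDirectories["/"].PhysicalPath = newBuildPath;

                manager.CommitChanges();
            }

            Console.WriteLine("Site root switched to " + newBuildPath);
        }
    }

Rolling back is then simply a matter of pointing the physical path at one of the previous build folders.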

Distributed Servers

Umbraco has an article on distributed environments. For the continuous deployment scenario, the tool takes care of pushing the new site to each server so you don’t have to worry about setting up DFS on the whole site.

The following files need to be moved out of the site root folder and handled by DFS:

  • Examine Indexes – rebuilding the indexes after each deploy is a costly operation and takes a lot of time.
  • Media, CSS and Scripts.

You can move these folders out by creating virtual directories in IIS. Here is an example folder structure with a “resources” folder (a sketch of wiring up the virtual directories follows the tree):

- mysite
  - resources
    - Indexes
    - Media/CSS/Scripts
  - wwwroot
    - build1
    - build2
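
To give an idea of how such virtual directories could be wired up, here is a small C# sketch using the same Microsoft.Web.Administration API (the site name, virtual paths and physical paths are made-up examples; a real script would also check whether the entries already exist before adding them):

    using Microsoft.Web.Administration;

    class AddResourceVirtualDirectories
    {
        static void Main()
        {
            // Hypothetical paths - the shared "resources" folder lives outside the
            // build folders, so a new build does not wipe media or Examine indexes.
            var siteName = "mysite";
            var resourcesRoot = @"D:\mysite\resources";

            using (var manager = new ServerManager())
            {
                var app = manager.Sites[siteName].Applications["/"];

                // Map example URLs to folders under the shared resources directory.
                app.VirtualDirectories.Add("/media", resourcesRoot + @"\Media");
                app.VirtualDirectories.Add("/css", resourcesRoot + @"\CSS");

                manager.CommitChanges();
            }
        }
    }

Because the virtual directories point outside the build folders, the DFS-replicated content stays in place no matter which build is currently the site root.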

2 Comments

Gijs van Dam (@gijswijs) · December 5, 2014 at 12:29

Could you tell us a little bit more about the “custom tool” handling Umbraco packages? The problem we encounter in our CI set-up is the fact that you have to install Umbraco packages manually on the staging and production environments.
Because a package install can do many things (database schema changes, inserting data, etc.), it’s not enough to just deploy the files installed by the package.
We never found a good solution for that (Redgate schema compare, data compare, etc.).
A solution we are currently exploring is a way to check the packages into source control; the build process then checks for packages that aren’t installed yet but do appear in source control and triggers an automated package install.

Nikolay Arhangelov · December 5, 2014 at 19:45

Hello! We would first install the package manually so it can do its stuff to the database, and then add the package file to source control. Once the build triggers, the custom tool opens the package from source control and reads its manifest. The manifest contains the actual names of the files (the files in the package have GUIDs for names) and their locations, as well as the steps to perform during the installation – one of the steps I recall was a DB step. In other words, our tool only extracts the files from the package, with their proper names, to the deployment location.

If you are looking for something that also runs the installation steps (like the DB step) you’ll need to dig deep into Umbraco’s source – if I recall correctly, there was a package installer class which did all this. Also, I think the manifest contained the class from the package which installs it.
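
To illustrate the extraction idea (not our actual tool), here is a rough C# sketch that unzips a package, reads the manifest and copies each GUID-named file to its original path; the element names are assumptions based on the standard package.xml format:

    using System.IO;
    using System.IO.Compression;
    using System.Xml.Linq;

    class PackageExtractor
    {
        // Copies the files of an Umbraco package zip to the deployment folder,
        // using the package.xml manifest to restore original names and paths.
        static void ExtractPackageFiles(string packageZip, string targetRoot)
        {
            var workDir = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
            ZipFile.ExtractToDirectory(packageZip, workDir);

            var manifest = XDocument.Load(Path.Combine(workDir, "package.xml"));

            foreach (var file in manifest.Descendants("file"))
            {
                var guidName = (string)file.Element("guid");    // file name inside the package
                var orgPath  = (string)file.Element("orgPath");  // original folder, e.g. /bin
                var orgName  = (string)file.Element("orgName");  // original file name

                var destDir = Path.Combine(targetRoot, orgPath.TrimStart('/', '\\'));
                Directory.CreateDirectory(destDir);

                File.Copy(Path.Combine(workDir, guidName),
                          Path.Combine(destDir, orgName),
                          overwrite: true);
            }
        }
    }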
