
As part of my continued effort to improve the page load speed of this website, and to try out new things, I recently decided to set up a CDN (Content Delivery Network) to host static files like CSS, JS, fonts and SVG files.
And since this is an Umbraco site, it was obvious to use the excellent Azure CDN Toolkit. This package not only allows static files to be hosted from the Azure CDN, but also works together with ImageProcessor to host media items on Azure Blob Storage and cache them through the CDN.

Once I had followed the instructions to set up the CDN Toolkit and configured the right Blob storage and CDN account on Azure, it was all good to go. But I still needed a way to deploy my static files into the Blob storage container.
Since I'm using Visual Studio Team Services (VSTS) for the continuous integration and deployment workflow, I was after something to automate this as well.
Note that this article applies to any website hosted on Azure that uses Azure CDN with Blob storage as its source and VSTS for its deployment, not just Umbraco websites.

When using VSTS, the workflow is divided into two parts:

  • Build: where the website project is built using the code from the associated Git repository.
  • Release: where the output files of the previous build process are deployed to a server (or in this case an Azure App Service).

My initial build process was generating a WebDeploy ZIP file, which then got deployed during the release.
I knew I had to expand the build process with an extra task that would take all my static files (CSS, JS, images, etc.) and put them in a separate folder of the build output. This would then allow me to use them during the release.
So I included a Copy Files task that runs after the build task completes and copies the various static files into an assets folder in the build output staging directory.
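For reference, the Copy Files task configuration would look roughly like this; the project folder name and the exact file patterns here are illustrative, not my precise setup:

Source Folder:  $(Build.SourcesDirectory)/MyWebsite
Contents:       **/*.css
                **/*.js
                **/*.svg
                **/*.woff
                **/*.woff2
Target Folder:  $(Build.ArtifactStagingDirectory)/assets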

Next up, the release task. Before I started adding the CDN, the release task was quite simple, with only an Azure App Service Deploy task that takes the WebDeploy ZIP file and deploys it to Azure.
While looking through the available options, I spotted the Azure File Copy task. This uses a command-line utility to copy files into a specific Blob storage container. It was easy enough to set up, and after a test run I could confirm it uploaded my files as intended. Great!
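For those setting this up themselves, the relevant settings were roughly as follows (field names are approximate and the storage account name and source path are illustrative):

Source:            $(System.DefaultWorkingDirectory)/**/assets
Destination Type:  Azure Blob
Storage Account:   mystorageaccount
Container Name:    assets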

However, when I then went to check the website, I noticed that none of the CSS, fonts and SVG files seemed to load. After some digging around in the browser developer tools, I saw that the files did download, but they were all served with the same MIME type: application/octet-stream. Because of that, the browser can't identify the type of file and refuses to apply the CSS.

After reading a little more about Azure Blob storage, I learned that the default MIME type for newly uploaded blobs is application/octet-stream.
It turns out, though, that the Azure File Copy utility has a command argument you can pass in that selects the correct MIME type based on the extension of the file you're uploading: /SetContentType.
So I updated my task with this parameter and ran it again. I checked a few of the uploaded files using Visual Studio's Cloud Explorer to confirm that the MIME type was now correct. Then I refreshed the website, and this looked more promising: the CSS and JS files were loading fine. But the web font files (.woff2, .woff) and SVG files were still being served with the wrong MIME type.

So it seems that the SetContentType argument doesn't work well for all file types. But you can tweak the argument to set a specific MIME type explicitly. For example, /SetContentType:image/svg+xml sets the correct MIME type for SVG images.
Since you can only specify a single content type per task, I had to change from one Azure File Copy task that uploaded all asset files to a number of separate ones, each uploading only a single type of file by including the Pattern argument as well. So for the SVG example above, the argument would be /Pattern:*.svg.
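Each task's Additional Arguments field then gets a content type and pattern pairing; for example (the second line is just another illustration, using the WOFF2 type from the script further below):

/SetContentType:image/svg+xml /Pattern:*.svg
/SetContentType:application/font-woff2 /Pattern:*.woff2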

Release setup with numerous Azure File Copy tasks

Once that was set up, I ran everything again, and this time all the files were loaded through the CDN with the right MIME types. And then I thought I was done.

But no: a Google PageSpeed test revealed that none of the static files served through the CDN had a Cache-Control header set. This means the browser will download the files again every time you reload the page. Since static files shouldn't change very often, it's recommended to set an expiry of at least 7 days in the Cache-Control header that gets sent back with the file.
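A seven-day lifetime works out as 7 × 24 × 60 × 60 = 604800 seconds, so the header to aim for would look like:

Cache-Control: public, max-age=604800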
After some investigation I found that you can specify this value per blob, but that it's not possible to set this header through the Azure File Copy task.
The documentation I found suggested either using PowerShell or writing some code to do it. And since VSTS also has an Azure PowerShell task available, this seemed the way to go.

When creating the PowerShell script, I soon realised that I could probably do everything in this one script: get all the asset files, then upload them to Azure Blob storage with the correct MIME type and cache header. That would mean I could get rid of the long list of Azure File Copy tasks completely.
I ended up creating and using the following script:

$StorageAccountName = "my-account"
$StorageAccountKey = "my-account-key"
$StorageContainerName = "assets"

# get Azure blob storage context
$context = New-AzureStorageContext `
    -StorageAccountName $StorageAccountName `
    -StorageAccountKey $StorageAccountKey

# get folder paths
$rootFolder = Split-Path -Path $PSScriptRoot -Parent
$assetsFolder = "$($rootFolder)\assets"

# get assets files
$files = Get-ChildItem $assetsFolder -Recurse -File

Write-Host "Found $($files.Count) files."

foreach ($file in $files)
{
    Write-Host "Processing $($file.FullName)"

    # get MIME type for current file (fallback if no extension matches below)
    $ContentType = "application/octet-stream"

    switch ($file.Extension)
    {
        ".js" { 
            $ContentType = "application/javascript" 
        }
        ".css" { 
            $ContentType = "text/css" 
        }
        ".map" { 
            $ContentType = "application/json" 
        }
        ".svg" { 
            $ContentType = "image/svg+xml" 
        }
        ".png" { 
            $ContentType = "image/png" 
        }
        ".jpg" { 
            $ContentType = "image/jpeg" 
        }
        ".woff2" { 
            $ContentType = "application/font-woff2" 
        }
        ".woff" { 
            $ContentType = "application/x-font-woff" 
        }
        ".ttf" { 
            $ContentType = "application/font-sfnt" 
        }
        ".eot" { 
            $ContentType = "application/vnd.ms-fontobject" 
        }
        ".json" { 
            $ContentType = "application/json" 
        }
        ".xml" { 
            $ContentType = "text/xml" 
        }
        ".ico" { 
            $ContentType = "image/x-icon" 
        }
    }

    # set Properties
    $Properties = @{"ContentType" = $ContentType; "CacheControl" = "public, max-age=31536000"}

    # upload blob
    Set-AzureStorageBlobContent `
        -File $file.FullName `
        -Blob $file.FullName.Replace("$($assetsFolder)\", "") `
        -Context $context `
        -Container $StorageContainerName `
        -Properties $Properties `
        -Force
}
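One possible refinement, sketched below: rather than hardcoding the account name and key at the top of the script as I did above, you could take them as script parameters, which lets the key live in a VSTS secret variable and be passed in through the Azure PowerShell task's Script Arguments field. This is just a sketch; the variable name StorageKey is an example, not something from my actual setup:

param(
    [Parameter(Mandatory = $true)][string]$StorageAccountName,
    [Parameter(Mandatory = $true)][string]$StorageAccountKey,
    [string]$StorageContainerName = "assets"
)

The Script Arguments value would then be something like: -StorageAccountName my-account -StorageAccountKey $(StorageKey)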

And with that I finally got the setup I wanted: a single PowerShell script task that will deploy my static assets, followed by the Azure deploy task that will take care of deploying the rest of the website.
While it was a bit of work to get to this point, it was also worth it: the static resources load quicker and overall page load time has gone down.
