Easy Sitecore 9 Azure PAAS no downtime deployments

Disclaimer: It’s not really easy, just easier than alternatives.

This is also known as Blue/Green deployments

Having your site stay up during deployments is a very common requirement for any website.  Sitecore comes with its own set of challenges, but with a few simple tips and tricks in Azure you can get a very robust solution for minimal effort.

Attributes of this approach

  • There is downtime for authors in the CM environment.
  • There is no content authoring freeze. (however while a deployment is going on there is a publishing freeze, mitigated by an optional search index swap covered later)
  • Azure assets are created on demand so there is no offline environment hanging out doing nothing but costing money.
  • Orchestrated by PowerShell

The Issues

    Primary Issue

    A deploy is a two-step process.  You need to publish new templates, renderings, or other developer-owned Sitecore items to the content delivery database, and you need to deploy the code that knows how to work with those new templates. No matter how hard you try, you can't do both at exactly the same time. This leaves the possibility of end users seeing server errors.

    Secondary Issues

    There are two secondary issues, both optional to solve, which will be discussed later: the search index and xConnect. These are secondary because they can lead to some potentially annoying results, but not likely a server error.

    Solving these problems

    I’m going to focus on solving the primary problem for simplicity. Note that the diagram below includes steps for search index replication; in this post I’ll focus on blue/green without search index handling.

    (Diagram: Sitecore 9 blue/green model)

    To accomplish many of these tasks we’ll be heavily utilizing Kudu, a REST API suite for Azure App Services.

    Process outline

    How will this process affect specific groups?

    I’m an author

    1. There will be brief downtime in the content management environment
    2. Content management will come back up with the new code and templates
    3. Content editing is allowed
    4. Publishing will send changes to the staging slot content delivery URL (NOTE: if you’re not duplicating a search index, this could impact your end users if your components are sourced by the search index)
    5. Once blue/green completes the changes that you published will be end user facing
    6. Business continues as usual

    I’m a dev-ops professional

    1. Authors have been warned about a brief downtime in CM
    2. Blue/green process is kicked off
    3. CM and CD deployments are done
    4. Alert testers and wait for testing to complete
    5. On a successful test, swap the staging slots to production. On a failed test, wait for a hotfix and deploy to the environments again. On a catastrophic failure, initiate a rollback.
    6. On success, run the finalize step to clean up the unused offline environments
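
    The dev-ops flow above maps onto the step scripts later in this post. As a rough sketch, an orchestration driver could look like the following; the script file names, parameter values, and variable names here are all placeholders, not part of the actual scripts:

```powershell
# Hypothetical driver for the blue/green flow; file names and values are assumptions.
$rg  = "my-resource-group"
$cm  = "my-cm-appservice"
$cd  = "my-cd-appservice"
$sql = "my-sql-server"

.\Step1-CopyCmToStaging.ps1 -ResourceGroupName $rg -AppServiceName $cm
.\Step2-CopyDatabases.ps1   -ResourceGroupName $rg -AppServiceName $cm -CDAppServiceName $cd -DatabaseNameRoot "sc-web" -SqlServerName $sql
.\Step3-CopyCdToStaging.ps1 -ResourceGroupName $rg -AppServiceName $cd
# Steps 4 and 5: run your normal deployment against the staging slots, then test.
.\Step6-SwapSlots.ps1       -ResourceGroupName $rg -AppServiceName $cd
.\Step7-Cleanup.ps1         -ResourceGroupName $rg -AppServiceName $cd -SqlServerName $sql -DatabaseNameRoot "sc-web"
```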

    I’m a tester

    1. Get alerted by dev-ops team that the deployment to the staging slot is complete
    2. Break it
    3. Alert development team an emergency hotfix is needed
    4. Wait for dev-ops team to report the hotfix has been deployed
    5. Test again, no breaking this time
    6. Report to dev-ops team all is well

    The Powershell

    NOTE: These PowerShell functions all require the PowerShell context to have an authenticated Azure connection to perform their tasks. I recommend using a service principal for this.
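
    A minimal sketch of establishing that context with a service principal using the AzureRM module; the `$ServicePrincipal*`, `$TenantId`, and `$SubscriptionId` values are assumed inputs:

```powershell
# Authenticate the PowerShell session as a service principal (AzureRM module).
$secureKey  = ConvertTo-SecureString $ServicePrincipalKey -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($ServicePrincipalID, $secureKey)
Connect-AzureRmAccount -ServicePrincipal -Credential $credential -TenantId $TenantId
Set-AzureRmContext -SubscriptionId $SubscriptionId
```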

    Utility functions

    This set of utility functions is mostly for file I/O with Azure App Services. Save it in a file called “Get-KuduUtility.ps1”.

    function Get-AzureRmWebAppPublishingCredentials($resourceGroupName, $webAppName, $slotName = $null){
    	if ([string]::IsNullOrWhiteSpace($slotName) -or $slotName.ToLower() -eq "production"){
    		$resourceType = "Microsoft.Web/sites/config"
    		$resourceName = "$webAppName/publishingcredentials"
    	}
    	else{
    		$resourceType = "Microsoft.Web/sites/slots/config"
    		$resourceName = "$webAppName/$slotName/publishingcredentials"
    	}
    	$publishingCredentials = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroupName -ResourceType $resourceType -ResourceName $resourceName -Action list -ApiVersion 2015-08-01 -Force
        	return $publishingCredentials
    }
    
    function Get-KuduApiAuthorisationHeaderValue($resourceGroupName, $webAppName, $slotName = $null){
        $publishingCredentials = Get-AzureRmWebAppPublishingCredentials $resourceGroupName $webAppName $slotName
        $ret = @{}
        $ret.header = ("Basic {0}" -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $publishingCredentials.Properties.PublishingUserName, $publishingCredentials.Properties.PublishingPassword))))
        $ret.url = $publishingCredentials.Properties.scmUri
        return $ret
    }
    
    function Get-FileFromWebApp($resourceGroupName, $webAppName, $slotName = [string]::Empty, $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"
    
        Write-Host " Downloading File from WebApp. Source: '$kuduApiUrl'." -ForegroundColor DarkGray
        $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xml"
        $null = Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method GET `
                            -ContentType "multipart/form-data" `
                            -OutFile $tmpPath
        $ret = Get-Content $tmpPath | Out-String
        Remove-Item $tmpPath -Force
        return $ret
    }
    
    function Write-FileToWebApp($resourceGroupName, $webAppName, $slotName = [string]::Empty, $fileContent, $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"
    
        Write-Host " Writing File to WebApp. Destination: '$kuduApiUrl'." -ForegroundColor DarkGray
    
        Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method Put `
                            -ContentType "multipart/form-data" `
                            -Body $fileContent
    }
    
    function Write-ZipToWebApp($resourceGroupName, $webAppName, $slotName = [string]::Empty, $zipFile, $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/zip/site/wwwroot/$kuduPath"
    
        Write-Host " Writing Zip to WebApp. Destination: '$kuduApiUrl'." -ForegroundColor DarkGray
    
        Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method Put `
                            -ContentType "multipart/form-data" `
                            -InFile $zipFile
    }
    
    function Copy-AppServiceToStaging($resourceGroupName, $webAppName){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $KuduStagingAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName "Staging"
        $kuduStagingApiAuthorisationToken = $KuduStagingAuth.header
    #NOTE: you must copy all paths of the webroot that aren't involved in your deployment
    #For example if you also wanted to copy the Sitecore folder you could change this to:
    # @("App_Config", "App_Data", "Sitecore")
        @("App_Config", "App_Data") | ForEach-Object {
            $kuduConfigApiUrl = $KuduAuth.url + "/api/zip/site/wwwroot/$_/"
            $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).zip"
            try{
                $WebClient = New-Object System.Net.WebClient
                $WebClient.Headers.Add("Authorization", $kuduApiAuthorisationToken)
                $WebClient.Headers.Add("ContentType", "multipart/form-data")
    
                $WebClient.DownloadFile($kuduConfigApiUrl, $tmpPath)
    
                $kuduConfigApiUrl = $KuduStagingAuth.url + "/api/zip/site/wwwroot/$_/"
                $kuduApiFolderUrl = $KuduStagingAuth.url + "/api/vfs/site/wwwroot/$_/"
                Invoke-RestMethod -Uri $kuduApiFolderUrl `
                    -Headers @{"Authorization"=$kuduStagingApiAuthorisationToken;"If-Match"="*"} `
                    -Method PUT `
                    -ContentType "multipart/form-data"
                #need a sleep due to a race condition if this folder is utilized too quickly after creating
                Start-Sleep -Seconds 2
                Invoke-RestMethod -Uri $kuduConfigApiUrl `
                    -Headers @{"Authorization"=$kuduStagingApiAuthorisationToken;"If-Match"="*"} `
                    -Method PUT `
                    -ContentType "multipart/form-data" `
                    -InFile $tmpPath
            }finally{
                if (Test-Path $tmpPath){
                    Remove-Item $tmpPath
                }
            }
        }
    }
    function Get-DatabaseNames{
    	param(
    		[Parameter(Mandatory = $true)]
    		[string]$ResourceGroupName,
    		[Parameter(Mandatory = $true)]
    		[string]$AppServiceName,
    		[Parameter(Mandatory = $true)]
    		[string]$DatabaseNameRoot,
    		[string]$SlotName = [string]::Empty
    
    	)
    	$contents = (Get-FileFromWebApp -resourceGroupName $ResourceGroupName -webAppName $AppServiceName -slotName $SlotName -kuduPath "App_Config/ConnectionStrings.config") | Out-String
    	if ($contents.Contains("$DatabaseNameRoot-2")){
    		$ret = @{
    			InactiveDatabase = $DatabaseNameRoot
    			ActiveDatabase = $DatabaseNameRoot + '-2'
    		}
    	}elseif ($contents.Contains("$DatabaseNameRoot")){
    		$ret = @{
    			InactiveDatabase = $DatabaseNameRoot + '-2'
    			ActiveDatabase = $DatabaseNameRoot
    		}
    	}else{
            throw "unable to find $DatabaseNameRoot OR $DatabaseNameRoot-2"
        }
    	return $ret
    }
    
    

    Step 1

    Copy the Production slot to a Staging slot

    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName
    )
    . "$PSScriptRoot\Get-KuduUtility.ps1"
    $existingSlot = Get-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Staging" -ErrorAction SilentlyContinue
    if ($null -ne $existingSlot){
        write-host "Removing existing Staging slot"
        Remove-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Staging" -Force
        Start-Sleep -s 10
    }
    $slot = Get-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Production"
    New-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Staging" -AppServicePlan $slot.ServerFarmId
    
    Copy-AppServiceToStaging -ResourceGroupName $ResourceGroupName -WebAppName $AppServiceName
    

    Step 2

    Make copies of all of your content delivery databases and wire your CM environment to the new databases

    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName,
        [Parameter(Mandatory=$true)]
        [string]$CDAppServiceName,
        [string]$SlotName = [string]::Empty,
        [Parameter(Mandatory=$true)]
        [string]$DatabaseNameRoot,
        [Parameter(Mandatory=$true)]
        [string]$SqlServerName
    )
    
    . "$PSScriptRoot\Get-KuduUtility.ps1"
    $contents = (Get-FileFromWebApp -resourceGroupName $ResourceGroupName -webAppName $AppServiceName -slotName $SlotName -kuduPath "App_Config/ConnectionStrings.config") | Out-String
    $db = Get-DatabaseNames -ResourceGroupName $ResourceGroupName -AppServiceName $CDAppServiceName -DatabaseNameRoot $DatabaseNameRoot -SlotName $SlotName
    $contents = $contents.Replace("Catalog=$($db.ActiveDatabase);", "Catalog=$($db.InactiveDatabase);")
    
    
    $tst = Get-AzureRmSqlDatabase -DatabaseName $db.InactiveDatabase -ServerName $SqlServerName -ResourceGroupName $ResourceGroupName -ErrorAction SilentlyContinue
    if ($null -ne $tst){
        throw "Unable to copy database: the CM environment is referencing $($db.ActiveDatabase) and $($db.InactiveDatabase) already exists.  Make sure that both the tenant CD and the CM environment are using the same database before this operation, then delete the unused database and try again."
    }
    $tst = Get-AzureRmSqlDatabase -DatabaseName $db.ActiveDatabase -ServerName $SqlServerName -ResourceGroupName $ResourceGroupName -ErrorAction SilentlyContinue
    write-host "Copying database $($db.ActiveDatabase) to $($db.InactiveDatabase)"
    $parameters = @{
        ResourceGroupName = $ResourceGroupName
        DatabaseName = $db.ActiveDatabase
        ServerName = $SqlServerName
        CopyResourceGroupName = $ResourceGroupName
        CopyServerName = $SqlServerName
        CopyDatabaseName = $db.InactiveDatabase
    }
    if (-not [string]::IsNullOrWhitespace($tst.ElasticPoolName)){
        $parameters["ElasticPoolName"] = $tst.ElasticPoolName
    }
    New-AzureRmSqlDatabaseCopy @parameters
    Write-FileToWebApp -resourceGroupName $ResourceGroupName -webAppName $AppServiceName -fileContent $contents -slotName $SlotName -kuduPath "App_Config/ConnectionStrings.config"
    
    

    NOTE: This accesses the CD app service to determine the offline database. It toggles between {name} and {name}-2
    NOTE: DatabaseNameRoot refers to the database name without the -2 on it.
    NOTE: This assumes the assets all share a resource group, if that’s not true, add some more parameters

    Step 3

    Copy all content delivery app services to a Staging slot.

    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName
    )
    . "$PSScriptRoot\Get-KuduUtility.ps1"
    $existingSlot = Get-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Staging" -ErrorAction SilentlyContinue
    if ($null -ne $existingSlot){
        write-host "Removing existing Staging slot"
        Remove-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Staging" -Force
        Start-Sleep -s 10
    }
    $slot = Get-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Production"
    New-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Staging" -AppServicePlan $slot.ServerFarmId
    
    Copy-AppServiceToStaging -ResourceGroupName $ResourceGroupName -WebAppName $AppServiceName
    

    NOTE: This assumes that your deployment process deploys all of Sitecore except things in App_Data and environment specific App_Config config files like ConnectionStrings.config

    Step 4

    Execute a deploy as you normally would while targeting the production slot for CM and the staging slot for CD.

    Step 5

    Test your changes on CM and the CD Staging slots

    NOTE: these scripts assume that your deployment process handles all non-dynamically generated assets. So things like the Sitecore folder would be included in your deployment process whereas things like your license.xml or ConnectionStrings.config would not be. These things would be handled by the app service copy in Step 3
    NOTE: if you need to hotfix, you can repeat step 4

    Step 6

    Swap production slot and staging slots in your CD app services

    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName,
        [string]$SlotName = "Staging"
    )
    
    Switch-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -DestinationSlotName "Production" -SourceSlotName $SlotName
    

    NOTE: At this time your change is live and there has been no downtime

    Step 7

    Clean up.
    It’s important to note that you can delay this step to keep a rapid rollback available: before completing this step, swapping the slots again gives you a rollback in seconds.
    Remove the old content delivery databases.

    
    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName,
        [Parameter(Mandatory=$true)]
        [string]$SqlServerName,
        [Parameter(Mandatory=$true)]
        [string]$DatabaseNameRoot,
        [switch]$DeleteActive = $false,
        [string]$SlotName = ""
    )
    . "$PSScriptRoot\Get-KuduUtility.ps1"
    
    $db = Get-DatabaseNames -ResourceGroupName $ResourceGroupName -AppServiceName $AppServiceName -DatabaseNameRoot $DatabaseNameRoot -SlotName $SlotName
    
    if ($DeleteActive){
        Remove-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $SqlServerName -DatabaseName $db.ActiveDatabase -Force
    }else{
        Remove-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $SqlServerName -DatabaseName $db.InactiveDatabase -Force
    }
    

    Remove the staging slots for each environment, CD app services and CM app service

    param(
        [string]$ResourceGroupName,
        [string]$AppServiceName,
        [string]$SlotName
    )
    
    Remove-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot $SlotName -Force
    

    That’s it, you’re done and there was no downtime, feels good doesn’t it?

    What if we need to roll back?

    If you’ve gotten to step 7 and find that the code is flawed and can’t be hotfixed, or you need an emergency content change, this is how you roll back.

    Step 1

    Swap the CM staging slot to production (this will contain the database connections to the old database)

    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName,
        [string]$SlotName = "Staging"
    )
    
    Switch-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -DestinationSlotName "Production" -SourceSlotName $SlotName
    
    

    Step 2

    Then delete the new databases you created in step 2 of the deployment process

    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName,
        [Parameter(Mandatory=$true)]
        [string]$SqlServerName,
        [Parameter(Mandatory=$true)]
        [string]$DatabaseNameRoot,
        [switch]$DeleteActive = $false,
        [string]$SlotName = [string]::Empty
    )
    . "$PSScriptRoot\Get-KuduUtility.ps1"
    
    $db = Get-DatabaseNames -ResourceGroupName $ResourceGroupName -AppServiceName $AppServiceName -DatabaseNameRoot $DatabaseNameRoot -SlotName $SlotName
    
    if ($DeleteActive){
        Remove-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $SqlServerName -DatabaseName $db.ActiveDatabase -Force
    }else{
        Remove-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $SqlServerName -DatabaseName $db.InactiveDatabase -Force
    }
    

    Step 3

    Then remove the staging slots

    
    param(
        [string]$ResourceGroupName,
        [string]$AppServiceName,
        [string]$SlotName
    )
    
    Remove-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot $SlotName -Force
    

    Step 4

    Finally you’d run either a TDS sync or Unicorn sync to get your developer owned assets back to pre-deployment state.

    Solving secondary issues

    Search index

    For the search index you’ll want to treat it the same way we treated the databases. At the start of the blue/green process, create a new index as a clone of the production-facing one, and rewire the CM environment and CD staging slots to use the new search index.

    You would do this if you use the search index to source content for end users, most commonly in a site search. The issue is that if you add pages to your offline web database, the search index would pick those up and possibly return links to end users that are 404s, or, if the code tries to fetch the Sitecore item, you may end up with null reference exceptions.

    This is optional because with proper governance you can mitigate the risk. Things like content freezes or publishing freezes would work fine.

    With this safely implemented you could potentially lift any content author freezes, as long as the authors know that during a blue/green deployment their changes are published to the offline environment until the swap is completed.

    XConnect

    You would want two parallel xConnect environments: one that’s always customer facing and one that never is. During a blue/green deployment, point your CM environment and the CD staging slots at the always-offline xConnect environment. Then, immediately before the swap, rewire the CM and CD staging slots to point to the always-online xConnect environment.

    You would do this if you wanted to be certain that there was no testing data in XConnect.

    This is optional because most people wouldn’t mind a bit of testing data in their analytics data.
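
    The rewiring can reuse the Kudu utilities from earlier in this post. A hedged sketch, in which the xConnect host names and variable values are made up for illustration:

```powershell
# Sketch: repoint a CD staging slot at the offline xConnect instance by rewriting
# its connection strings. Host names and variables here are illustrative assumptions.
. "$PSScriptRoot\Get-KuduUtility.ps1"
$path = "App_Config/ConnectionStrings.config"
$contents = Get-FileFromWebApp -resourceGroupName $rg -webAppName $cdApp -slotName "Staging" -kuduPath $path
$contents = $contents.Replace("xconnect-online.azurewebsites.net", "xconnect-offline.azurewebsites.net")
Write-FileToWebApp -resourceGroupName $rg -webAppName $cdApp -slotName "Staging" -fileContent $contents -kuduPath $path
```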

    SXA Tenant Specific Field Validation

    SXA includes a large number of extremely helpful modules, and it would be wise to utilize as many as you can.  However, a problem arises if you have multiple tenants or sites that want to apply different field validation rules to fields that come from a shared SXA base template.

    An extra huge thanks to @Sitecorey for a huge amount of help in working through this problem.

    By default validation rules are applied on the template field item under the template.  This means that every other template that inherits yours will automatically get the validation rules applied to it.

    Installation instructions and full source

    Download the Sitecore package

    My solution is to optionally pull the validation rule definitions out of the template field and into a global library of items that contain a template-to-template-field mapping. The field can be defined in the template itself or in any of its base templates, those templates’ base templates, and so on.

    (Diagram: global field validator structure)

    An Example

    I have two tenants, both using the SEO Metadata module to get keywords and page description fields on their page template. Using this technique I was able to have one tenant define a 125-character limit while the other tenant has no validation at all. This was done by specifying the base template for pages as the template target and the SXA meta description field as the field. Even though the template doesn’t directly define this field, we’re still able to apply validation to it.
    (Screenshot: field validation item)

    How it’s done

    The magic is done by overriding the default validation manager and adding functionality on top of it.  Basically what we want to do is augment the default functionality of the validator by looking into the library of global validators defined in the settings section of our SXA site.  To do that we have to follow a few steps:

    1. Get the applicable site’s global field validator definitions root.
    2. Grab all the validator definitions from under that root.
    3. Build validators for all the definitions
    4. Return those validators
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using Sitecore.Abstractions;
    using Sitecore.CodeDom.Scripts;
    using Sitecore.Data.Fields;
    using Sitecore.Data.Items;
    using Sitecore.Data.Validators;
    using Sitecore.XA.Foundation.Multisite;
    
    namespace JeffDarchuk.Foundation.ContentValidation
    {
    	public class GlobalFieldValidatorManager : DefaultValidatorManager
    	{
    		private readonly IMultisiteContext _multisiteContext;
    		private readonly BaseTemplateManager _templateManager;
    
    		public GlobalFieldValidatorManager(BaseItemScripts itemScripts, IMultisiteContext multisiteContext, BaseTemplateManager templateManager) : base(itemScripts)
    		{
    			_multisiteContext = multisiteContext ?? throw new ArgumentNullException(nameof(multisiteContext));
    			_templateManager = templateManager ?? throw new ArgumentNullException(nameof(templateManager));
    		}
    
    		public override ValidatorCollection BuildValidators(ValidatorsMode mode, Item item)
    		{
    			var validators = base.BuildValidators(mode, item);
    			var globalFieldRulesFolder = GetGlobalFieldRulesFolder(item);
    			if (globalFieldRulesFolder == null) return validators;
    			foreach (var validator in GetAdditionalValidators(item, globalFieldRulesFolder, mode))
    			{
    				validators.Add(validator);
    			}
    			return validators;
    		}
    
    		private Item GetGlobalFieldRulesFolder(Item item)
    		{
    			return _multisiteContext.GetSettingsItem(item)?.Children.FirstOrDefault(x =>
    				x.TemplateID.ToString() == Templates.GlobalFieldRuleFolder.Id);
    		}
    
    		private IEnumerable<BaseValidator> GetAdditionalValidators(Item item, Item globalFieldRulesFolder, ValidatorsMode mode)
    		{
    			var baseTemplates = new HashSet<string>(_templateManager.GetTemplate(item).GetBaseTemplates().Select(x => x.ID.ToString()));
    			foreach (var globalFieldRule in GetGlobalFieldRules(globalFieldRulesFolder))
    			{
    				var template = globalFieldRule[Templates.GlobalFieldRule.Fields.Template];
    				if (!FieldRuleAppliesToItem(item, globalFieldRule, template, baseTemplates)) continue;
    				MultilistField validators = null;
    				switch (mode)
    				{
    					case ValidatorsMode.Gutter:
    						validators = globalFieldRule.Fields[Templates.FieldTypeValidationRules.Fields.QuickValidationBar];
    						break;
    					case ValidatorsMode.ValidateButton:
    						validators = globalFieldRule.Fields[Templates.FieldTypeValidationRules.Fields.ValidateButton];
    						break;
    					case ValidatorsMode.ValidatorBar:
    						validators = globalFieldRule.Fields[Templates.FieldTypeValidationRules.Fields.ValidatorBar];
    						break;
    					case ValidatorsMode.Workflow:
    						validators = globalFieldRule.Fields[Templates.FieldTypeValidationRules.Fields.Workflow];
    						break;
    				}
    				foreach (var validator in validators?.GetItems() ?? Enumerable.Empty<Item>())
    				{
    					var baseValidator = BuildValidator(validator, item);
    					baseValidator.FieldID = item.Fields[globalFieldRule[Templates.GlobalFieldRule.Fields.Field]].ID;
    					yield return baseValidator;
    				}
    			}
    		}
    
    		private IEnumerable<Item> GetGlobalFieldRules(Item globalFieldRulesFolder)
    		{
    			return globalFieldRulesFolder.Axes.GetDescendants().Where(x => x.TemplateID.ToString() == Templates.GlobalFieldRule.Id);
    		}
    
    		private bool FieldRuleAppliesToItem(Item item, Item globalFieldRule, string template, HashSet<string> baseTemplates)
    		{
    			var useInheritedTemplates = ((CheckboxField)globalFieldRule.Fields[Templates.GlobalFieldRule.Fields.ApplyToInheritedTemplates]).Checked;
    			return item.TemplateID.ToString() == template || useInheritedTemplates && baseTemplates.Contains(template);
    		}
    	}
    }
    

    The end result is that we get to define validation rules on whatever template we wish, even if that template doesn’t directly define the field but only inherits it.
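
    For the custom manager to take effect, it has to replace the default implementation in Sitecore’s dependency injection container. One way to do that is a config patch along these lines; the exact service registration below is an assumption, so verify the serviceType against your Sitecore version before using it:

```xml
<!-- Hypothetical patch file; confirm the BaseValidatorManager registration for your Sitecore version -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <services>
      <register patch:instead="register[@serviceType='Sitecore.Abstractions.BaseValidatorManager, Sitecore.Kernel']"
                serviceType="Sitecore.Abstractions.BaseValidatorManager, Sitecore.Kernel"
                implementationType="JeffDarchuk.Foundation.ContentValidation.GlobalFieldValidatorManager, JeffDarchuk.Foundation.ContentValidation" />
    </services>
  </sitecore>
</configuration>
```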

    Ensure all projects use the same nuget versions

    If you’re using NuGet packages in a Sitecore Helix solution, it’s very easy to inadvertently use different versions of a NuGet package between separate Helix layers. This can cause very strange and hard-to-diagnose issues. However, with a little validation we can avoid this problem entirely before it hits your webroot. By applying a small PowerShell validation method to your publish process, we can get clear and concise output describing the problem.

    function NuGetPackageValidation {
    	param(
    		[string]$solutionPath
    	)
    	write-host "Beginning Nuget validation."
    	$tracker = @{}
    	Get-ChildItem (split-path $solutionPath) -recurse packages.config | ForEach-Object {
    	  $fullFileName = $_.FullName
    	  $csProjName = Split-Path (Resolve-Path "$(Split-Path $fullFileName)\*.csproj").Path -Leaf
    	  [xml]$curConfigFile = Get-Content $fullFilename
    	  $curConfigFile.packages.package | ForEach-Object {
    		  if ($null -eq $tracker[$_.id]){
    			$tracker[$_.id] = @{
    				id = $_.id
    				versions = @{}
    				}
    			$tracker[$_.id].versions[$_.version] = @{
    				version = $_.version
    				project = @($csProjName)
    				}
    			}
    			elseif ($null -eq $tracker[$_.id].versions[$_.version]){
    				$tracker[$_.id].versions[$_.version] = @{
    					version = $_.version
    					project = @($csProjName)
    					}
    			}else{
    				$tracker[$_.id].versions[$_.version].project += $csProjName
    			}
    	  }
    	}
    	$ret = $true
    	$tracker.Keys | ForEach-Object {
    		if ($tracker[$_].versions.Count -gt 1){
    			if ($ret){
    				Write-Host "Problems found with Nuget packages, ensure that the same Nuget package versions are used across projects."
    			}
    			$ret = $false
    			Write-Host @"
    ----------------------------------
      $_
    ----------------------------------
    "@
    			$versions = $tracker[$_].versions
    			$versions.Keys | ForEach-Object{
    				Write-Host $versions[$_].version
    				$versions[$_].project | ForEach-Object {
    					Write-Host "      $_"
    				}
    				Write-Host ""
    			}
    		}
    	}
    	return $ret
      }
    

    You can then take the result of this function and halt the build, with detailed output showing which projects are involved in the mismatch. You can expect to see results like this:

    (Screenshot: example NuGet validation output)

    Using this data, you can easily track down and fix version mismatches before they become hard-to-diagnose problems.
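
    Wired into a publish step, the function’s boolean return value can halt the build. For example (the dot-sourced file name and solution path are placeholders):

```powershell
# Hypothetical publish-step guard; file name and solution path are placeholders.
. "$PSScriptRoot\NuGetPackageValidation.ps1"
if (-not (NuGetPackageValidation -solutionPath "C:\src\MySolution\MySolution.sln")) {
    throw "NuGet package version mismatch detected across Helix projects; aborting publish."
}
```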

    Transform configs on Azure

    Giving your release process the ability to manipulate configuration files on the server that aren’t directly controlled by your source code can be a powerful tool.

    The primary use cases I’ve had for this are rewiring connection strings on the fly at deployment time, and deploying the stock Sitecore web.config unmodified while managing it through remote transforms in the release process. Whatever your devops constraints are, this technique could come in handy.

    The technique is fairly simple:

  • Connect to Azure, here we’re using a service principal
  • Generate kudu credentials from publishsettings
  • Download xml file
  • Transform file using XDTs and optionally tokens
  • Upload xml file back to app service
  • Note: This requires the loading of the Microsoft.Web.XmlTransform.dll DLL, so make sure that’s available.
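
    The transform step itself (apply the XDTs, optionally after token replacement) can be sketched with the Microsoft.Web.XmlTransform types. This is a minimal sketch assuming the DLL sits next to the script and that tokens are simple string replacements; the function name and token handling are my own illustration, not part of the original script:

```powershell
# Minimal XDT transform sketch; assumes Microsoft.Web.XmlTransform.dll is alongside the script.
Add-Type -Path "$PSScriptRoot\Microsoft.Web.XmlTransform.dll"

function Invoke-XdtTransform([string]$xmlContent, [string]$xdtPath, [hashtable]$tokens) {
    # Optional token replacement before the XDT is applied (assumed convention)
    if ($null -ne $tokens) {
        foreach ($key in $tokens.Keys) {
            $xmlContent = $xmlContent.Replace($key, $tokens[$key])
        }
    }
    $doc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument
    $doc.PreserveWhitespace = $true
    $doc.LoadXml($xmlContent)
    $transform = New-Object Microsoft.Web.XmlTransform.XmlTransformation($xdtPath)
    if (-not $transform.Apply($doc)) {
        throw "XDT transform failed: $xdtPath"
    }
    return $doc.OuterXml
}
```

    The returned XML string can then be pushed back to the app service with the Write-FileToWebApp utility below.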

    param(
        [string]$KuduPath,
        [string[]]$XDTs,
        [string]$TenantId,
        [string]$SubscriptionId,
        [string]$ResourceGroupName,
        [string]$WebAppServiceName,
        [string]$SlotName = "",
        [string]$ServicePrincipalID,
        [string]$ServicePrincipalKey,
        [hashtable]$Tokens
    )
    
    function Get-AzureRmWebAppPublishingCredentials($resourceGroupName, $webAppName, $slotName = $null){
    	if ([string]::IsNullOrWhiteSpace($slotName)){
    		$resourceType = "Microsoft.Web/sites/config"
    		$resourceName = "$webAppName/publishingcredentials"
    	}
    	else{
    		$resourceType = "Microsoft.Web/sites/slots/config"
    		$resourceName = "$webAppName/$slotName/publishingcredentials"
    	}
    	$publishingCredentials = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroupName -ResourceType $resourceType -ResourceName $resourceName -Action list -ApiVersion 2015-08-01 -Force
    	return $publishingCredentials
    }
    
    function Get-KuduApiAuthorisationHeaderValue($resourceGroupName, $webAppName, $slotName = $null){
        $publishingCredentials = Get-AzureRmWebAppPublishingCredentials $resourceGroupName $webAppName $slotName
        $ret = @{}
        $ret.header = ("Basic {0}" -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $publishingCredentials.Properties.PublishingUserName, $publishingCredentials.Properties.PublishingPassword))))
        $ret.url = $publishingCredentials.Properties.scmUri
        return $ret
    }
    
    function Get-FileFromWebApp($resourceGroupName, $webAppName, $slotName = "", $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"
    
        Write-Host " Downloading File from WebApp. Source: '$kuduApiUrl'." -ForegroundColor DarkGray
        $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xml"
        $null = Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method GET `
                            -ContentType "multipart/form-data" `
                            -OutFile $tmpPath
        $ret = Get-Content $tmpPath | Out-String
        Remove-Item $tmpPath -Force
        return $ret
    }
    function Write-FileToWebApp($resourceGroupName, $webAppName, $slotName = "", $fileContent, $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"
    
        Write-Host " Writing File to WebApp. Destination: '$kuduApiUrl'." -ForegroundColor DarkGray
    
        Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method Put `
                            -ContentType "multipart/form-data" `
                            -Body $fileContent
    }
    function Update-XmlDocTransform($xml, $xdt, $tokens)
    {
        Add-Type -LiteralPath "$PSScriptRoot\Microsoft.Web.XmlTransform.dll"
    
        $xmldoc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument;
        $xmldoc.PreserveWhitespace = $true
        $xmldoc.LoadXml($xml);
        $useTokens = $false
        if ($tokens -ne $null -and $tokens.Count -gt 0){
            $useTokens = $true
            $sb = [System.Text.StringBuilder]::new((Get-Content -Path $xdt))
            $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xdt"
            $tokens.Keys | ForEach-Object{
                $null = $sb.Replace($_, $tokens[$_])
            }
            Set-Content -Path $tmpPath -Value $sb.ToString()
            $xdt = $tmpPath
        }
        
        $transf = New-Object Microsoft.Web.XmlTransform.XmlTransformation($xdt);
        if ($transf.Apply($xmldoc) -eq $false)
        {
            throw "Transformation failed."
        }
        if ($useTokens){
            Remove-Item -Path $xdt -Force
        }
        return $xmldoc.OuterXml
    }
    $Credential = New-Object -TypeName PSCredential($ServicePrincipalID, (ConvertTo-SecureString -String $ServicePrincipalKey -AsPlainText -Force))
    
    
    # Connect to Azure using SP
    $connectParameters = @{
        Credential     = $Credential
        TenantId       = $TenantId
        SubscriptionId = $SubscriptionId
    }
    
    Write-Host 'Connecting to Azure.'
    
    $null = Add-AzureRmAccount @connectParameters -ServicePrincipal
    
    $contents = Get-FileFromWebApp `
        -resourceGroupName $ResourceGroupName `
        -webAppName $WebAppServiceName `
        -slotName $SlotName `
        -kuduPath $KuduPath
    $XDTs | Foreach-Object{
        $contents = Update-XmlDocTransform -xml $contents -xdt "$PSScriptRoot\XDT\$_.xdt" -tokens $Tokens
    }
    Write-FileToWebApp `
        -resourceGroupName $ResourceGroupName `
        -webAppName $WebAppServiceName `
        -slotName $SlotName `
        -kuduPath $KuduPath `
        -fileContent $contents
    

    Additionally, with some simple adaptations of this code, you can use Kudu to perform whatever kind of file manipulation you want using a similar technique.
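For example, the Kudu VFS API returns a JSON directory listing when the request path ends with a trailing slash, so the same plumbing can enumerate whole folders. A minimal sketch (the function name is mine; the auth header and scm URL come from `Get-KuduApiAuthorisationHeaderValue` in the script above):

```powershell
# Hypothetical helper: list the contents of a folder under wwwroot via Kudu.
# $kuduAuthHeader / $scmUrl are the .header / .url values produced by
# Get-KuduApiAuthorisationHeaderValue in the script above.
function Get-WebAppDirectoryListing($kuduAuthHeader, $scmUrl, $kuduPath){
    # The trailing slash asks Kudu for a directory listing instead of file content
    $kuduApiUrl = "$scmUrl/api/vfs/site/wwwroot/$kuduPath/"
    Invoke-RestMethod -Uri $kuduApiUrl `
                      -Headers @{"Authorization"=$kuduAuthHeader} `
                      -Method GET
}

# Example usage (requires a live app service):
# Get-WebAppDirectoryListing $KuduAuth.header $KuduAuth.url "App_Config" | Select-Object name, size
```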

    Tokenize your XDTs with Powershell

    XDTs have gotten a bad rep over the years for being difficult to use and hard to understand. Despite that, they're still the most reliable and consistent way to transform configurations. I've come up with a way to tokenize XDTs so they can be used more flexibly.

    For example, say we have different cookie domains per environment that we want to patch in and out.
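A tokenized XDT for that case might look like this (the `_CookieDomain_` token is a hypothetical placeholder that the script below swaps in from the `-Tokens` hashtable before the transform runs):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.web>
    <httpCookies domain="_CookieDomain_" xdt:Transform="SetAttributes(domain)" />
  </system.web>
</configuration>
```

Passing `-Tokens @{_CookieDomain_=".mysite.com"}` would stamp the per-environment domain into the XDT before it is applied.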

    note: This code requires Microsoft.Web.XmlTransform.dll to be in the same folder as the PowerShell script

    param(
        [string]$Path,
        [string[]]$XDTs,
        [hashtable]$Tokens
    )
    
    function Update-XmlDocTransform($xml, $xdt, $tokens)
    {
        Add-Type -LiteralPath "$PSScriptRoot\Microsoft.Web.XmlTransform.dll"
    
        $xmldoc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument;
        $xmldoc.PreserveWhitespace = $true
        $xmldoc.LoadXml($xml);
        $useTokens = $false
        if ($tokens -ne $null -and $tokens.Count -gt 0){
            $useTokens = $true
            $sb = [System.Text.StringBuilder]::new((Get-Content -Path $xdt))
            $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xdt"
            $tokens.Keys | ForEach-Object{
                $null = $sb.Replace($_, $tokens[$_])
            }
            Set-Content -Path $tmpPath -Value $sb.ToString()
            $xdt = $tmpPath
        }
        
        $transf = New-Object Microsoft.Web.XmlTransform.XmlTransformation($xdt);
        if ($transf.Apply($xmldoc) -eq $false)
        {
            throw "Transformation failed."
        }
        if ($useTokens){
            Remove-Item -Path $xdt -Force
        }
        return $xmldoc.OuterXml
    }
    
    $contents = Get-Content $Path | Out-String
    $XDTs | Foreach-Object{
        $contents = Update-XmlDocTransform -xml $contents -xdt $_ -tokens $Tokens
    }
    Set-Content $Path -Value $contents
    

    Here is an example usage:

    LocalXmlTransform.ps1 -Path "C:\inetpub\wwwroot\sc901.local\web.config" -XDTs "C:\xdt\AddBindingRedirects.xdt","C:\xdt\AddSessionCookie.xdt" -Tokens @{_ShareSessionCookie_="mysite.local";_RedirectName_="mydependency"}

    In this example we’re running two XDT files against the web.config and replacing a couple of tokens in the XDT.

    Here is an example of an XDT with tokens to ensure a connection string exists:

    <?xml version="1.0" encoding="utf-8"?>
    <connectionStrings xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform" xmlns:asmv1="urn:schemas-microsoft-com:asm.v1">
      <add name="_name_" xdt:Transform="Remove" xdt:Locator="Match(name)" />
      <add name="_name_" xdt:Transform="InsertIfMissing" xdt:Locator="Match(name)" connectionString="Encrypt=True;TrustServerCertificate=False;Data Source=_fqdn_;Initial Catalog=_databasename_;User Id=_username_;Password=_password_;" />
    </connectionStrings>
    

    To use this XDT your parameters would look something like this:

    LocalXmlTransform.ps1 -Path "C:\inetpub\wwwroot\App_Config\ConnectionStrings.config" -XDTs "C:\xdt\EnsureConnectionString.xdt" -Tokens @{_name_="mySpecialDatabase";_fqdn_="myazurestuff.database.windows.net,1433";_databasename_="specialdatabase";_username_="secretuser";_password_="secretpassword"}

    Hopefully this will help your devops process.

    PAAS Sitecore 9 with an ASE ARM template errors

    ase
    Sitecore 9 works great in PAAS and the ARM templates are an enormous help. However, if you're like me and need to use an ASE, you'll find that your deployments regularly and mysteriously fail. I pored over the ARM templates searching for any reason this might be happening. After about a month I accepted the unfortunate truth: Azure was reporting success before it should.

    I started pulling apart the templates searching for more information, and built a custom PowerShell harness to manage the ARM template parameters.

    The errors originated from the application deployments. These are the parts that use web deploy to restore databases, create users, and push files to your Sitecore server.

    How to stabilize the arm templates

    For this I will assume that you've already added the hostingEnvironmentProfile parameter to the Microsoft.Web/sites ARM resources.
    Warning: this process is very time consuming.
    The first step is to pull the templates apart. I was able to achieve a high success rate by doing the following:

    1. Take the main azuredeploy.json and remove all of the resources; we're going to execute them manually
    2. In each nested ARM template JSON file, make sure the required parameters are defined in the parameters and variables sections; you can refer back to azuredeploy.json for how these should be set up
    3. The application.json file is the primary culprit causing our failures. We need to split this one up just like we did azuredeploy.json, except this time we're creating a new ARM template JSON file for each of the 4 web deploy deployments that reside in application.json
    4. Now that we have the ARM templates separated into their individual parts, we need to create a new PowerShell wrapper for the process
    5. Note: for security reasons I'm largely omitting things of a sensitive nature here. Make sure you apply user names and passwords to your input parameters, either in a parameters.json or in the parameters PowerShell hashtable as described below

      PowerShell Magic

      powershell
      You can find the scripts here.

      Due to the lack of a central ARM template to orchestrate parameters, we need to do that ourselves. This comes in a few steps:

      1. Populate all starting parameters in a hashtable; see Execute.ps1 for an example. Note that you will need to pass in several more parameters, or you can include them in a parameters.json that's loaded here
      2. Scan the ARM templates and gather their accepted parameters, as they won't take any extras. See Get-ValidParameters in Utilities.ps1
      3. Based on each ARM template, gather up the parameters needed for the deployment and generate a new hashtable of parameters and their values. See Get-Parameters in Utilities.ps1
      4. Execute the ARM template using a modified version of Sitecore's ARM template execution code. See Start-SitecoreAzureDeployment in Utilities.ps1
      5. After completion, extract the populated parameters and outputs and save them using Get-ValidParameters from Utilities.ps1
      6. Repeat until finished. You can see how the ARM templates are ordered here.
        Note: depending on your specific case, you may need to adjust the timing between deployments if some need more time to settle
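The parameter filtering and output forwarding can be sketched roughly like this. `Select-DeclaredParameters`, `Invoke-OrderedDeployment`, and the template names are hypothetical stand-ins for Get-ValidParameters / Get-Parameters / Start-SitecoreAzureDeployment in the linked scripts, and the deployment call uses the stock AzureRM cmdlet rather than Sitecore's modified code:

```powershell
# Hypothetical helper: ARM rejects undeclared parameters, so only pass
# the subset of the running parameter hashtable the template declares.
function Select-DeclaredParameters([hashtable]$AllParameters, [string[]]$DeclaredNames){
    $filtered = @{}
    foreach ($key in $AllParameters.Keys) {
        if ($DeclaredNames -contains $key) { $filtered[$key] = $AllParameters[$key] }
    }
    return $filtered
}

# Hypothetical orchestration loop over the split-out templates, in order.
function Invoke-OrderedDeployment([string[]]$Templates, [hashtable]$AllParameters, [string]$ResourceGroupName){
    foreach ($template in $Templates) {
        # Read which parameters this template declares
        $declared = (Get-Content $template -Raw | ConvertFrom-Json).parameters.PSObject.Properties.Name
        $deployParams = Select-DeclaredParameters $AllParameters $declared
        $result = New-AzureRmResourceGroupDeployment -ResourceGroupName $ResourceGroupName `
            -TemplateFile $template -TemplateParameterObject $deployParams
        # Feed outputs forward so later templates can consume them
        foreach ($key in $result.Outputs.Keys) {
            $AllParameters[$key] = $result.Outputs[$key].Value
        }
    }
}
```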

    Sitecore Helix Powershell Filewatch

    Most likely if you’re developing with Sitecore you have your webroot and your source separated and publish to your site with webdeploy or some other kind of publishing technology.  This is a fine way to do it, but it’s far easier if it just happens automatically.  that’s what these scripts aim to do with Powershell!

    Filewatch

    Here is a zip file for the completed solution that is covered below: HelixFileWatcher. Or check out the source: View on GitHub

    First we need to define a few parameters:

    #where your solution is
    $SourceDirectory = "D:\Source\SitecoreSource\SolutionRoot"
    #where your webroot is
    $DeployTargetWebPath = "C:\inetpub\wwwroot\sc90.local"
    
    

    Next we define how files are moved from your solution to the webroot. This is done through a hashtable mapping a file extension to a script block. Note that the views are being deployed to a “Demo” MVC Area as an example.

    $global:FileWatchActions = @{}
    function Get-ProjectRoot{
    	param(
    		[string]$Path
    	)
    	if ($path -eq [string]::Empty){
    		return [string]::Empty
    	}
    	if (-Not (Test-Path $Path)){
    		return Get-ProjectRoot -Path (split-Path $Path)
    	}
    	$PathItem = Get-Item -Path $Path
    	if (-Not ($PathItem -is [System.IO.DirectoryInfo])){
    		return Get-ProjectRoot -Path (Split-Path $Path)
    	}
    	if ((resolve-path "$Path\*.csproj").Count -gt 0){
    		return $Path
    	}elseif($PathItem.Parent -ne $null){
    		return Get-ProjectRoot -Path $PathItem.Parent.FullName
    	}
    	return [string]::Empty
    }
    function Copy-ItemToWebroot{
    	param(
    		$Path,
    		$OldPath,
    		$Delete,
    		$Index,
    		$IntermediatePath
    	)
    	if ($Index -lt 0){
    		return
    	}
    	
    	$TargetPath = $DeployTargetWebPath + $IntermediatePath + $Path.Substring($Index)
    	if ($Delete -and (Test-Path $TargetPath)){
    		write-host "Removing file $TargetPath" -ForegroundColor Red
    		Remove-Item $TargetPath -Force -Recurse
    	}elseif (-Not (Test-Path $Path) -and (Test-Path $TargetPath)){
    		write-host "Removing file $TargetPath" -ForegroundColor Red
    		Remove-Item $TargetPath -Force -Recurse
    	}elseif(Test-Path $Path){
    		if ($OldPath -ne [string]::Empty){
    			$OldTargetPath = $DeployTargetWebPath + $IntermediatePath + $OldPath.Substring($Index)
    			if ((Test-Path $OldTargetPath) -and ((Split-Path $Path) -eq (Split-Path $OldPath) )){
    				$newName = Split-Path $Path -Leaf -Resolve
    				write-host "Renaming Item" -ForegroundColor Yellow
    				write-host "    $OldTargetPath" -ForegroundColor Yellow
    				write-host "    =>$TargetPath" -ForegroundColor Yellow
    				Rename-Item $OldTargetPath $newName -Force
    				return
    			}
    		}
    		if (-Not (Test-Path $TargetPath) -or (Compare-Object (ls $Path) (ls $TargetPath) -Property Name, Length, LastWriteTime)){
    			write-host "Copying Item" -ForegroundColor Green
    			write-host "    $Path" -ForegroundColor Green
    			write-host "    =>$TargetPath" -ForegroundColor Green
    			New-Item -Path "$(Split-Path $TargetPath)" -ItemType Directory -Force
    			Copy-Item -Path $Path -Destination $TargetPath -Recurse -Force
    		}
    	}
    }
    
    #Add watcher action configurations
    #Based on extension define how to process the files that are changed
    $global:FileWatchActions.Add(".cshtml", {
    	param(
    		$Path,
    		$OldPath,
    		$Delete
    	)
    	$index = $Path.IndexOf("\Views", 5)
    	Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index -IntermediatePath "\Areas\Demo"
    } )
    
    $global:FileWatchActions.Add(".config", {
    	param(
    		$Path,
    		$OldPath,
    		$Delete
    	)
    	$index = $Path.IndexOf("\App_Config\Include", 5)
    	Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index
    	if ($index -eq -1){
    		$fileName = Split-Path $Path -Leaf
    		$FileDirectory = Get-ProjectRoot -Path $Path
    		if ($fileName.StartsWith("web", "CurrentCultureIgnoreCase")){
    			Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $FileDirectory.Length -IntermediatePath "\Areas\Demo"		
    		}
    	}
    } )
    
    $global:FileWatchActions.Add(".dll", {
    	param(
    		$Path,
    		$OldPath,
    		$Delete
    	)
    	$index = $Path.IndexOf("\bin", 5)
    	Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index	
    } )
    
    $global:FileWatchActions.Add("folder", {
    	param(
    		$Path,
    		$OldPath,
    		$Delete
    	)
    	if (-Not( $delete -or $OldPath -ne [string]::Empty)){
    		return
    	}
    	$index = $Path.IndexOf("\Views", 5)
    	if ($index -ne -1){
    		Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index -IntermediatePath "\Areas\Demo"
    		return		
    	}
    	$index = $Path.IndexOf("\App_Config\Include", 5)
    	if ($index -ne -1){
    		Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index -IntermediatePath "\App_Config\Include"
    		return		
    	}
    })
    
    

    Then we set up the file watchers to watch the important parts of our code:

    $global:LastEvent = ((Get-Date).AddSeconds(2).ToString('HH:mm:ss.fff'))
    function global:Send-ChangesToWebroot{
    	param(
    	[string]$Path = [string]::Empty,
    	[string]$OldPath = [string]::Empty,
    	[bool]$Delete = $false
    	)
    	$extension = [IO.Path]::GetExtension($Path)
    	$IsDirectory = $false
    	if (Test-Path $Path){
    		$IsDirectory= (Get-Item -Path $Path) -is [System.IO.DirectoryInfo]
    	}elseif ($Delete -and $extension -eq [string]::Empty){
    		$IsDirectory = $true;
    	}
    	try{
    		if (-Not $IsDirectory -and $global:FileWatchActions.ContainsKey($extension)){
    			$global:LastEvent = ((Get-Date).AddSeconds(2).ToString('HH:mm:ss.fff'))
    			$global:FileWatchActions.Get_Item($extension).Invoke($Path, $OldPath, $Delete)
    		}elseif ($IsDirectory){
    			$global:LastEvent = ((Get-Date).AddSeconds(2).ToString('HH:mm:ss.fff'))
    			$global:FileWatchActions.Get_Item("folder").Invoke($Path, $OldPath, $Delete)
    		}
    	}catch [System.Exception]{
    		Write-Host "An error has occurred while attempting to run the processor for $extension" -ForegroundColor Red
    		Write-Host "Path: $Path" -ForegroundColor Red
    		Write-Host "OldPath: $OldPath" -ForegroundColor Red
    		Write-Host $_.Exception.ToString() -ForegroundColor Red
    	}
    }
    function Add-Watcher{
    	param(
    		$Directory
    	)
    	$Watcher = New-Object IO.FileSystemWatcher $Directory, "*" -Property @{IncludeSubdirectories = $true;NotifyFilter = [IO.NotifyFilters]'FileName, DirectoryName, LastWrite, Size'}
    	
    	Register-ObjectEvent $Watcher Changed -SourceIdentifier "$Directory FileChanged" -Action {Send-ChangesToWebroot -Path $Event.SourceEventArgs.FullPath}
    	
    	Register-ObjectEvent $Watcher Renamed -SourceIdentifier "$Directory FileRenamed" -Action {Send-ChangesToWebroot -Path $Event.SourceEventArgs.FullPath -OldPath $Event.SourceEventArgs.OldFullPath}
    	
    	Register-ObjectEvent $Watcher Deleted -SourceIdentifier "$Directory FileDeleted" -Action {Send-ChangesToWebroot -Path $Event.SourceEventArgs.FullPath -Delete $true}
    	
    	Register-ObjectEvent $Watcher Created -SourceIdentifier "$Directory FileCreated" -Action {Send-ChangesToWebroot -Path $Event.SourceEventArgs.FullPath}
    	
    	$Watcher.EnableRaisingEvents = $true
    }
    Resolve-Path "$SourceDirectory/*/App_Config/Include" | ForEach-Object{
    	Write-Host "Adding watch location: $_" -ForegroundColor Yellow
    	Add-Watcher $_ | Out-Null
    }
    
    Resolve-Path "$SourceDirectory/*/Views" | ForEach-Object{
    	Write-Host "Adding watch location: $_" -ForegroundColor Yellow	
    	Add-Watcher $_ | Out-Null
    }
    
    Resolve-Path "$SourceDirectory/*/bin" | ForEach-Object{
    	Write-Host "Adding watch location: $_" -ForegroundColor Yellow
    	Add-Watcher $_ | Out-Null
    }
    
    Resolve-Path "$SourceDirectory/*/Assets" | ForEach-Object {
    	Write-Host "Adding watch location: $_" -ForegroundColor Yellow
    	Add-Watcher $_ | Out-Null
    }
    
    Write-Host ([string]::Empty)
    Write-Host "Now watching for changes made in the repo." -ForegroundColor Yellow
    Write-Host "Any changes made will be delivered to the Webroot automatically" -ForegroundColor Yellow
    Write-Host "***************************************************************" -ForegroundColor Yellow
    while($true){
    	#sleep more quickly when changes are happening
    	if ($global:LastEvent -gt ((Get-Date).ToString('HH:mm:ss.fff'))){
    		Start-Sleep -m 5
    	}else{
    		Start-Sleep 1
    	}
    }