SXA Tenant Specific Field Validation

SXA includes a large number of extremely helpful modules, and it would be wise to utilize as many as you can. However, a problem arises when you have multiple tenants or sites that want to apply different field validation rules to fields that come from a shared SXA base template.

An extra big thanks to @Sitecorey for all the help in working through this problem.

By default, validation rules are applied on the template field item under the template. This means that every template inheriting yours automatically gets those validation rules applied as well.

Installation instructions and full source

Download the Sitecore package

My solution is to optionally pull the validation rule definitions out of the template field and into a global library of items that map a template to a template field. The field can be defined on the template itself or on any of its base templates, at any depth of inheritance.

(Diagram: global field validator definitions)

An Example

I have two tenants, both using the SEO Metadata module to get keywords and page description fields on their page template. Using this technique I was able to have one tenant define a 125-character limit while the other tenant has no validation at all. This was done by specifying the base template for pages as the template target and the SXA meta description field as the field. Even though the template doesn’t directly define this field, we’re still able to apply validation to it.
(Screenshot: global field validation rule item)

How it’s done

The magic is done by overriding the default validator manager and adding functionality on top of it. Essentially, we augment the default behavior of the validator by also looking into the library of global validators defined in the settings section of our SXA site. To do that we follow a few steps:

  1. Get the applicable site’s global field validator definitions root.
  2. Grab all the validator definitions from under that root.
  3. Build validators for all the definitions.
  4. Return those validators.
using System;
using System.Collections.Generic;
using System.Linq;
using Sitecore.Abstractions;
using Sitecore.CodeDom.Scripts;
using Sitecore.Data.Fields;
using Sitecore.Data.Items;
using Sitecore.Data.Validators;
using Sitecore.XA.Foundation.Multisite;

namespace JeffDarchuk.Foundation.ContentValidation
{
	public class GlobalFieldValidatorManager : DefaultValidatorManager
	{
		private readonly IMultisiteContext _multisiteContext;
		private readonly BaseTemplateManager _templateManager;

		public GlobalFieldValidatorManager(BaseItemScripts itemScripts, IMultisiteContext multisiteContext, BaseTemplateManager templateManager) : base(itemScripts)
		{
			_multisiteContext = multisiteContext ?? throw new ArgumentNullException(nameof(multisiteContext));
			_templateManager = templateManager ?? throw new ArgumentNullException(nameof(templateManager));
		}

		public override ValidatorCollection BuildValidators(ValidatorsMode mode, Item item)
		{
			var validators = base.BuildValidators(mode, item);
			var globalFieldRulesFolder = GetGlobalFieldRulesFolder(item);
			if (globalFieldRulesFolder == null) return validators;
			foreach (var validator in GetAdditionalValidators(item, globalFieldRulesFolder, mode))
			{
				validators.Add(validator);
			}
			return validators;
		}

		private Item GetGlobalFieldRulesFolder(Item item)
		{
			return _multisiteContext.GetSettingsItem(item)?.Children.FirstOrDefault(x =>
				x.TemplateID.ToString() == Templates.GlobalFieldRuleFolder.Id);
		}

		private IEnumerable<BaseValidator> GetAdditionalValidators(Item item, Item globalFieldRulesFolder, ValidatorsMode mode)
		{
			var baseTemplates = new HashSet<string>(_templateManager.GetTemplate(item).GetBaseTemplates().Select(x => x.ID.ToString()));
			foreach (var globalFieldRule in GetGlobalFieldRules(globalFieldRulesFolder))
			{
				var template = globalFieldRule[Templates.GlobalFieldRule.Fields.Template];
				if (!FieldRuleAppliesToItem(item, globalFieldRule, template, baseTemplates)) continue;
				MultilistField validators = null;
				switch (mode)
				{
					case ValidatorsMode.Gutter:
						validators = globalFieldRule.Fields[Templates.FieldTypeValidationRules.Fields.QuickValidationBar];
						break;
					case ValidatorsMode.ValidateButton:
						validators = globalFieldRule.Fields[Templates.FieldTypeValidationRules.Fields.ValidateButton];
						break;
					case ValidatorsMode.ValidatorBar:
						validators = globalFieldRule.Fields[Templates.FieldTypeValidationRules.Fields.ValidatorBar];
						break;
					case ValidatorsMode.Workflow:
						validators = globalFieldRule.Fields[Templates.FieldTypeValidationRules.Fields.Workflow];
						break;
				}
				foreach (var validator in validators?.GetItems() ?? Enumerable.Empty<Item>())
				{
					var baseValidator = BuildValidator(validator, item);
					baseValidator.FieldID = item.Fields[globalFieldRule[Templates.GlobalFieldRule.Fields.Field]].ID;
					yield return baseValidator;
				}
			}
		}

		private IEnumerable<Item> GetGlobalFieldRules(Item globalFieldRulesFolder)
		{
			return globalFieldRulesFolder.Axes.GetDescendants().Where(x => x.TemplateID.ToString() == Templates.GlobalFieldRule.Id);
		}

		private bool FieldRuleAppliesToItem(Item item, Item globalFieldRule, string template, HashSet<string> baseTemplates)
		{
			var useInheritedTemplates = ((CheckboxField)globalFieldRule.Fields[Templates.GlobalFieldRule.Fields.ApplyToInheritedTemplates]).Checked;
			return item.TemplateID.ToString() == template || useInheritedTemplates && baseTemplates.Contains(template);
		}
	}
}
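
The code above references a Templates constants class holding the item and field identifiers for the global field rule items. Its exact shape depends on your own items; it is just a set of constants along these lines (the GUIDs and field names below are placeholders, not the values from the actual package):

namespace JeffDarchuk.Foundation.ContentValidation
{
	// Placeholder IDs and field names for illustration only; substitute the values from your own templates.
	public static class Templates
	{
		public static class GlobalFieldRuleFolder
		{
			public const string Id = "{11111111-1111-1111-1111-111111111111}";
		}

		public static class GlobalFieldRule
		{
			public const string Id = "{22222222-2222-2222-2222-222222222222}";

			public static class Fields
			{
				public const string Template = "Template";
				public const string Field = "Field";
				public const string ApplyToInheritedTemplates = "Apply To Inherited Templates";
			}
		}

		public static class FieldTypeValidationRules
		{
			public static class Fields
			{
				public const string QuickValidationBar = "Quick Action Bar";
				public const string ValidateButton = "Validate Button";
				public const string ValidatorBar = "Validator Bar";
				public const string Workflow = "Workflow";
			}
		}
	}
}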

The end result is that we get to define validation rules against whatever template we wish, even if that template doesn’t directly define the field but only inherits it.
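
To get Sitecore to actually use the new manager, it also needs to be registered with the dependency injection container in place of the default implementation. A minimal sketch, assuming the stock DefaultValidatorManager is registered against Sitecore.Abstractions.BaseValidatorManager (verify this against your Sitecore version), is a services configurator like the following, hooked up with a <configurator> entry under <sitecore><services> in a config patch:

using Microsoft.Extensions.DependencyInjection;
using Sitecore.Abstractions;
using Sitecore.DependencyInjection;

namespace JeffDarchuk.Foundation.ContentValidation
{
	// Assumption: BaseValidatorManager is the service type the stock manager is registered under.
	// Registering our implementation lets it be resolved in place of DefaultValidatorManager.
	public class ContentValidationServicesConfigurator : IServicesConfigurator
	{
		public void Configure(IServiceCollection serviceCollection)
		{
			serviceCollection.AddSingleton<BaseValidatorManager, GlobalFieldValidatorManager>();
		}
	}
}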

Ensure all projects use the same NuGet versions

If you’re using NuGet packages in a Sitecore Helix solution, it’s very easy to inadvertently reference different package versions across Helix layers. This can cause strange, hard-to-diagnose issues. With a little validation, however, we can catch the problem before it ever hits your webroot. By adding a small PowerShell validation function to your publish process, we get clear, concise output describing exactly what’s wrong.

function NuGetPackageValidation {
	param(
		[string]$solutionPath
	)
	write-host "Beginning Nuget validation."
	$tracker = @{}
	Get-ChildItem (split-path $solutionPath) -recurse packages.config | ForEach-Object {
	  $fullFileName = $_.FullName
	  $csProjName = Split-Path (Resolve-Path "$(Split-Path $fullFileName)\*.csproj").Path -Leaf
	  [xml]$curConfigFile = Get-Content $fullFilename
	  $curConfigFile.packages.package | ForEach-Object {
		  if ($null -eq $tracker[$_.id]){
			$tracker[$_.id] = @{
				id = $_.id
				versions = @{}
				}
			$tracker[$_.id].versions[$_.version] = @{
				version = $_.version
				project = @($csProjName)
				}
			}
			elseif ($null -eq $tracker[$_.id].versions[$_.version]){
				$tracker[$_.id].versions[$_.version] = @{
					version = $_.version
					project = @($csProjName)
					}
			}else{
				$tracker[$_.id].versions[$_.version].project += $csProjName
			}
	  }
	}
	$ret = $true
	$tracker.Keys | ForEach-Object {
		if ($tracker[$_].versions.Count -gt 1){
			if ($ret){
				Write-Host "Problems found with Nuget packages, ensure that the same Nuget package versions are used across projects."
			}
			$ret = $false
			Write-Host @"
----------------------------------
  $_
----------------------------------
"@
			$versions = $tracker[$_].versions
			$versions.Keys | ForEach-Object{
				Write-Host $versions[$_].version
				$versions[$_].project | ForEach-Object {
					Write-Host "      $_"
				}
				Write-Host ""
			}
		}
	}
	return $ret
}
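
A minimal way to wire this into a publish or build script might look like the following; the solution path and failure behavior here are illustrative, not part of the original script:

$solutionPath = "D:\Source\MySitecoreSolution\MySolution.sln"
if (-not (NuGetPackageValidation -solutionPath $solutionPath)) {
	# Halt the publish so the mismatch gets fixed before it reaches the webroot
	throw "NuGet package version mismatch detected; see the output above for details."
}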

When mismatches are found, the function prints detailed output showing exactly which projects are involved. You can expect to see results like this:

(Screenshot: NuGet validation output)

Using this data, you can easily track down and fix anomalies before they become hard-to-diagnose problems.

Transform configs on Azure

It can be a powerful tool to give your release process the ability to manipulate configuration files on the server that aren’t directly controlled by your source code.

The primary use cases I’ve had for this are re-wiring connection strings on the fly at deployment time, and deploying the stock Sitecore web.config untouched while letting the release process manage it through remote transforms. Whatever your DevOps constraints are, this technique can come in handy.

The technique is fairly simple:

  • Connect to Azure (here we’re using a service principal)
  • Generate Kudu credentials from the app service’s publishing credentials
  • Download the XML file from the app service
  • Transform the file using XDTs and, optionally, tokens
  • Upload the XML file back to the app service
  • Note: this requires loading Microsoft.Web.XmlTransform.dll, so make sure that assembly is available alongside the script.

    param(
        [string]$KuduPath,
        [string[]]$XDTs,
        [string]$TenantId,
        [string]$SubscriptionId,
        [string]$ResourceGroupName,
        [string]$WebAppServiceName,
        [string]$SlotName = "",
        [string]$ServicePrincipalID,
        [string]$ServicePrincipalKey,
        [hashtable]$Tokens
    )
    
    function Get-AzureRmWebAppPublishingCredentials($resourceGroupName, $webAppName, $slotName = $null){
    	if ([string]::IsNullOrWhiteSpace($slotName)){
    		$resourceType = "Microsoft.Web/sites/config"
    		$resourceName = "$webAppName/publishingcredentials"
    	}
    	else{
    		$resourceType = "Microsoft.Web/sites/slots/config"
    		$resourceName = "$webAppName/$slotName/publishingcredentials"
    	}
    	$publishingCredentials = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroupName -ResourceType $resourceType -ResourceName $resourceName -Action list -ApiVersion 2015-08-01 -Force
    	return $publishingCredentials
    }
    
    function Get-KuduApiAuthorisationHeaderValue($resourceGroupName, $webAppName, $slotName = $null){
        $publishingCredentials = Get-AzureRmWebAppPublishingCredentials $resourceGroupName $webAppName $slotName
        $ret = @{}
        $ret.header = ("Basic {0}" -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $publishingCredentials.Properties.PublishingUserName, $publishingCredentials.Properties.PublishingPassword))))
        $ret.url = $publishingCredentials.Properties.scmUri
        return $ret
    }
    
    function Get-FileFromWebApp($resourceGroupName, $webAppName, $slotName = "", $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"
    
        Write-Host " Downloading File from WebApp. Source: '$kuduApiUrl'." -ForegroundColor DarkGray
        $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xml"
        $null = Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method GET `
                            -ContentType "multipart/form-data" `
                            -OutFile $tmpPath
        $ret = Get-Content $tmpPath | Out-String
        Remove-Item $tmpPath -Force
        return $ret
    }
    function Write-FileToWebApp($resourceGroupName, $webAppName, $slotName = "", $fileContent, $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"
    
        Write-Host " Writing File to WebApp. Destination: '$kuduApiUrl'." -ForegroundColor DarkGray
    
        Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method Put `
                            -ContentType "multipart/form-data"`
                            -Body $fileContent
    }
    function Update-XmlDocTransform($xml, $xdt, $tokens)
    {
        Add-Type -LiteralPath "$PSScriptRoot\Microsoft.Web.XmlTransform.dll"
    
        $xmldoc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument;
        $xmldoc.PreserveWhitespace = $true
        $xmldoc.LoadXml($xml);
        $useTokens = $false
        if ($tokens -ne $null -and $tokens.Count -gt 0){
            $useTokens = $true
            $sb = [System.Text.StringBuilder]::new((Get-Content -Path $xdt))
            $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xdt"
            $tokens.Keys | ForEach-Object{
                $null = $sb.Replace($_, $tokens[$_])
            }
            Set-Content -Path $tmpPath -Value $sb.ToString()
            $xdt = $tmpPath
        }
        
        $transf = New-Object Microsoft.Web.XmlTransform.XmlTransformation($xdt);
        if ($transf.Apply($xmldoc) -eq $false)
        {
            throw "Transformation failed."
        }
        if ($useTokens){
            Remove-Item -Path $xdt -Force
        }
        return $xmldoc.OuterXml
    }
    $Credential = New-Object -TypeName PSCredential($ServicePrincipalID, (ConvertTo-SecureString -String $ServicePrincipalKey -AsPlainText -Force))
    
    
    # Connect to Azure using SP
    $connectParameters = @{
        Credential     = $Credential
        TenantId       = $TenantId
        SubscriptionId = $SubscriptionId
    }
    
    Write-Host 'Connecting to Azure.'
    
    $null = Add-AzureRmAccount @connectParameters -ServicePrincipal
    
    $contents = Get-FileFromWebApp `
        -resourceGroupName $ResourceGroupName `
        -webAppName $WebAppServiceName `
        -slotName $SlotName `
        -kuduPath $KuduPath
    $XDTs | Foreach-Object{
        $contents = Update-XmlDocTransform -xml $contents -xdt "$PSScriptRoot\XDT\$_.xdt" -tokens $Tokens
    }
    Write-FileToWebApp `
        -resourceGroupName $ResourceGroupName `
        -webAppName $WebAppServiceName `
        -slotName $SlotName `
        -kuduPath $KuduPath `
        -fileContent $contents
    

    Additionally, with some simple adaptations of this code, you can use Kudu to perform whatever other kinds of file manipulation you need using a similar technique.
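
    For reference, an invocation of the script above might look something like this (the script name, identifiers, and token values are hypothetical):

    .\RemoteConfigTransform.ps1 `
        -KuduPath "App_Config/ConnectionStrings.config" `
        -XDTs "MyTransform" `
        -TenantId "00000000-0000-0000-0000-000000000000" `
        -SubscriptionId "00000000-0000-0000-0000-000000000000" `
        -ResourceGroupName "my-sitecore-rg" `
        -WebAppServiceName "my-sitecore-cd" `
        -ServicePrincipalID "00000000-0000-0000-0000-000000000000" `
        -ServicePrincipalKey $env:ServicePrincipalKey `
        -Tokens @{_MyToken_="some-value"}

    Note that the script resolves each entry in -XDTs to "$PSScriptRoot\XDT\<name>.xdt", so the XDT files are expected to live in an XDT folder next to the script.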

    Tokenize your XDTs with Powershell

    XDTs have gotten a bad rap over the years for being difficult to use and hard to understand. Despite that, they’re still the most reliable and consistent way to transform configurations. I’ve come up with a way to tokenize XDTs so they can be used much more flexibly.

    For example, say we have different cookie domains per environment that we want to patch in and out.

    Note: this code requires Microsoft.Web.XmlTransform.dll to be in the same folder as the PowerShell script.

    param(
        [string]$Path,
        [string[]]$XDTs,
        [hashtable]$Tokens
    )
    
    function Update-XmlDocTransform($xml, $xdt, $tokens)
    {
        Add-Type -LiteralPath "$PSScriptRoot\Microsoft.Web.XmlTransform.dll"
    
        $xmldoc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument;
        $xmldoc.PreserveWhitespace = $true
        $xmldoc.LoadXml($xml);
        $useTokens = $false
        if ($tokens -ne $null -and $tokens.Count -gt 0){
            $useTokens = $true
            $sb = [System.Text.StringBuilder]::new((Get-Content -Path $xdt))
            $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xdt"
            $tokens.Keys | ForEach-Object{
                $null = $sb.Replace($_, $tokens[$_])
            }
            Set-Content -Path $tmpPath -Value $sb.ToString()
            $xdt = $tmpPath
        }
        
        $transf = New-Object Microsoft.Web.XmlTransform.XmlTransformation($xdt);
        if ($transf.Apply($xmldoc) -eq $false)
        {
            throw "Transformation failed."
        }
        if ($useTokens){
            Remove-Item -Path $xdt -Force
        }
        return $xmldoc.OuterXml
    }
    
    $contents = Get-Content $Path | Out-String
    $XDTs | Foreach-Object{
        $contents = Update-XmlDocTransform -xml $contents -xdt $_ -tokens $Tokens
    }
    Set-Content $path -Value $contents
    

    Here is an example usage:

    LocalXmlTransform.ps1 -Path "C:\inetpub\wwwroot\sc901.local" -XDTs "C:\xdt\AddBindingRedirects.xdt","C:\xdt\AddSessionCookie" -Tokens @{_ShareSessionCookie_="mysite.local";_RedirectName_="mydependency"}

    In this example we’re running two XDT files against the web.config and replacing a couple of tokens in those XDTs.
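
    To tie this back to the cookie-domain example, a hypothetical AddSessionCookie.xdt might set the cookie domain through the httpCookies element, with the _ShareSessionCookie_ token swapped in before the transform runs (the element and attribute chosen here are illustrative):

    <?xml version="1.0" encoding="utf-8"?>
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <system.web>
        <httpCookies domain="_ShareSessionCookie_" xdt:Transform="SetAttributes(domain)" />
      </system.web>
    </configuration>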

    Here is an example of an XDT with tokens to ensure a connection string exists:

    <?xml version="1.0" encoding="utf-8"?>
    <connectionStrings xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform" xmlns:asmv1="urn:schemas-microsoft-com:asm.v1">
      <add name="_name_" xdt:Transform="Remove" xdt:Locator="Match(name)" />
      <add name="_name_" xdt:Transform="InsertIfMissing" xdt:Locator="Match(name)" connectionString="Encrypt=True;TrustServerCertificate=False;Data Source=_fqdn_;Initial Catalog=_databasename_;User Id=_username_;Password=_password_;" />
    </connectionStrings>
    

    To use this XDT your parameters would look something like this:

    LocalXmlTransform.ps1 -Path "C:\inetpub\wwwroot\App_Config\ConnectionStrings.config" -XDTs "C:\xdt\EnsureConnectionString.xdt" -Tokens @{_name_="mySpecialDatabase";_fqdn_="myazurestuff.database.windows.net,1433";_databasename_="specialdatabase";_username_="secretuser";_password_="secretpassword"}

    Hopefully this helps your DevOps process.

    PAAS Sitecore 9 with an ASE ARM template errors

    Sitecore 9 works great in PaaS, and the ARM templates are an enormous help. However, if you’re like me and need to use an ASE, you’ll find that your deployments regularly and mysteriously fail. I pored over the ARM templates searching for any reason this might be happening. After about a month I accepted the unfortunate truth: Azure was incorrectly reporting success before it should have.

    I started pulling apart the templates in search of more information, using a custom PowerShell wrapper to manage the ARM template parameters.

    The errors originated from the application deployments: the parts that use Web Deploy to restore databases, create users, and push files to your Sitecore server.

    How to stabilize the ARM templates

    For this I will assume that you’ve already added the hostingEnvironmentProfile parameter to the Microsoft.Web/sites ARM resources.
    Warning: this process is very time consuming.
    The first step is to pull the templates apart. I was able to achieve a high success rate by doing the following:

    1. Take the main azuredeploy.json and remove all of the resources; we’re going to execute them manually.
    2. In each of the nested ARM template JSON files, make sure the required parameters are defined in the parameters and variables sections; you can refer back to azuredeploy.json for how these should be set up.
    3. The application.json file is the primary culprit behind our failures. We need to split it up just as we did azuredeploy.json, except this time we’re creating a new ARM template JSON file for each of the four Web Deploy deployments that reside in application.json.
    4. Now that we have the ARM templates separated into their individual parts, we need to create a new PowerShell wrapper for the process.
    5. Note: for security reasons I’m largely omitting anything of a sensitive nature here. Make sure you supply usernames and passwords to your input parameters, either in a parameters.json or in the PowerShell parameters hashtable described below.

      Powershell Magic

      You can find the scripts here.

      Since we no longer have a central ARM template to orchestrate parameters, we need to do that ourselves. This comes in a few steps:

      1. Populate all starting parameters in a hashtable; see Execute.ps1 for an example. Note that you will need to pass in several more parameters, or you can include them in a parameters.json that’s loaded here.
      2. Scan the ARM templates and gather the parameters each one accepts, since they won’t take any extras. See Get-ValidParameters in Utilities.ps1, and the sketch after this list.
      3. For each ARM template, gather up the parameters needed for that deployment and generate a new hashtable of parameter names and values. See Get-Parameters in Utilities.ps1.
      4. Execute the ARM template using a modified version of Sitecore’s ARM template execution code. See Start-SitecoreAzureDeployment in Utilities.ps1.
      5. After completion, extract the populated parameters and outputs and save them using Get-ValidParameters from Utilities.ps1.
      6. Repeat until finished. You can see how the ARM templates are ordered here.
        Note: depending on your specific case, you may need to add some delay between deployments if some of them need more time to settle.
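
      As a rough illustration of steps 2 and 3, the parameter filtering can be as simple as reading a template’s parameters node and trimming the master hashtable down to only what that template accepts. The function and variable names below are hypothetical, not the actual Utilities.ps1 implementation:

      function Get-TemplateParameters {
          param(
              [string]$TemplatePath,    # path to one of the split-out ARM templates
              [hashtable]$AllParameters # master hashtable of every known parameter value
          )
          # Read the template and list the parameter names it declares
          $template = Get-Content -Path $TemplatePath -Raw | ConvertFrom-Json
          $accepted = $template.parameters.PSObject.Properties.Name

          # Keep only the values this template will actually accept
          $filtered = @{}
          foreach ($name in $accepted) {
              if ($AllParameters.ContainsKey($name)) {
                  $filtered[$name] = $AllParameters[$name]
              }
          }
          return $filtered
      }

      The filtered hashtable can then be handed to the deployment call, for example via New-AzureRmResourceGroupDeployment’s -TemplateParameterObject parameter or the Start-SitecoreAzureDeployment wrapper.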

    Sitecore Helix Powershell Filewatch

    Most likely, if you’re developing with Sitecore, you have your webroot and your source separated and publish to your site with Web Deploy or some other publishing technology. That’s a fine way to do it, but it’s far easier if it just happens automatically. That’s what these scripts aim to do with PowerShell!


    Here is a zip file of the completed solution covered below: HelixFileWatcher. Or check out the source on GitHub.

    First we need to define a few parameters:

    #where your solution is
    $SourceDirectory = "D:\Source\SitecoreSource\SolutionRoot"
    #where your webroot is
    $DeployTargetWebPath = "C:\inetpub\wwwroot\sc90.local"
    
    

    Next we define how files are moved from your solution to the webroot. This is done through a hashtable mapping a file extension to a script block. Note that the views are being deployed to a "Demo" MVC Area as an example.

    $global:FileWatchActions = @{}
    function Get-ProjectRoot{
    	param(
    		[string]$Path
    	)
    	if ($path -eq [string]::Empty){
    		return [string]::Empty
    	}
    	if (-Not (Test-Path $Path)){
    		return Get-ProjectRoot -Path (split-Path $Path)
    	}
    	$PathItem = Get-Item -Path $Path
    	if (-Not ($PathItem -is [System.IO.DirectoryInfo])){
    		return Get-ProjectRoot -Path (Split-Path $Path)
    	}
    	if ((resolve-path "$Path\*.csproj").Count -gt 0){
    		return $Path
    	}elseif($PathItem.Parent -ne $null){
    		return Get-ProjectRoot -Path $PathItem.Parent.FullName
    	}
    	return [string]::Empty
    }
    function Copy-ItemToWebroot{
    	param(
    		$Path,
    		$OldPath,
    		$Delete,
    		$Index,
    		$IntermediatePath
    	)
    	if ($Index -lt 0){
    		return
    	}
    	
    	$TargetPath = $DeployTargetWebPath + $IntermediatePath + $Path.Substring($Index)
    	if ($Delete -and (Test-Path $TargetPath)){
    		write-host "Removing file $TargetPath" -ForegroundColor Red
    		Remove-Item $TargetPath -Force -Recurse
    	}elseif (-Not (Test-Path $Path) -and (Test-Path $TargetPath)){
    		write-host "Removing file $TargetPath" -ForegroundColor Red
    		Remove-Item $TargetPath -Force -Recurse
    	}elseif(Test-Path $Path){
    		if ($OldPath -ne [string]::Empty){
    			$OldTargetPath = $DeployTargetWebPath + $IntermediatePath + $OldPath.Substring($Index)
    			if ((Test-Path $OldTargetPath) -and ((Split-Path $Path) -eq (Split-Path $OldPath) )){
    				$newName = Split-Path $Path -Leaf -Resolve
    				write-host "Renaming Item" -ForegroundColor Yellow
    				write-host "    $OldTargetPath" -ForegroundColor Yellow
    				write-host "    =>$TargetPath" -ForegroundColor Yellow
    				Rename-Item $OldTargetPath $newName -Force
    				return
    			}
    		}
    		if (-Not (Test-Path $TargetPath) -or (Compare-Object (ls $Path) (ls $TargetPath) -Property Name, Length, LastWriteTime)){
    			write-host "Copying Item" -ForegroundColor Green
    			write-host "    $Path" -ForegroundColor Green
    			write-host "    =>$TargetPath" -ForegroundColor Green
    			New-Item -Path "$(Split-Path $TargetPath)" -ItemType Directory -Force
    			Copy-Item -Path $Path -Destination $TargetPath -Recurse -Force
    		}
    	}
    }
    
    #Add watcher action configurations
    #Based on extension define how to process the files that are changed
    $global:FileWatchActions.Add(".cshtml", {
    	param(
    		$Path,
    		$OldPath,
    		$Delete
    	)
    	$index = $Path.IndexOf("\Views", 5)
    	Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index -IntermediatePath "\Areas\Demo"
    } )
    
    $global:FileWatchActions.Add(".config", {
    	param(
    		$Path,
    		$OldPath,
    		$Delete
    	)
    	$index = $Path.IndexOf("\App_Config\Include", 5)
    	Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index
    	if ($index -eq -1){
    		$fileName = Split-Path $Path -Leaf
    		$FileDirectory = Get-ProjectRoot -Path $Path
    		if ($fileName.StartsWith("web", "CurrentCultureIgnoreCase")){
    			Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $FileDirectory.Length -IntermediatePath "\Areas\Demo"		
    		}
    	}
    } )
    
    $global:FileWatchActions.Add(".dll", {
    	param(
    		$Path,
    		$OldPath,
    		$Delete
    	)
    	$index = $Path.IndexOf("\bin", 5)
    	Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index	
    } )
    
    $global:FileWatchActions.Add("folder", {
    	param(
    		$Path,
    		$OldPath,
    		$Delete
    	)
    	if (-Not( $delete -or $OldPath -ne [string]::Empty)){
    		return
    	}
    	$index = $Path.IndexOf("\Views", 5)
    	if ($index -ne -1){
    		Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index -IntermediatePath "\Areas\Demo"
    		return		
    	}
    	$index = $Path.IndexOf("\App_Config\Include", 5)
    	if ($index -ne -1){
    		Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index -IntermediatePath "\App_Config\Include"
    		return		
    	}
    })
    
    

    Then we set up the file watchers to watch the important parts of our code:

    $global:LastEvent = ((Get-Date).AddSeconds(2).ToString('HH:mm:ss.fff'))
    function global:Send-ChangesToWebroot{
    	param(
    	[string]$Path = [string]::Empty,
    	[string]$OldPath = [string]::Empty,
    	[bool]$Delete = $false
    	)
    	$extension = [IO.Path]::GetExtension($Path)
    	$IsDirectory = $false
    	if (Test-Path $Path){
    		$IsDirectory= (Get-Item -Path $Path) -is [System.IO.DirectoryInfo]
    	}elseif ($Delete -and $extension -eq [string]::Empty){
    		$IsDirectory = $true;
    	}
    	try{
    		if (-Not $IsDirectory -and $global:FileWatchActions.ContainsKey($extension)){
    			$global:LastEvent = ((Get-Date).AddSeconds(2).ToString('HH:mm:ss.fff'))
    			$global:FileWatchActions.Get_Item($extension).Invoke($Path, $OldPath, $Delete)
    		}elseif ($IsDirectory){
    			$global:LastEvent = ((Get-Date).AddSeconds(2).ToString('HH:mm:ss.fff'))
    			$global:FileWatchActions.Get_Item("folder").Invoke($Path, $OldPath, $Delete)
    		}
    	}catch [System.Exception]{
    		Write-Host "An error has occurred while attempting to run the processor for $extension" -ForegroundColor Red
    		Write-Host "Path: $Path" -ForegroundColor Red
    		Write-Host "OldPath: $OldPath" -ForegroundColor Red
    		Write-Host $_.Exception.ToString() -ForegroundColor Red
    	}
    }
    function Add-Watcher{
    	param(
    		$Directory
    	)
    	$Watcher = New-Object IO.FileSystemWatcher $Directory, "*" -Property @{IncludeSubdirectories = $true;NotifyFilter = [IO.NotifyFilters]'FileName, DirectoryName, LastWrite, Size'}
    	
    	Register-ObjectEvent $Watcher Changed -SourceIdentifier "$Directory FileChanged" -Action {Send-ChangesToWebroot -Path $Event.SourceEventArgs.FullPath}
    	
    	Register-ObjectEvent $Watcher Renamed -SourceIdentifier "$Directory FileRenamed" -Action {Send-ChangesToWebroot -Path $Event.SourceEventArgs.FullPath -OldPath $Event.SourceEventArgs.OldFullPath}
    	
    	Register-ObjectEvent $Watcher Deleted -SourceIdentifier "$Directory FileDeleted" -Action {Send-ChangesToWebroot -Path $Event.SourceEventArgs.FullPath -Delete $true}
    	
    	Register-ObjectEvent $Watcher Created -SourceIdentifier "$Directory FileCreated" -Action {Send-ChangesToWebroot -Path $Event.SourceEventArgs.FullPath}
    	
    	$Watcher.EnableRaisingEvents = $true
    }
    Resolve-Path "$SourceDirectory/*/App_Config/Include" | ForEach-Object{
    	Write-Host "Adding watch location: $_" -ForegroundColor Yellow
    	Add-Watcher $_ | Out-Null
    }
    
    Resolve-Path "$SourceDirectory/*/Views" | ForEach-Object{
    	Write-Host "Adding watch location: $_" -ForegroundColor Yellow	
    	Add-Watcher $_ | Out-Null
    }
    
    Resolve-Path "$SourceDirectory/*/bin" | ForEach-Object{
    	Write-Host "Adding watch location: $_" -ForegroundColor Yellow
    	Add-Watcher $_ | Out-Null
    }
    
    Resolve-Path "$SourceDirectory/*/Assets" | ForEach-Object {
    	Write-Host "Adding watch location: $_" -ForegroundColor Yellow
    	Add-Watcher $_ | Out-Null
    }
    
    Write-Host ([string]::Empty)
    Write-Host "Now watching for changes made in the repo." -ForegroundColor Yellow
    Write-Host "Any changes made will be delivered to the Webroot automatically" -ForegroundColor Yellow
    Write-Host "***************************************************************" -ForegroundColor Yellow
    while($true){
    	#sleep more quickly when changes are happening
    	if ($global:LastEvent -gt ((Get-Date).ToString('HH:mm:ss.fff'))){
    		Start-Sleep -m 5
    	}else{
    		Start-Sleep 1
    	}
    }
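
    One thing to keep in mind: if you stop the script with Ctrl+C, the event subscriptions created by Register-ObjectEvent stay registered in the current PowerShell session. A quick way to clean them up before re-running the script is:

    Get-EventSubscriber | Unregister-Event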
    
    

    Unicorn Automated Operations Authentication

    I was attempting to set up some PowerShell scripts to deploy Unicorn content, and I ran into a strange issue where the Unicorn PowerShell module’s sync request was getting redirected to the Sitecore login page.

    (Screenshot: Unicorn sync output showing the redirect to the Sitecore login page)

    Through some debugging I was able to track this down to the fact that another developer had checked in a different shared secret. Once the shared secret was fixed, it was smooth sailing.

    (Screenshot: Unicorn sync completing successfully)

    Really, the takeaway is: always double-check your shared secrets whenever there’s a problem with an automated tool.