Transform configs on Azure

Giving your release process the ability to manipulate configuration files on the server that aren’t directly controlled by your source code can be a powerful tool.

The primary use cases I’ve used this for are re-wiring connection strings on the fly at deployment time, or deploying the stock Sitecore web.config untouched and letting the release process manage it through remote transforms. Whatever your DevOps constraints are, this technique can come in handy.

The technique is fairly simple:

  • Connect to Azure (here we’re using a service principal)
  • Generate Kudu credentials from the publishing credentials
  • Download the XML file
  • Transform the file using XDTs and, optionally, tokens
  • Upload the XML file back to the App Service
  • Note: this requires loading Microsoft.Web.XmlTransform.dll, so make sure that assembly is available alongside the script.

    param(
        [string]$KuduPath,
        [string[]]$XDTs,
        [string]$TenantId,
        [string]$SubscriptionId,
        [string]$ResourceGroupName,
        [string]$WebAppServiceName,
        [string]$SlotName = "",
        [string]$ServicePrincipalID,
        [string]$ServicePrincipalKey,
        [hashtable]$Tokens
    )
    
    function Get-AzureRmWebAppPublishingCredentials($resourceGroupName, $webAppName, $slotName = $null){
    	if ([string]::IsNullOrWhiteSpace($slotName)){
    		$resourceType = "Microsoft.Web/sites/config"
    		$resourceName = "$webAppName/publishingcredentials"
    	}
    	else{
    		$resourceType = "Microsoft.Web/sites/slots/config"
    		$resourceName = "$webAppName/$slotName/publishingcredentials"
    	}
    	$publishingCredentials = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroupName -ResourceType $resourceType -ResourceName $resourceName -Action list -ApiVersion 2015-08-01 -Force
        	return $publishingCredentials
    }
    
    function Get-KuduApiAuthorisationHeaderValue($resourceGroupName, $webAppName, $slotName = $null){
        $publishingCredentials = Get-AzureRmWebAppPublishingCredentials $resourceGroupName $webAppName $slotName
        $ret = @{}
        $ret.header = ("Basic {0}" -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $publishingCredentials.Properties.PublishingUserName, $publishingCredentials.Properties.PublishingPassword))))
        $ret.url = $publishingCredentials.Properties.scmUri
        return $ret
    }
    
    function Get-FileFromWebApp($resourceGroupName, $webAppName, $slotName = "", $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"
    
        Write-Host " Downloading File from WebApp. Source: '$kuduApiUrl'." -ForegroundColor DarkGray
        $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xml"
        $null = Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method GET `
                            -ContentType "multipart/form-data" `
                            -OutFile $tmpPath
        $ret = Get-Content $tmpPath | Out-String
        Remove-Item $tmpPath -Force
        return $ret
    }
    function Write-FileToWebApp($resourceGroupName, $webAppName, $slotName = "", $fileContent, $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"
    
        Write-Host " Writing File to WebApp. Destination: '$kuduApiUrl'." -ForegroundColor DarkGray
    
        Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method Put `
                            -ContentType "multipart/form-data"`
                            -Body $fileContent
    }
    function Update-XmlDocTransform($xml, $xdt, $tokens)
    {
        Add-Type -LiteralPath "$PSScriptRoot\Microsoft.Web.XmlTransform.dll"
    
        $xmldoc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument;
        $xmldoc.PreserveWhitespace = $true
        $xmldoc.LoadXml($xml);
        $useTokens = $false
        if ($tokens -ne $null -and $tokens.Count -gt 0){
            $useTokens = $true
        $sb = [System.Text.StringBuilder]::new((Get-Content -Path $xdt -Raw))
            $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xdt"
            $tokens.Keys | ForEach-Object{
                $null = $sb.Replace($_, $tokens[$_])
            }
            Set-Content -Path $tmpPath -Value $sb.ToString()
            $xdt = $tmpPath
        }
        
        $transf = New-Object Microsoft.Web.XmlTransform.XmlTransformation($xdt);
        if ($transf.Apply($xmldoc) -eq $false)
        {
            throw "Transformation failed."
        }
        if ($useTokens){
            Remove-Item -Path $xdt -Force
        }
        return $xmldoc.OuterXml
    }
    $Credential = New-Object -TypeName PSCredential($ServicePrincipalID, (ConvertTo-SecureString -String $ServicePrincipalKey -AsPlainText -Force))
    
    
    # Connect to Azure using SP
    $connectParameters = @{
        Credential     = $Credential
        TenantId       = $TenantId
        SubscriptionId = $SubscriptionId
    }
    
    Write-Host 'Connecting to Azure.'
    
    $null = Add-AzureRmAccount @connectParameters -ServicePrincipal
    
    $contents = Get-FileFromWebApp `
        -resourceGroupName $ResourceGroupName `
        -webAppName $WebAppServiceName `
        -slotName $SlotName `
        -kuduPath $KuduPath
    $XDTs | Foreach-Object{
        $contents = Update-XmlDocTransform -xml $contents -xdt "$PSScriptRoot\XDT\$_.xdt" -tokens $Tokens
    }
    Write-FileToWebApp `
        -resourceGroupName $ResourceGroupName `
        -webAppName $WebAppServiceName `
        -slotName $SlotName `
        -kuduPath $KuduPath `
        -fileContent $contents
    

    Additionally, with some simple adaptations of this code you can use Kudu to perform whatever other kind of file manipulation you need.
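
    For example, the same authorisation header works against the rest of the Kudu VFS API. Here’s a quick sketch of listing a folder under wwwroot (the App_Config\Include path is just an illustration):

    # Sketch: list a directory under wwwroot via the Kudu VFS API (the trailing slash requests a directory listing).
    $kuduAuth = Get-KuduApiAuthorisationHeaderValue $ResourceGroupName $WebAppServiceName $SlotName
    $listUrl  = $kuduAuth.url + "/api/vfs/site/wwwroot/App_Config/Include/"
    $items    = Invoke-RestMethod -Uri $listUrl `
                                  -Headers @{ "Authorization" = $kuduAuth.header } `
                                  -Method GET
    $items | Select-Object name, size, mtime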

    Tokenize your XDTs with PowerShell

    XDTs have gotten a bad rep over the years for being difficult to use and hard to understand. Despite that, they’re still the most reliable and consistent way to transform configurations. I’ve come up with a way to tokenize XDTs so they can be used more flexibly.

    For example, say we have different cookie domains per environment that we want to patch in and out.

    Note: this code requires Microsoft.Web.XmlTransform.dll to be in the same folder as the PowerShell script.

    param(
        [string]$Path,
        [string[]]$XDTs,
        [hashtable]$Tokens
    )
    
    function Update-XmlDocTransform($xml, $xdt, $tokens)
    {
        Add-Type -LiteralPath "$PSScriptRoot\Microsoft.Web.XmlTransform.dll"
    
        $xmldoc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument;
        $xmldoc.PreserveWhitespace = $true
        $xmldoc.LoadXml($xml);
        $useTokens = $false
        if ($tokens -ne $null -and $tokens.Count -gt 0){
            $useTokens = $true
        $sb = [System.Text.StringBuilder]::new((Get-Content -Path $xdt -Raw))
            $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xdt"
            $tokens.Keys | ForEach-Object{
                $null = $sb.Replace($_, $tokens[$_])
            }
            Set-Content -Path $tmpPath -Value $sb.ToString()
            $xdt = $tmpPath
        }
        
        $transf = New-Object Microsoft.Web.XmlTransform.XmlTransformation($xdt);
        if ($transf.Apply($xmldoc) -eq $false)
        {
            throw "Transformation failed."
        }
        if ($useTokens){
            Remove-Item -Path $xdt -Force
        }
        return $xmldoc.OuterXml
    }
    
    $contents = Get-Content $Path | Out-String
    $XDTs | Foreach-Object{
        $contents = Update-XmlDocTransform -xml $contents -xdt $_ -tokens $Tokens
    }
    Set-Content $path -Value $contents
    

    Here is an example usage:

    LocalXmlTransform.ps1 -Path "C:\inetpub\wwwroot\sc901.local\web.config" -XDTs "C:\xdt\AddBindingRedirects.xdt","C:\xdt\AddSessionCookie.xdt" -Tokens @{_ShareSessionCookie_="mysite.local";_RedirectName_="mydependency"}

    In this example we’re running two XDT files against the web.config and replacing a couple of tokens in the XDT.

    Here is an example of an XDT with tokens to ensure a connection string exists:

    <?xml version="1.0" encoding="utf-8"?>
    <connectionStrings xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform" xmlns:asmv1="urn:schemas-microsoft-com:asm.v1">
      <add name="_name_" xdt:Transform="Remove" xdt:Locator="Match(name)" />
      <add name="_name_" xdt:Transform="InsertIfMissing" xdt:Locator="Match(name)" connectionString="Encrypt=True;TrustServerCertificate=False;Data Source=_fqdn_;Initial Catalog=_databasename_;User Id=_username_;Password=_password_;" />
    </connectionStrings>
    

    To use this XDT your parameters would look something like this:

    LocalXmlTransform.ps1 -Path "C:\inetpub\wwwroot\App_Config\ConnectionStrings.config" -XDTs "C:\xdt\EnsureConnectionString.xdt" -Tokens @{_name_="mySpecialDatabase";_fqdn_="myazurestuff.database.windows.net,1433";_databasename_="specialdatabase";_username_="secretuser";_password_="secretpassword"}

    Hopefully this will help your DevOps process.

    PAAS Sitecore 9 with an ASE ARM template errors

    Sitecore 9 works great in PaaS, and the ARM templates are an enormous help. However, if you’re like me and need to use an ASE, you’ll find that your deployments regularly and mysteriously fail. I pored over the ARM templates searching for any reason this might be happening. After about a month I accepted the unfortunate truth: Azure was incorrectly reporting success earlier than it should.

    I started pulling apart the templates searching for more information, and I built a custom PowerShell wrapper to manage the ARM template parameters.

    The errors originated from the application deployments. These are the parts that use Web Deploy to restore databases, create users, and push files to your Sitecore server.

    How to stabilize the ARM templates

    For this I will assume that you’ve already added the hostingEnvironmentProfile parameter to the Microsoft.Web/sites ARM resources.
    Warning: this process is very time-consuming.
    The first step is to pull the templates apart. I was able to achieve a high success rate by doing the following:

    1. Take the main azuredeploy.json and remove all of the resources; we’re going to execute them manually
    2. In each of the nested ARM template JSON files, make sure the required parameters are defined in the parameters and variables sections; you can refer back to azuredeploy.json for how these should be set up
    3. The application.json file is the primary culprit behind our failures. We need to split it up just like we did azuredeploy.json, except this time we’re creating a new ARM template JSON file for each of the 4 Web Deploy deployments that reside in application.json
    4. Now that we have the ARM templates separated out into their individual parts, we need to create a new PowerShell wrapper for the process (a sketch of executing one of the split-out templates follows this list)
    5. Note: for security reasons I’m largely omitting things of a sensitive nature here. Make sure you supply user names and passwords to your input parameters, either in a parameters.json or in the parameters PowerShell hashtable as described below
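
    As a rough illustration of what executing one of the split-out nested templates manually might look like (the template path, resource group, and parameter values below are placeholders, not the actual Sitecore template names):

    # Hypothetical example only: deploy a single split-out nested template by itself.
    # Template path, resource group, and parameter names/values are placeholders.
    $parameters = @{
        deploymentId = "sc9-xp"
        location     = "East US 2"
        # ...plus whatever parameters that particular nested template requires
    }
    New-AzureRmResourceGroupDeployment -Name "infrastructure" `
                                       -ResourceGroupName "sc9-rg" `
                                       -TemplateFile ".\nested\infrastructure.json" `
                                       -TemplateParameterObject $parameters `
                                       -Verbose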

      PowerShell Magic

      You can find the scripts here.

      Due to the lack of a central ARM template to orchestrate parameters, we need to do that ourselves. This comes in a few steps:

      1. Populate all starting parameters in a hashtable. See Execute.ps1 for an example. Note that you will need to pass in several more parameters, or you can include them in a parameters.json that’s loaded here
      2. Scan the ARM templates and gather their accepted parameters, as they won’t take any extras. See Get-ValidParameters in Utilities.ps1
      3. Based on each ARM template, gather up the parameters needed for the deployment and generate a new hashtable of parameters and their values. See Get-Parameters in Utilities.ps1 (a rough sketch of this idea follows this list)
      4. Execute the ARM template using a modified version of Sitecore’s ARM template execution code. See Start-SitecoreAzureDeployment in Utilities.ps1
      5. After completion, extract the populated parameters and outputs and save them using Get-ValidParameters from Utilities.ps1
      6. Repeat until finished. You can see how the ARM templates are ordered here.
        Note: depending on your specific case, you may need to adjust the timing between deployments if some of them need more time to settle
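
      The real implementations live in the linked scripts; as a rough sketch of the idea behind step 3 (the function name here is made up, not the actual Get-Parameters/Get-ValidParameters code):

      # Sketch only: reduce a master hashtable of values down to the parameters a given template accepts.
      function Get-TemplateParameterSubset {
          param(
              [string]$TemplatePath,
              [hashtable]$AllValues
          )
          $template = Get-Content -Path $TemplatePath -Raw | ConvertFrom-Json
          $accepted = $template.parameters.PSObject.Properties.Name
          $subset = @{}
          foreach ($name in $accepted) {
              if ($AllValues.ContainsKey($name)) {
                  $subset[$name] = $AllValues[$name]
              }
          }
          return $subset
      }

      The resulting hashtable can then be passed as -TemplateParameterObject to the deployment call shown earlier.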

    Sitecore Helix PowerShell Filewatch

    Most likely, if you’re developing with Sitecore, you have your webroot and your source separated and publish to your site with Web Deploy or some other publishing technology. This is a fine way to do it, but it’s far easier if it just happens automatically. That’s what these scripts aim to do with PowerShell!

    Filewatch

    Here is a zip file for the completed solution that is covered below: HelixFileWatcher. Or check out the source on GitHub.

    First we need to define a few parameters:

    #where your solution is
    $SourceDirectory = "D:\Source\SitecoreSource\SolutionRoot"
    #where your webroot is
    $DeployTargetWebPath = "C:\inetpub\wwwroot\sc90.local"
    
    

    Next we define how files are moved from your solution to the webroot. This is done through a hashtable matching a file extension to a script block. Note that the views are being deployed to a “Demo” MVC Area as an example.

    $global:FileWatchActions = @{}
    function Get-ProjectRoot{
    	param(
    		[string]$Path
    	)
    	if ($path -eq [string]::Empty){
    		return [string]::Empty
    	}
    	if (-Not (Test-Path $Path)){
    		return Get-ProjectRoot -Path (split-Path $Path)
    	}
    	$PathItem = Get-Item -Path $Path
    	if (-Not ($PathItem -is [System.IO.DirectoryInfo])){
    		return Get-ProjectRoot -Path (Split-Path $Path)
    	}
    	if ((resolve-path "$Path\*.csproj").Count -gt 0){
    		return $Path
    	}elseif($PathItem.Parent -ne $null){
    		return Get-ProjectRoot -Path $PathItem.Parent.FullName
    	}
    	return [string]::Empty
    }
    function Copy-ItemToWebroot{
    	param(
    		$Path,
    		$OldPath,
    		$Delete,
    		$Index,
    		$IntermediatePath
    	)
    	if ($Index -lt 0){
    		return
    	}
    	
    	$TargetPath = $DeployTargetWebPath + $IntermediatePath + $Path.Substring($Index)
    	if ($Delete -and (Test-Path $TargetPath)){
    		write-host "Removing file $TargetPath" -ForegroundColor Red
    		Remove-Item $TargetPath -Force -Recurse
    	}elseif (-Not (Test-Path $Path) -and (Test-Path $TargetPath)){
    		write-host "Removing file $TargetPath" -ForegroundColor Red
    		Remove-Item $TargetPath -Force -Recurse
    	}elseif(Test-Path $Path){
    		if ($OldPath -ne [string]::Empty){
    			$OldTargetPath = $DeployTargetWebPath + $IntermediatePath + $OldPath.Substring($Index)
    			if ((Test-Path $OldTargetPath) -and ((Split-Path $Path) -eq (Split-Path $OldPath) )){
    				$newName = Split-Path $Path -Leaf -Resolve
    				write-host "Renaming Item" -ForegroundColor Yellow
    				write-host "    $OldTargetPath" -ForegroundColor Yellow
    				write-host "    =>$TargetPath" -ForegroundColor Yellow
    				Rename-Item $OldTargetPath $newName -Force
    				return
    			}
    		}
    		if (-Not (Test-Path $TargetPath) -or (Compare-Object (ls $Path) (ls $TargetPath) -Property Name, Length, LastWriteTime)){
    			write-host "Copying Item" -ForegroundColor Green
    			write-host "    $Path" -ForegroundColor Green
    			write-host "    =>$TargetPath" -ForegroundColor Green
    			New-Item -Path "$(Split-Path $TargetPath)" -ItemType Directory -Force
    			Copy-Item -Path $Path -Destination $TargetPath -Recurse -Force
    		}
    	}
    }
    
    #Add watcher action configurations
    #Based on extension define how to process the files that are changed
    $global:FileWatchActions.Add(".cshtml", {
    	param(
    		$Path,
    		$OldPath,
    		$Delete
    	)
    	$index = $Path.IndexOf("\Views", 5)
    	Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index -IntermediatePath "\Areas\Demo"
    } )
    
    $global:FileWatchActions.Add(".config", {
    	param(
    		$Path,
    		$OldPath,
    		$Delete
    	)
    	$index = $Path.IndexOf("\App_Config\Include", 5)
    	Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index
    	if ($index -eq -1){
    		$fileName = Split-Path $Path -Leaf
    		$FileDirectory = Get-ProjectRoot -Path $Path
    		if ($fileName.StartsWith("web", "CurrentCultureIgnoreCase")){
    			Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $FileDirectory.Length -IntermediatePath "\Areas\Demo"		
    		}
    	}
    } )
    
    $global:FileWatchActions.Add(".dll", {
    	param(
    		$Path,
    		$OldPath,
    		$Delete
    	)
    	$index = $Path.IndexOf("\bin", 5)
    	Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index	
    } )
    
    $global:FileWatchActions.Add("folder", {
    	param(
    		$Path,
    		$OldPath,
    		$Delete
    	)
    	if (-Not( $delete -or $OldPath -ne [string]::Empty)){
    		return
    	}
    	$index = $Path.IndexOf("\Views", 5)
    	if ($index -ne -1){
    		Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index -IntermediatePath "\Areas\Demo"
    		return		
    	}
    	$index = $Path.IndexOf("\App_Config\Include", 5)
    	if ($index -ne -1){
    		Copy-ItemToWebroot -Path $Path -OldPath $OldPath -Delete $Delete -Index $index -IntermediatePath "\App_Config\Include"
    		return		
    	}
    })
    
    

    Then we set up the file watchers to watch the important parts of our code

    $global:LastEvent = ((Get-Date).AddSeconds(2).ToString('HH:mm:ss.fff'))
    function global:Send-ChangesToWebroot{
    	param(
    	[string]$Path = [string]::Empty,
    	[string]$OldPath = [string]::Empty,
    	[bool]$Delete = $false
    	)
    	$extension = [IO.Path]::GetExtension($Path)
    	$IsDirectory = $false
    	if (Test-Path $Path){
    		$IsDirectory= (Get-Item -Path $Path) -is [System.IO.DirectoryInfo]
    	}elseif ($Delete -and $extension -eq [string]::Empty){
    		$IsDirectory = $true;
    	}
    	try{
    		if (-Not $IsDirectory -and $global:FileWatchActions.ContainsKey($extension)){
    			$global:LastEvent = ((Get-Date).AddSeconds(2).ToString('HH:mm:ss.fff'))
    			$global:FileWatchActions.Get_Item($extension).Invoke($Path, $OldPath, $Delete)
    		}elseif ($IsDirectory){
    			$global:LastEvent = ((Get-Date).AddSeconds(2).ToString('HH:mm:ss.fff'))
    			$global:FileWatchActions.Get_Item("folder").Invoke($Path, $OldPath, $Delete)
    		}
    	}catch [System.Exception]{
    		Write-Host "An error has occurred while attempting to run the processor for $extension" -ForegroundColor Red
    		Write-Host "Path: $Path" -ForegroundColor Red
    		Write-Host "OldPath: $OldPath" -ForegroundColor Red
    		Write-Host $_.Exception.ToString() -ForegroundColor Red
    	}
    }
    function Add-Watcher{
    	param(
    		$Directory
    	)
    	$Watcher = New-Object IO.FileSystemWatcher $Directory, "*" -Property @{IncludeSubdirectories = $true;NotifyFilter = [IO.NotifyFilters]'FileName, DirectoryName, LastWrite, Size'}
    	
    	Register-ObjectEvent $Watcher Changed -SourceIdentifier "$Directory FileChanged" -Action {Send-ChangesToWebroot -Path $Event.SourceEventArgs.FullPath}
    	
    	Register-ObjectEvent $Watcher Renamed -SourceIdentifier "$Directory FileRenamed" -Action {Send-ChangesToWebroot -Path $Event.SourceEventArgs.FullPath -OldPath $Event.SourceEventArgs.OldFullPath}
    	
    	Register-ObjectEvent $Watcher Deleted -SourceIdentifier "$Directory FileDeleted" -Action {Send-ChangesToWebroot -Path $Event.SourceEventArgs.FullPath -Delete $true}
    	
    	Register-ObjectEvent $Watcher Created -SourceIdentifier "$Directory FileCreated" -Action {Send-ChangesToWebroot -Path $Event.SourceEventArgs.FullPath}
    	
    	$Watcher.EnableRaisingEvents = $true
    }
    Resolve-Path "$SourceDirectory/*/App_Config/Include" | ForEach-Object{
    	Write-Host "Adding watch location: $_" -ForegroundColor Yellow
    	Add-Watcher $_ | Out-Null
    }
    
    Resolve-Path "$SourceDirectory/*/Views" | ForEach-Object{
    	Write-Host "Adding watch location: $_" -ForegroundColor Yellow	
    	Add-Watcher $_ | Out-Null
    }
    
    Resolve-Path "$SourceDirectory/*/bin" | ForEach-Object{
    	Write-Host "Adding watch location: $_" -ForegroundColor Yellow
    	Add-Watcher $_ | Out-Null
    }
    
    Resolve-Path "$SourceDirectory/*/Assets" | ForEach-Object {
    	Write-Host "Adding watch location: $_" -ForegroundColor Yellow
    	Add-Watcher $_ | Out-Null
    }
    
    Write-Host ([string]::Empty)
    Write-Host "Now watching for changes made in the repo." -ForegroundColor Yellow
    Write-Host "Any changes made will be delivered to the Webroot automatically" -ForegroundColor Yellow
    Write-Host "***************************************************************" -ForegroundColor Yellow
    while($true){
    	#sleep more quickly when changes are happening
    	if ($global:LastEvent -gt ((Get-Date).ToString('HH:mm:ss.fff'))){
    		Start-Sleep -m 5
    	}else{
    		Start-Sleep 1
    	}
    }
    
    

    Unicorn Automated Operations Authentication

    I was attempting to set up some PowerShell scripts to deploy Unicorn content, and I was running into a strange issue where the Unicorn PowerShell module’s sync request was getting redirected to the Sitecore login page.

    Through some debugging I was able to track this down to the fact that another developer had checked in a different shared secret. Once the shared secret was fixed, it was smooth sailing.
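
    For reference, the sync call in question looks something like this (the URL and secret are placeholders); the -SharedSecret value has to match the shared secret configured for Unicorn on the Sitecore server, which is what had drifted in my case:

    # Placeholder values; the secret must match the one in the Unicorn configuration on the server.
    Import-Module .\Unicorn.psm1
    Sync-Unicorn -ControlPanelUrl "https://sc901.local/unicorn.aspx" -SharedSecret 'your-long-random-shared-secret'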

    Really, the takeaway is: always double-check your shared secrets whenever there’s a problem with an automated tool.

    Migrating Sidekick App 1.(1-2) to 1.4

    Upgrading your Sidekick binaries and finding that you can no longer resolve your ScsHttpHandler type? You will need to migrate.

    Why did this happen? Previously the ScsHttpHandler was a custom HTTP handler built from the ground up, which allowed a guaranteed isolated context. However, I decided that the security and feature set of MVC was a far better medium for these apps, and easier to understand than something custom. Unfortunately this means some changes are required to convert your current apps to the new model.

    The old way

    namespace ScsJobViewer
    {
    	public class JobViewerHandler : ScsHttpHandler
    	{
    		//THE BELOW STUFF IS REGISTRATION STUFF, IT NOW BELONGS IN THE REGISTRATION FILE
    		public JobViewerHandler(string roles, string isAdmin, string users)
    			: base(roles, isAdmin, users)
    		{
    		}
    
    		public override string Directive { get; set; } = "jvdirective";
    		public override NameValueCollection DirectiveAttributes { get; set; }
    		public override string ResourcesPath { get; set; } = "ScsJobViewer.Resources";
    		public override string Icon => "/scs/jvgearwheels.png";
    		public override string Name => "Job Viewer";
    		public override string CssStyle => "width:600px";
    		//THE BELOW STUFF IS THE CONTROLLER STUFF, IT NOW BELONGS IN THE CONTROLLER
    		public override void ProcessRequest(HttpContextBase context)
    		{
    			string file = this.GetFile(context);
    
    			if (file == "jvgetjobs.json")
    			{
    				this.ReturnJson(context, this.GetJobs(context));
    			}
    			else
    			{
    				this.ProcessResourceRequest(context);
    			}
    		}
    		private object GetJobs(HttpContextBase context)
    		{
    			var data = GetPostData(context);
    			var model = JobManager.GetJobs().Where(x => data.running ? !x.IsDone : x.IsDone).OrderBy(x => x.QueueTime);
    			return model.Select(x => new JobModel(x));
    		}
    	}
    }
    

    Step 1

    Take the routing part of your ScsHttpHandler and move it into a controller deriving from ScsController.
    Your old code:

    		public override void ProcessRequest(HttpContextBase context)
    		{
    			string file = this.GetFile(context);
    
    			if (file == "jvgetjobs.json")
    			{
    				this.ReturnJson(context, this.GetJobs(context));
    			}
    			else
    			{
    				this.ProcessResourceRequest(context);
    			}
    		}
    		private object GetJobs(HttpContextBase context)
    		{
    			var data = GetPostData(context);
    			var model = JobManager.GetJobs().Where(x => data.running ? !x.IsDone : x.IsDone).OrderBy(x => x.QueueTime);
    			return model.Select(x => new JobModel(x));
    		}
    

    IMPORTANT
    The action method should be decorated with an ActionName attribute that matches the request name made from the angular factory.
    It should be mapped like this:

    	public class ScsJobViewerController : ScsController
    	{
    		//The action name should match what the angular factory is calling, note that case sensitivity isn't an issue.
    		[ActionName("jvgetjobs.json")]
    		public ActionResult GetJobs(bool running)
    		{
    			var model = JobManager.GetJobs().Where(x => running ? !x.IsDone : x.IsDone).OrderBy(x => x.QueueTime);
    			//Wrap the result so it satisfies the ActionResult return type
    			return Json(model.Select(x => new JobModel(x)), JsonRequestBehavior.AllowGet);
    		}
    	}
    

    Step 2

    Take the registration part of your ScsHttpHandler and move it into a class that inherits from ScsRegistration.
    Your old code:

    		public JobViewerHandler(string roles, string isAdmin, string users)
    			: base(roles, isAdmin, users)
    		{
    		}
    
    		public override string Directive { get; set; } = "jvdirective";
    		public override NameValueCollection DirectiveAttributes { get; set; }
    		public override string ResourcesPath { get; set; } = "ScsJobViewer.Resources";
    		public override string Icon => "/scs/jvgearwheels.png";
    		public override string Name => "Job Viewer";
    		public override string CssStyle => "width:600px";
    

    Should be translated like so:
    IMPORTANT
    There is a new field, Identifier: a two-letter code that is unique to your app and is used for routing (which we will address later). Additionally, you need to define your controller type.

    
    	class ScsJobViewerRegistration : ScsRegistration
    	{
    		public ScsJobViewerRegistration(string roles, string isAdmin, string users) : base(roles, isAdmin, users)
    		{
    		}
    
    		public override string Identifier => "jv";
    		public override string Directive => "jvmasterdirective";
    		public override NameValueCollection DirectiveAttributes { get; set; }
    		public override string ResourcesPath => "ScsJobViewer.Resources";
    		public override Type Controller => typeof(ScsJobViewerController);
    		public override string Icon => "/scs/jv/resources/jvgearwheels.png";
    		public override string Name => "Job Viewer";
    		public override string CssStyle => "min-width:600px;";
    	}
    

    Step 3

    Update your relative paths
    In order to automate the routing, a more specific route is now defined, which requires adjusting all the paths defined in your angular factory.

    /scs/jvgearwheels.png => /scs/jv/resources/jvgearwheels.png

    /scs/jvgetjobs.json => /scs/jv/jvgetjobs.json

    Step 4

    Update your config file

    <processor type="ScsJobViewer.JobViewerHandler, ScsJobViewer" >
    

    should now point to the registration type

    <processor type="ScsJobViewer.ScsJobViewerRegistration, ScsJobViewer" >
    

    Sharing Header/Footer across platforms

    I had a requirement that a site I was building had to have its header and footer sourced from a site owned by the parent company, on a separate platform. Sounds a bit insane, but doable.

    Make certain your site is extremely clean for JS and CSS

    The first thing you’re going to want to verify is that you have nothing targeting general elements. For example, all your styling should be done by very specific class targeting: something like “my-secret-class” is great, whereas “form” not so much. Even worse would be styling root-level elements, such as assigning styles directly to the li element.

    In short, don’t use any JS/CSS that could interfere with things coming from the other domain.

    Scrape and cache

    Next you’ll want to scrape the source site and parse out their header/footer and all CSS/JS using HtmlAgilityPack.

    		private readonly Dictionary<string, string> _referrerHeaders = new Dictionary<string, string>();
    		private readonly Dictionary<string, string> _referrerFooter = new Dictionary<string, string>();
    		private readonly object _refreshLocker = new object();
    
    		public virtual string GetHeader()
    		{
    			lock (_refreshLocker)
    			{
    				ValidateUrl(GetOriginModel()?.ReturnUrl);
    				string ret;
    				_referrerHeaders.TryGetValue(GetOriginModel()?.ReturnUrl ?? "", out ret);
    				return ret ?? "";
    			}
    		}
    		public virtual string GetFooter()
    		{
    			lock (_refreshLocker)
    			{
    				ValidateUrl(GetOriginModel()?.ReturnUrl);
    				string ret;
    				_referrerFooter.TryGetValue(GetOriginModel()?.ReturnUrl ?? "", out ret);
    				return ret ?? "";
    			}
    		}
    
    		public virtual void ValidateUrl(string url)
    		{
    			if (string.IsNullOrWhiteSpace(url) || url.StartsWith("/"))
    				return;
    			if (!_referrerHeaders.ContainsKey(url))
    			{
    				HtmlDocument doc = new HtmlDocument();
    				using (WebClient wc = new WebClient())
    				{
    					wc.Encoding = Encoding.UTF8;
    					doc.LoadHtml(wc.DownloadString(url));
    				}
    				_referrerHeaders[url] = GenerateHeader(url, doc);
    				_referrerFooter[url] = GenerateFooter(doc);
    			}
    		}
    		public virtual string GenerateFooter(HtmlDocument doc)
    		{
    			return GetNodesByAttribute(doc, "class", "site-footer").FirstOrDefault()?.OuterHtml;
    		}
    
    		public virtual string GenerateHeader(string url, HtmlDocument doc)
    		{
    			Uri uri = new Uri(url);
    			string markup =  GetNodesByAttribute(doc, "class", "site-header").FirstOrDefault()?.OuterHtml.Replace("action=\"/", $"action=\"https://{uri.Host}/");
    			string svg = GetNodesByAttribute(doc, "class", "svg-legend").FirstOrDefault()?.OuterHtml;
    			string stylesheets =
    				GetNodesByAttribute(doc, "rel", "stylesheet")
    					.Aggregate(new StringBuilder(), (tags, cur) => tags.Append(cur.OuterHtml.Replace("href=\"/bundles", $"href=\"https://{uri.Host}/bundles")))
    					.ToString();
    			string javascripts =
    				doc.DocumentNode.SelectNodes("//script")
    					.Aggregate(new StringBuilder(), (tags, cur) =>
    					{
    						if (cur.OuterHtml.Contains("gtm.js"))
    							return tags;
    						return tags.Append(cur.OuterHtml.Replace("src=\"/bundles", $"src=\"https://{uri.Host}/bundles"));
    					})
    					.ToString();
    
    			return $"{svg}{stylesheets}{markup}{javascripts}";
    		}
    
    		public virtual HtmlNodeCollection GetNodesByAttribute(HtmlDocument doc, string attribute, string value)
    		{
    			return doc.DocumentNode.SelectNodes($"//*[contains(@{attribute},'{value}')]");
    		}
    

    NOTE: You’ll likely need to heavily customize your GenerateHeader and GenerateFooter methods.

    Let’s break this down a bit, as it’s a little hard to follow.

    1. You pass in a URL that you want to source your headers and footers from.
    2. It checks the cache to see if we already have that header/footer.
    3. Using a WebClient, it scrapes the markup off the source page.
    4. By whatever means we can, we identify where the header and footer markup lives; in this case it’s identifiable by the “site-header” and “site-footer” classes, which makes it easier.
    5. We turn any relative links into absolute links, since relative links won’t work once the markup is served from a separate domain.
    6. We grab their SVG sprite definition; we’ll need that or their icons will be blank.
    7. We grab all stylesheets and scripts, making sure to strip out the things that don’t make sense on a case-by-case basis, like the other domain’s tracking libraries.
    8. We store this information in the cache.

    Make sure you periodically clear the caches to pick up changes from the source. I did this simply, like so:

    		public SiteComponentShareService()
    		{
    			Timer t = new Timer(600 * 1000);
    			t.Elapsed += (sender, args) =>
    			{
    				lock (_refreshLocker)
    				{
    					_referrerHeaders.Clear();
    					_referrerFooter.Clear();
    				}
    			};
    			t.Start();
    		}
    

    This clears the cached objects every 10 minutes, taking the same lock to make sure it doesn’t clear the cache while something is trying to use it.

    Finishing Touches

    The acquired header and footer may have fancy XHR needs that have to be accounted for; very likely you’ll need to proxy requests. For example, I needed to catch search suggestions and pass them through to their server’s HawkSearch endpoint.

    		[Route("hawksearch/proxyautosuggest/{target}")]
    		public ActionResult RerouteAutosuggest(string target)
    		{
    			WebClient wc = new WebClient();
    			string ret = wc.DownloadString(
    				$"https://www.parentsitewherewefoundtheheaders.org/hawksearch/proxyAutoSuggest/{target}?{Request.QueryString}");
    			return Content(ret);
    
    		}
    

    As you can see, we’re simply catching the request and passing it along to their domain’s endpoint. Since we have the same JavaScript code and the same headers, this simple pass-through makes it seem like we have the exact same header.