Dude, Where’re my logs? (Azure)

If you’re new to the world of Sitecore in Azure PaaS, there’s a good chance you popped open Kudu, browsed to the App_Data/Logs folder, and said to yourself “oh yeah, it’s in Application Insights or something…”. Then, after going to Application Insights and pushing buttons haphazardly, you arrived at something that kind of looks like a log. It can be confusing and concerning to feel like you have no way to debug a problem. I’m going to go over the various ways of retrieving debug information for your Sitecore App Services.

Application Insights

This is where the vast majority of your logs are going to be. It’s not a great format and leaves me wanting more from the tool, but here’s how you use it:

  1. Navigate to your site’s Application Insights Azure resource
  2. In the Overview tab, select the Analytics button
  3. Under the traces table, execute a query, for example:
  4. traces
    | where customDimensions.Role == "CM" and severityLevel == 3
  5. The results will not be ordered properly; click the timestamp column header to order by date
  6. Application Insights has some handy auto-complete features to help you build a custom query to get exactly the data you’re looking for
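
If you would rather pull the same data from a script, the Application Insights REST API can run the identical query. This is a minimal sketch, not part of the original walkthrough; the application ID and API key are placeholders you create under the resource’s API Access blade.

$appId  = "00000000-0000-0000-0000-000000000000"   # Application Insights Application ID (placeholder)
$apiKey = "<api-key>"                              # API key with read telemetry permission (placeholder)
$query  = 'traces | where customDimensions.Role == "CM" and severityLevel == 3 | order by timestamp desc'

$result = Invoke-RestMethod `
    -Uri "https://api.applicationinsights.io/v1/apps/$appId/query?query=$([uri]::EscapeDataString($query))" `
    -Headers @{ "x-api-key" = $apiKey } `
    -Method Get

# Rows come back as arrays of column values in the first table of the response
$result.tables[0].rows | Select-Object -First 20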

NOTE: While Application Insights provides a good way to track and query log data, there do seem to be particular cases where the application does not properly submit log data to Application Insights. This leads us to the next method.

Log Streaming

A more root-level logging solution is the log streaming option offered by the App Service. This can provide a more reliable but less pleasant source of logs, and it works well if you have an easily reproducible scenario. It provides the data in a more traditional format that many Sitecore developers will be comfortable with, and it can give you more accurate and complete logging. It is important to note, however, that the logs get placed on the filesystem, so they will affect your filesystem size.

  1. Open the Diagnostics logs tab and turn on all the streaming logs settings.
  2. In the Log stream you will now see logs coming in in real time
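
Because the streamed logs also land on the filesystem under D:\home\LogFiles, you can pull them down in bulk with Kudu’s zip API. This is a sketch that reuses the Get-KuduApiAuthorisationHeaderValue helper shown later in this post; the resource group and app names are placeholders.

. "$PSScriptRoot\Get-KuduUtility.ps1"

$kuduAuth = Get-KuduApiAuthorisationHeaderValue "my-rg" "my-cd-app"
# GET on the zip API downloads the whole folder as a single zip file
Invoke-RestMethod -Uri ($kuduAuth.url + "/api/zip/LogFiles/") `
                  -Headers @{ "Authorization" = $kuduAuth.header } `
                  -Method Get `
                  -OutFile "$PSScriptRoot\LogFiles.zip"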

Application logs

Some IIS-level events and errors will find their way into the underlying filesystem; you can use Kudu to access them.

  1. First you need to access Kudu
  2. Using either the CMD or PowerShell debug console, navigate to D:\home\LogFiles and open eventlog.xml
  3. Here you will find IIS events and errors that may uncover more catastrophic failures that never get recorded in Application Insights
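
The same file can be fetched without opening a browser. This sketch again leans on the Kudu helper functions from later in the post; the VFS API is rooted at D:\home, so LogFiles/eventlog.xml maps to D:\home\LogFiles\eventlog.xml. The resource group and app names are placeholders.

. "$PSScriptRoot\Get-KuduUtility.ps1"

$kuduAuth = Get-KuduApiAuthorisationHeaderValue "my-rg" "my-cm-app"
Invoke-RestMethod -Uri ($kuduAuth.url + "/api/vfs/LogFiles/eventlog.xml") `
                  -Headers @{ "Authorization" = $kuduAuth.header } `
                  -Method Get `
                  -OutFile "$PSScriptRoot\eventlog.xml"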

Azure App Service Logging

Sometimes, despite all other options, the problem persists. This is when we must look at Azure’s own health, because on occasion Azure events will impact our environments negatively without any notification.

  1. On the App Service, select the Diagnose and solve problems tab
  2. There are several reports in this interface that are definitely worth an in-depth look. I’ll focus on the Web App Restarted report.
    If you find that your app pool seems to be recycling too often, this is probably where you need to look.
  3. This report will tell you why Azure restarted your App Service

Remotely triggering Sitecore Operations in Azure

Sometimes you want your build/release system to execute a Sitecore process. I’ve found that this method works pretty well. To date I’ve used this same model for:

  1. Sitecore Publish
  2. Index rebuild
  3. Package install

Note: this is an adaptation of the strategy pioneered in this blog post.

There are three separate concerns in this technique:

  1. Web service to perform the operation
  2. Kudu for dynamic service installation
  3. Powershell to execute the operation

Step 1 – create the service

This code will run a publish


using System.Collections.Generic;
using System.Linq;
using System.Web.Services;
using Sitecore.Jobs;


[WebService(Namespace = "http://tempuri.org/")]
[WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
[System.ComponentModel.ToolboxItem(false)]
public class PublishManager : System.Web.Services.WebService
{
	[WebMethod(Description = "Publishes all content")]
	public bool PublishAll(string token)
	{
		if (string.IsNullOrEmpty(token))
			return false;
		if (token != "[TOKEN]")
			return false;
		var db = Sitecore.Configuration.Factory.GetDatabase("master");
		var item = db.GetRootItem();
		var publishingTargets = Sitecore.Publishing.PublishManager.GetPublishingTargets(item.Database);

		foreach (var publishingTarget in publishingTargets)
		{
			var targetDatabaseName = publishingTarget["Target database"];
			if (string.IsNullOrEmpty(targetDatabaseName))
				continue;

			var targetDatabase = Sitecore.Configuration.Factory.GetDatabase(targetDatabaseName);
			if (targetDatabase == null)
				continue;

			var publishOptions = new Sitecore.Publishing.PublishOptions(
				item.Database,
				targetDatabase,
				Sitecore.Publishing.PublishMode.Smart,
				item.Language,
				System.DateTime.Now);

			var publisher = new Sitecore.Publishing.Publisher(publishOptions);
			publisher.Options.RootItem = item;
			publisher.Options.Deep = true;
			publisher.PublishAsync();
		}
		return true;
	}
	[WebMethod(Description = "Checks publish status")]
	public string[] PublishStatus()
	{
		return JobManager.GetJobs().Where(x => !x.IsDone && x.Name.StartsWith("Publish")).Select(x =>
			x.Status.Processed + " -> " + x.Name).ToArray();
	}
}

This asmx service has two methods. The first initiates a publish to all publishing targets, using the root Sitecore item as the starting point. The second checks the status. It’s important to do it this way because Azure has a shortish forced request timeout of something like 3-5 minutes, which a publish can easily surpass. To avoid this, we trigger the publish asynchronously and then use the second method to check the status until the publish is completed.

Step 2 – Kudu PowerShell scripts

Note: This PowerShell code requires that you have an authenticated Azure session against the appropriate Azure subscription.

function Get-AzureRmWebAppPublishingCredentials($resourceGroupName, $webAppName, $slotName = $null){
	if ([string]::IsNullOrWhiteSpace($slotName) -or $slotName.ToLower() -eq "production"){
		$resourceType = "Microsoft.Web/sites/config"
		$resourceName = "$webAppName/publishingcredentials"
	}
	else{
		$resourceType = "Microsoft.Web/sites/slots/config"
		$resourceName = "$webAppName/$slotName/publishingcredentials"
	}
	$publishingCredentials = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroupName -ResourceType $resourceType -ResourceName $resourceName -Action list -ApiVersion 2015-08-01 -Force
    	return $publishingCredentials
}

function Get-KuduApiAuthorisationHeaderValue($resourceGroupName, $webAppName, $slotName = $null){
    $publishingCredentials = Get-AzureRmWebAppPublishingCredentials $resourceGroupName $webAppName $slotName
    $ret = @{}
    $ret.header = ("Basic {0}" -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $publishingCredentials.Properties.PublishingUserName, $publishingCredentials.Properties.PublishingPassword))))
    $ret.url = $publishingCredentials.Properties.scmUri
    return $ret
}

function Get-FileFromWebApp($resourceGroupName, $webAppName, $slotName = "", $kuduPath){
    $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
    $kuduApiAuthorisationToken = $KuduAuth.header
    $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"

    Write-Host " Downloading File from WebApp. Source: '$kuduApiUrl'." -ForegroundColor DarkGray
    $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xml"
    $null = Invoke-RestMethod -Uri $kuduApiUrl `
                        -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                        -Method GET `
                        -ContentType "multipart/form-data" `
                        -OutFile $tmpPath
    $ret = Get-Content $tmpPath | Out-String
    Remove-Item $tmpPath -Force
    return $ret
}

function Write-FileToWebApp($resourceGroupName, $webAppName, $slotName = "", $fileContent, $kuduPath){
    $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
    $kuduApiAuthorisationToken = $KuduAuth.header
    $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"

    Write-Host " Writing File to WebApp. Destination: '$kuduApiUrl'." -ForegroundColor DarkGray

    Invoke-RestMethod -Uri $kuduApiUrl `
                        -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                        -Method Put `
                        -ContentType "multipart/form-data"`
                        -Body $fileContent
}
function Write-FileFromPathToWebApp($resourceGroupName, $webAppName, $slotName = "", $filePath, $kuduPath){
    $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
    $kuduApiAuthorisationToken = $KuduAuth.header
    $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"

    Write-Host " Writing File to WebApp. Destination: '$kuduApiUrl'." -ForegroundColor DarkGray

    Invoke-RestMethod -Uri $kuduApiUrl `
                        -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                        -Method Put `
                        -ContentType "multipart/form-data"`
                        -InFile $filePath
}

function Write-ZipToWebApp($resourceGroupName, $webAppName, $slotName = "", $zipFile, $kuduPath){
    $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
    $kuduApiAuthorisationToken = $KuduAuth.header
    $kuduApiUrl = $KuduAuth.url + "/api/zip/site/wwwroot/$kuduPath"

    Write-Host " Writing Zip to WebApp. Destination: '$kuduApiUrl'." -ForegroundColor DarkGray

    Invoke-RestMethod -Uri $kuduApiUrl `
                        -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                        -Method Put `
                        -ContentType "multipart/form-data"`
                        -InFile $zipFile
}
function Remove-FileFromWebApp($resourceGroupName, $webAppName, $slotName = "", $kuduPath){
    $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
    $kuduApiAuthorisationToken = $KuduAuth.header
    $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"

    Write-Host " Writing File to WebApp. Destination: '$kuduApiUrl'." -ForegroundColor DarkGray

    Invoke-RestMethod -Uri $kuduApiUrl `
                        -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                        -Method Delete `
                        -ContentType "multipart/form-data"
}

This is a collection of Kudu utilities for managing files on an App Service, which is important for what we’re going to do next.
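
For example (all names are placeholders), you can dot-source the utilities and push a local file into the webroot of an App Service, or read a file back out:

. "$PSScriptRoot\Get-KuduUtility.ps1"

# Upload a local file into the webroot
Write-FileFromPathToWebApp -resourceGroupName "my-rg" -webAppName "my-cm-app" -slotName "" `
    -filePath "$PSScriptRoot\PublishManager.asmx" -kuduPath "PublishManager/PublishManager.asmx"

# Download a file as a string
$connectionStrings = Get-FileFromWebApp -resourceGroupName "my-rg" -webAppName "my-cm-app" -slotName "" `
    -kuduPath "App_Config/ConnectionStrings.config"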

Step 3 – Manage the publish

param(
	[Parameter(Mandatory=$true)]
    [string]$ResourceGroupName,
    [Parameter(Mandatory=$true)]
    [string]$AppServiceName
)
. "$PSScriptRoot\Get-KuduUtility.ps1"


$folderKey = -join ((97..122) | Get-Random -Count 10 | ForEach-Object {[char]$_})
$accessKey = -join ((97..122) | Get-Random -Count 10 | ForEach-Object {[char]$_})
try{
    (Get-Content "$PSScriptRoot\PublishManager.asmx").Replace("[TOKEN]", $accessKey) | Set-Content "$PSScriptRoot\tmp.asmx"
	Write-FileFromPathToWebApp -resourceGroupName $ResourceGroupName -webAppName $AppServiceName -slotName "" -filePath "$PSScriptRoot\tmp.asmx" -kuduPath "PublishManager/$folderKey/PublishManager.asmx"
	Remove-Item "$PSScriptRoot\tmp.asmx" -Force
	$site = Get-AzureRmWebApp -ResourceGroupName $ResourceGroupName -Name $AppServiceName
	$webURI= "https://$($site.HostNames | Select-Object -Last 1)/PublishManager/$folderKey/PublishManager.asmx?WSDL"
    try{
        $null = Invoke-WebRequest -Uri $webURI -UseBasicParsing
    }catch{
        $null = Invoke-WebRequest -Uri $webURI -UseBasicParsing 
    }
	$proxy = New-WebServiceProxy -uri $webURI
    $proxy.Timeout = 1800000
    $ready = $proxy.PublishAll($accessKey)

	if (-not $ready){
		throw "Unable to publish, check server logs for details."
	}
    Write-Host "Starting publish process and scanning for progress."
	for ($i = 0; $i -lt 180; $i++) {
		$done = $true
		$proxy.PublishStatus() | ForEach-Object {
			$done = $false
			write-host $_
		}
		write-host "***********  $($i * 20) Seconds **********"
		if ($done){
            Write-Host "Publish Completed."
			break
		}
		Start-Sleep -Seconds 20
		if ($i -eq 179){
			write-host "Sitecore Publish Timeout."
		}
	}
}finally{
	Write-Host "Removing Sitecore Publish service"
	Remove-FileFromWebApp -resourceGroupName $ResourceGroupName -webAppName $AppServiceName -slotName "" -kuduPath "PublishManager/$folderKey/PublishManager.asmx"
	Remove-FileFromWebApp -resourceGroupName $ResourceGroupName -webAppName $AppServiceName -slotName "" -kuduPath "PublishManager/$folderKey"
}

There are a few important components to this script:

  1. The script generates two random strings: the first obfuscates the service path, and the second is a security token that must be passed into the service. The token is written into the asmx file in place of [TOKEN] prior to uploading it to Azure.
  2. It uploads the modified asmx file to be dynamically compiled and utilized.
  3. It calls the service method that initiates the publish.
  4. It calls the status method every 20 seconds and outputs the current progress.
  5. It removes the service when finished, for security purposes.

Sitecore Sidekick 1.5 – The Need for Speed

A huge thank you to Michael West for sparking many of these ideas.

The release candidate for Sidekick 1.5 is now available, bringing with it many improvements.

Importantly this includes a fix that lets Sitecore Sidekick function properly with the latest version of Rainbow (used by the latest version of Unicorn).

We can make it go FASTER


Introducing Data Blaster integration.

Data Blaster is a tool for Sitecore that handles bulk item creation in a different way. In a nutshell, it stages all the items into a temp table and merges them in at the same time, dramatically reducing item creation time.

The results are incredible.  Below is a test of 4095 item creates.

[Chart: item creation times for the 4,095-item test]

However, Data Blaster is primarily for bulk inserts and becomes less effective for updates. Below is the same test of 4,095 items, but with updates rather than inserts. As you can see, the benefit of Data Blaster drops off and Content Migrator shines. Also noteworthy is that Sitecore packages seem to be much worse at updates. This is why Content Migrator only uses Data Blaster when items need to be created, not updated.

[Chart: item update times for the 4,095-item test]

With the introduction of Data Blaster there is a new advanced option, in case for some reason you don’t want to use it.

[Screenshot: the new advanced options]

Other speed improvements

Rev id for initial comparison

Content Migrator now uses the revision id for comparisons. Previously, content would be brought down from the remote server, at which point the system decided whether to install it or skip it. Now the revision id is sent along with the request; if the revision ids match, the items are considered equivalent (the vast majority of the time, hence the ignore rev id option in the image above). This especially makes media syncs faster: media files that should be skipped aren’t even brought over the network.

Larger content packages

Content Migrator now batches item requests together. Parents and children are combined (except media items, which still go one at a time). The result is a dramatic decrease in requests made to the server and a corresponding performance increase.

Notable bug fixes

Diff Generation

There was a pesky bug in the generation of the diff that would result in some unusual occurrences, most likely leading to false positives for changes. This has been fixed, along with smarter diff generation that results in less processing time. It still retains its immediately available diff functionality.

Easy Sitecore 9 Azure PaaS no-downtime deployments

Disclaimer: It’s not really easy, just easier than alternatives.

This is also known as blue/green deployment.

Keeping your site up during deployments is a very common ask for any website. Sitecore comes with its own set of challenges, but with a few simple tips and tricks in Azure you can get a very robust solution for minimal effort.

Attributes of this approach

  • There is downtime for authors in the CM environment.
  • There is no content authoring freeze. (However, while a deployment is going on there is a publishing freeze, mitigated by an optional search index swap covered later.)
  • Azure assets are created on demand, so there is no offline environment hanging out doing nothing but costing money.
  • Orchestrated by PowerShell

The Issues

    Primary Issue

    A deploy is a two-step process. You need to publish new templates, renderings, and other developer-owned Sitecore items to the content delivery database, and you need to deploy the code that knows how to work with those new templates. No matter how hard you try, you can’t do both of these things at exactly the same time. This leaves the possibility of end users seeing server errors.

    Secondary Issues

    There are two secondary issues, both optional to address, which will be discussed later: the search index and XConnect. These are secondary because they lead to some potentially annoying results, but not likely a server error.

    Solving these problems

    I’m going to focus on solving the primary problem for simplicity. Note that the diagram below has steps for search index replication; in this blog I’ll focus on blue/green without search index handling.

    [Diagram: Sitecore 9 blue/green deployment model]

    To accomplish many of these tasks we’ll be heavily utilizing Kudu, which is basically a REST API suite for Azure App Services.

    Process outline

    How will this process affect specific groups?

    I’m an author

    1. There will be brief downtime in the content management environment
    2. Content management will come back up with the new code and templates
    3. Content editing is allowed
    4. Publishing will send changes to the staging slot content delivery URL (NOTE: if you’re not duplicating a search index, this could impact your end users if your components are sourced by the search index)
    5. Once blue/green completes the changes that you published will be end user facing
    6. Business continues as usual

    I’m a dev-ops professional

    1. Authors have been warned about a brief downtime in CM
    2. Blue/green process is kicked off
    3. CM and CD deployments are done
    4. Alert testers and wait for testing to complete
    5. On a successful test, swap the staging slots to production; on failure, wait for a hotfix and deploy to the environments again; on catastrophic failure, initiate a rollback
    6. On success, run the finalize step to clean up the unused offline environments

    I’m a tester

    1. Get alerted by dev-ops team that the deployment to the staging slot is complete
    2. Break it
    3. Alert development team an emergency hotfix is needed
    4. Wait for dev-ops team to report the hotfix has been deployed
    5. Test again, no breaking this time
    6. Report to dev-ops team all is well

    The PowerShell

    NOTE: These PowerShell functions all require the PowerShell context to have an authenticated Azure connection to perform their tasks. I recommend using a service principal for this.

    Utility functions

    This set of utility functions is mostly for file I/O with the Azure App Services. It is saved in a file called “Get-KuduUtility.ps1”.

    function Get-AzureRmWebAppPublishingCredentials($resourceGroupName, $webAppName, $slotName = $null){
    	if ([string]::IsNullOrWhiteSpace($slotName) -or $slotName.ToLower() -eq "production"){
    		$resourceType = "Microsoft.Web/sites/config"
    		$resourceName = "$webAppName/publishingcredentials"
    	}
    	else{
    		$resourceType = "Microsoft.Web/sites/slots/config"
    		$resourceName = "$webAppName/$slotName/publishingcredentials"
    	}
    	$publishingCredentials = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroupName -ResourceType $resourceType -ResourceName $resourceName -Action list -ApiVersion 2015-08-01 -Force
        	return $publishingCredentials
    }
    
    function Get-KuduApiAuthorisationHeaderValue($resourceGroupName, $webAppName, $slotName = $null){
        $publishingCredentials = Get-AzureRmWebAppPublishingCredentials $resourceGroupName $webAppName $slotName
        $ret = @{}
        $ret.header = ("Basic {0}" -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $publishingCredentials.Properties.PublishingUserName, $publishingCredentials.Properties.PublishingPassword))))
        $ret.url = $publishingCredentials.Properties.scmUri
        return $ret
    }
    
    function Get-FileFromWebApp($resourceGroupName, $webAppName, $slotName = "", $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"
    
        Write-Host " Downloading File from WebApp. Source: '$kuduApiUrl'." -ForegroundColor DarkGray
        $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xml"
        $null = Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method GET `
                            -ContentType "multipart/form-data" `
                            -OutFile $tmpPath
        $ret = Get-Content $tmpPath | Out-String
        Remove-Item $tmpPath -Force
        return $ret
    }
    
    function Write-FileToWebApp($resourceGroupName, $webAppName, $slotName = "", $fileContent, $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"
    
        Write-Host " Writing File to WebApp. Destination: '$kuduApiUrl'." -ForegroundColor DarkGray
    
        Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method Put `
                            -ContentType "multipart/form-data"`
                            -Body $fileContent
    }
    
    function Write-ZipToWebApp($resourceGroupName, $webAppName, $slotName = "", $zipFile, $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/zip/site/wwwroot/$kuduPath"
    
        Write-Host " Writing Zip to WebApp. Destination: '$kuduApiUrl'." -ForegroundColor DarkGray
    
        Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method Put `
                            -ContentType "multipart/form-data"`
                            -InFile $zipFile
    }
    
    function Copy-AppServiceToStaging($resourceGroupName, $webAppName){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $KuduStagingAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName "Staging"
        $kuduStagingApiAuthorisationToken = $KuduStagingAuth.header
    #NOTE: you must copy all paths of the webroot that aren't involved in your deployment
    #For example if you also wanted to copy the Sitecore folder you could change this to:
    # @("App_Config", "App_Data", "Sitecore")
        @("App_Config", "App_Data") | ForEach-Object {
            $kuduConfigApiUrl = $KuduAuth.url + "/api/zip/site/wwwroot/$_/"
            $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).zip"
            try{
                $WebClient = New-Object System.Net.WebClient
                $WebClient.Headers.Add("Authorization", $kuduApiAuthorisationToken)
                $WebClient.Headers.Add("ContentType", "multipart/form-data")
    
                $WebClient.DownloadFile($kuduConfigApiUrl, $tmpPath)
    
                $kuduConfigApiUrl = $KuduStagingAuth.url + "/api/zip/site/wwwroot/$_/"
                $kuduApiFolderUrl = $KuduStagingAuth.url + "/api/vfs/site/wwwroot/$_/"
                Invoke-RestMethod -Uri $kuduApiFolderUrl `
                    -Headers @{"Authorization"=$kuduStagingApiAuthorisationToken;"If-Match"="*"} `
                    -Method PUT `
                    -ContentType "multipart/form-data"
                #need a sleep due to a race condition if this folder is utilized too quickly after creating
                Start-Sleep -Seconds 2
                Invoke-RestMethod -Uri $kuduConfigApiUrl `
                    -Headers @{"Authorization"=$kuduStagingApiAuthorisationToken;"If-Match"="*"} `
                    -Method PUT `
                    -ContentType "multipart/form-data" `
                    -InFile $tmpPath
            }finally{
                if (Test-Path $tmpPath){
                    Remove-Item $tmpPath
                }
            }
        }
    }
    function Get-DatabaseNames{
    	param(
    		[Parameter(Mandatory = $true)]
    		[string]$ResourceGroupName,
    		[Parameter(Mandatory = $true)]
    		[string]$AppServiceName,
    		[Parameter(Mandatory = $true)]
    		[string]$DatabaseNameRoot,
    		[string]$SlotName = ""
    
    	)
    	$contents = (Get-FileFromWebApp -resourceGroupName $ResourceGroupName -webAppName $AppServiceName -slotName $SlotName -kuduPath "App_Config/ConnectionStrings.config") | Out-String
    	if ($contents.Contains("$DatabaseNameRoot-2")){
    		$ret = @{
    			InactiveDatabase = $DatabaseNameRoot
    			ActiveDatabase = $DatabaseNameRoot + '-2'
    		}
    	}elseif ($contents.Contains("$DatabaseNameRoot")){
    		$ret = @{
    			InactiveDatabase = $DatabaseNameRoot + '-2'
    			ActiveDatabase = $DatabaseNameRoot
    		}
    	}else{
            throw "unable to find $DatabaseNameRoot OR $DatabaseNameRoot-2"
        }
    	return $ret
    }
    
    

    Step 1

    Copy the Production slot to a Staging slot

    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName
    )
    . "$PSScriptRoot\Get-KuduUtility.ps1"
    $existingSlot = Get-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Staging" -ErrorAction SilentlyContinue
    if ($null -ne $existingSlot){
        write-host "Removing existing Staging slot"
        Remove-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Staging" -Force
        Start-Sleep -s 10
    }
    $slot = Get-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Production"
    New-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Staging" -AppServicePlan $slot.ServerFarmId
    
    Copy-AppServiceToStaging -ResourceGroupName $ResourceGroupName -WebAppName $AppServiceName
    

    Step 2

    Make copies of all of your content delivery databases and wire your CM environment to the new databases

    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName,
        [Parameter(Mandatory=$true)]
        [string]$CDAppServiceName,
        [string]$SlotName = "",
        [Parameter(Mandatory=$true)]
        [string]$DatabaseNameRoot,
        [Parameter(Mandatory=$true)]
        [string]$SqlServerName
    )
    
    . "$PSScriptRoot\Get-KuduUtility.ps1"
    $contents = (Get-FileFromWebApp -resourceGroupName $ResourceGroupName -webAppName $AppServiceName -slotName $SlotName -kuduPath "App_Config/ConnectionStrings.config") | Out-String
    $db = Get-DatabaseNames -ResourceGroupName $ResourceGroupName -AppServiceName $CDAppServiceName -DatabaseNameRoot $DatabaseNameRoot -SlotName $SlotName
    $contents = $contents.Replace("Catalog=$($db.ActiveDatabase);", "Catalog=$($db.InactiveDatabase);")
    
    
    $tst = Get-AzureRmSqlDatabase -DatabaseName $db.InactiveDatabase -ServerName $SqlServerName -ResourceGroupName $ResourceGroupName -ErrorAction SilentlyContinue
    if ($null -ne $tst){
        throw "Unable to copy database when the CM environment is referencing $($db.ActiveDatabase) and $($db.InactiveDatabase) already exist.  Make sure that both the tenant CD AND the CM environment are using the same database before this operation and delete the unused database and try again."
    }
    $tst = Get-AzureRmSqlDatabase -DatabaseName $db.ActiveDatabase -ServerName $SqlServerName -ResourceGroupName $ResourceGroupName -ErrorAction SilentlyContinue
    write-host "Copying database $($db.ActiveDatabase) to $($db.InactiveDatabase)"
    $parameters = @{
        ResourceGroupName = $ResourceGroupName
        DatabaseName = $db.ActiveDatabase
        ServerName = $SqlServerName
        CopyResourceGroupName = $ResourceGroupName
        CopyServerName = $SqlServerName
        CopyDatabaseName = $db.InactiveDatabase
    }
    if (-not [string]::IsNullOrWhitespace($tst.ElasticPoolName)){
        $parameters["ElasticPoolName"] = $tst.ElasticPoolName
    }
    New-AzureRmSqlDatabaseCopy @parameters
    Write-FileToWebApp -resourceGroupName $ResourceGroupName -webAppName $AppServiceName -fileContent $contents -slotName $SlotName -kuduPath "App_Config/ConnectionStrings.config"
    
    

    NOTE: This accesses the CD app service to determine the offline database. It toggles between {name} and {name}-2
    NOTE: DatabaseNameRoot refers to the database name without the -2 on it.
    NOTE: This assumes the assets all share a resource group, if that’s not true, add some more parameters

    Step 3

    Copy all content delivery app services to a Staging slot.

    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName
    )
    . "$PSScriptRoot\Get-KuduUtility.ps1"
    $existingSlot = Get-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Staging" -ErrorAction SilentlyContinue
    if ($null -ne $existingSlot){
        write-host "Removing existing Staging slot"
        Remove-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Staging" -Force
        Start-Sleep -s 10
    }
    $slot = Get-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Production"
    New-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot "Staging" -AppServicePlan $slot.ServerFarmId
    
    Copy-AppServiceToStaging -ResourceGroupName $ResourceGroupName -WebAppName $AppServiceName
    

    NOTE: This assumes that your deployment process deploys all of Sitecore except things in App_Data and environment specific App_Config config files like ConnectionStrings.config

    Step 4

    Execute a deploy as you normally would while targeting the production slot for CM and the staging slot for CD.
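
    One scripted way to target a slot (a sketch, not necessarily the author’s pipeline) is to push a zip of your build output through Kudu’s zipdeploy endpoint, reusing Get-KuduApiAuthorisationHeaderValue from Get-KuduUtility.ps1. The app names and zip path are placeholders.

    . "$PSScriptRoot\Get-KuduUtility.ps1"

    # Deploy the packaged site to the Staging slot of a CD app service
    $kuduAuth = Get-KuduApiAuthorisationHeaderValue "my-rg" "my-cd-app" "Staging"
    Invoke-RestMethod -Uri ($kuduAuth.url + "/api/zipdeploy") `
                      -Headers @{ "Authorization" = $kuduAuth.header } `
                      -Method Post `
                      -ContentType "application/zip" `
                      -InFile "$PSScriptRoot\artifacts\website.zip"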

    Step 5

    Test your changes on CM and the CD Staging slots

    NOTE: these scripts assume that your deployment process handles all non-dynamically generated assets. So things like the Sitecore folder would be included in your deployment process whereas things like your license.xml or ConnectionStrings.config would not be. These things would be handled by the app service copy in Step 3
    NOTE: if you need to hotfix, you can repeat step 4
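
    A quick scripted smoke test can back up the manual testing. This is a sketch; the hostnames are assumptions (slots default to <appname>-<slot>.azurewebsites.net).

    @("https://my-cm-app.azurewebsites.net", "https://my-cd-app-staging.azurewebsites.net") | ForEach-Object {
        $response = Invoke-WebRequest -Uri $_ -UseBasicParsing
        Write-Host "$_ returned $($response.StatusCode)"
    }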

    Step 6

    Swap production slot and staging slots in your CD app services

    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName,
        [string]$SlotName = "Staging"
    )
    
    Switch-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -DestinationSlotName "Production" -SourceSlotName $SlotName
    

    NOTE: At this time your change is live and there has been no downtime

    Step 7

    Clean up.
    It’s important to note that you can delay this step to provide a rapid rollback if needed. Before completing this step, swapping the slots again will give us a rollback in seconds.
    Remove the old content delivery databases.

    
    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName,
        [Parameter(Mandatory=$true)]
        [string]$SqlServerName,
        [Parameter(Mandatory=$true)]
        [string]$DatabaseNameRoot,
        [switch]$DeleteActive = $false,
        [string]$SlotName = ""
    )
    . "$PSScriptRoot\Get-KuduUtility.ps1"
    
    $db = Get-DatabaseNames -ResourceGroupName $ResourceGroupName -AppServiceName $AppServiceName -DatabaseNameRoot $DatabaseNameRoot -SlotName $SlotName
    
    if ($DeleteActive){
        Remove-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $SqlServerName -DatabaseName $db.ActiveDatabase -Force
    }else{
        Remove-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $SqlServerName -DatabaseName $db.InactiveDatabase -Force
    }
    

    Remove the staging slots for each environment (the CD app services and the CM app service)

    param(
        [string]$ResourceGroupName,
        [string]$AppServiceName,
        [string]$SlotName
    )
    
    Remove-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot $SlotName -Force
    

    That’s it, you’re done and there was no downtime, feels good doesn’t it?

    What if we need to roll back?

    If you’ve gotten to step 7 and find that the code is flawed and can’t be hotfixed, or you need an emergency content change, this is how you roll back.

    Step 1

    Swap the CM staging slot to production (this will contain the database connections to the old database)

    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName,
        [string]$SlotName = "Staging"
    )
    
    Switch-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -DestinationSlotName "Production" -SourceSlotName $SlotName
    
    

    Step 2

    Then delete the NEW databases you created in step 2

    param(
        [Parameter(Mandatory=$true)]
        [string]$ResourceGroupName,
        [Parameter(Mandatory=$true)]
        [string]$AppServiceName,
        [Parameter(Mandatory=$true)]
        [string]$SqlServerName,
        [Parameter(Mandatory=$true)]
        [string]$DatabaseNameRoot,
        [switch]$DeleteActive = $false,
        [string]$SlotName = ""
    )
    . "$PSScriptRoot\Get-KuduUtility.ps1"
    
    $db = Get-DatabaseNames -ResourceGroupName $ResourceGroupName -AppServiceName $AppServiceName -DatabaseNameRoot $DatabaseNameRoot -SlotName $SlotName
    
    if ($DeleteActive){
        Remove-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $SqlServerName -DatabaseName $db.ActiveDatabase -Force
    }else{
        Remove-AzureRmSqlDatabase -ResourceGroupName $ResourceGroupName -ServerName $SqlServerName -DatabaseName $db.InactiveDatabase -Force
    }
    

    Step 3

    Then remove the staging slots.

    
    param(
        [string]$ResourceGroupName,
        [string]$AppServiceName,
        [string]$SlotName
    )
    
    Remove-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $AppServiceName -Slot $SlotName -Force
    

    Step 4

    Finally, run either a TDS sync or a Unicorn sync to get your developer-owned assets back to their pre-deployment state.

    Solving secondary issues

    Search index

    For the search index, you’ll want to treat it the same way we treated the databases. At the start of the blue/green process, create a new index as a clone of the production-facing one and rewire the CM environment and the CD staging slots to use it.

    You would do this if you use the search index to source content for end users, primarily in a site search. The issue is that if you add pages to your offline web database, the search index will pick them up and possibly return links to end users that 404, or, if the code tries to fetch the Sitecore item, you may end up with null reference exceptions.

    This is optional because with proper governance you can mitigate the risk; things like content freezes or publishing freezes work fine.

    With this safely implemented, you could potentially lift any content author freezes, as long as the authors know that during a blue/green deployment their changes are published to the offline environment until the swap is completed.
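
    The rewiring part can reuse the same Kudu helpers and the same {name}/{name}-2 toggle used for databases. This is a sketch only; the config path and index names are assumptions and should point at whichever include file defines your index names.

    . "$PSScriptRoot\Get-KuduUtility.ps1"

    $path = "App_Config/Include/zzz/SearchIndexes.config"   # placeholder path
    $contents = Get-FileFromWebApp -resourceGroupName "my-rg" -webAppName "my-cm-app" -slotName "" -kuduPath $path
    # Toggle the index name to the offline copy (index names are placeholders)
    $contents = $contents.Replace("sitecore_web_index", "sitecore_web_index-2")
    Write-FileToWebApp -resourceGroupName "my-rg" -webAppName "my-cm-app" -slotName "" -fileContent $contents -kuduPath $path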

    XConnect

    You would want to have two parallel XConnect environments: one that’s always customer facing and one that never is. During a blue/green deployment you’d point your CM environment and the CD staging slots at the always-offline XConnect environment. Then, immediately before the swap, rewire the CM and CD staging slots to point to the online-only XConnect environment.

    You would do this if you wanted to be certain that there was no testing data in XConnect.

    This is optional because most people wouldn’t mind a bit of testing data in their analytics data.

    SXA Tenant Specific Field Validation

    SXA includes a large number of extremely helpful modules, and it would be wise to utilize as many as you can. However, a problem arises if you have multiple tenants or sites that want to apply different field validation rules to fields that come from an SXA base template.

    An extra huge thanks to @Sitecorey for a huge amount of help in working through this problem.

    By default validation rules are applied on the template field item under the template.  This means that every other template that inherits yours will automatically get the validation rules applied to it.

    Installation instructions and full source

    Download the Sitecore package

    My solution is to optionally pull the validation rule definitions out of the template field and into a global library of items that map a template to a template field. The field can be defined in the template itself, in any of its base templates, in the base templates’ base templates, and so on.

    [Diagram: global field validator mapping]

    An Example

    I have two tenants, both using the SEO Metadata module to get keywords and page description fields on their page template. Using this technique I was able to have one tenant define a 125 character limit while the other tenant has no validation at all. This was done by specifying the base template for pages as the template target and the SXA meta description field as the field. Even though the template doesn’t directly define this field, we’re still able to apply validation to it.
    [Screenshot: the field validation item]

    How it’s done

    The magic is done by overriding the default validation manager and adding functionality on top of it. Basically, we augment the default behavior of the validator by looking into the library of global validators defined in the settings section of our SXA site. To do that we have to follow a few steps:

    1. Get the applicable site’s global field validator definitions root.
    2. Grab all the validator definitions from under that root.
    3. Build validators for all the definitions.
    4. Return those validators.

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using Sitecore.Abstractions;
    using Sitecore.CodeDom.Scripts;
    using Sitecore.Data.Fields;
    using Sitecore.Data.Items;
    using Sitecore.Data.Validators;
    using Sitecore.XA.Foundation.Multisite;
    
    namespace JeffDarchuk.Foundation.ContentValidation
    {
    	public class GlobalFieldValidatorManager : DefaultValidatorManager
    	{
    		private readonly IMultisiteContext _multisiteContext;
    		private readonly BaseTemplateManager _templateManager;
    
    		public GlobalFieldValidatorManager(BaseItemScripts itemScripts, IMultisiteContext multisiteContext, BaseTemplateManager templateManager) : base(itemScripts)
    		{
    			_multisiteContext = multisiteContext ?? throw new ArgumentNullException(nameof(multisiteContext));
    			_templateManager = templateManager ?? throw new ArgumentNullException(nameof(templateManager));
    		}
    
    		public override ValidatorCollection BuildValidators(ValidatorsMode mode, Item item)
    		{
    			var validators = base.BuildValidators(mode, item);
    			var globalFieldRulesFolder = GetGlobalFieldRulesFolder(item);
    			if (globalFieldRulesFolder == null) return validators;
    			foreach (var validator in GetAdditionalValidators(item, globalFieldRulesFolder, mode))
    			{
    				validators.Add(validator);
    			}
    			return validators;
    		}
    
    		private Item GetGlobalFieldRulesFolder(Item item)
    		{
    			return _multisiteContext.GetSettingsItem(item)?.Children.FirstOrDefault(x =>
    				x.TemplateID.ToString() == Templates.GlobalFieldRuleFolder.Id);
    		}
    
    		private IEnumerable<BaseValidator> GetAdditionalValidators(Item item, Item globalFieldRulesFolder, ValidatorsMode mode)
    		{
    			var baseTemplates = new HashSet<string>(_templateManager.GetTemplate(item).GetBaseTemplates().Select(x => x.ID.ToString()));
    			foreach (var globalFieldRule in GetGlobalFieldRules(globalFieldRulesFolder))
    			{
    				var template = globalFieldRule[Templates.GlobalFieldRule.Fields.Template];
    				if (!FieldRuleAppliesToItem(item, globalFieldRule, template, baseTemplates)) continue;
    				MultilistField validators = null;
    				switch (mode)
    				{
    					case ValidatorsMode.Gutter:
    						validators = globalFieldRule.Fields[Templates.FieldTypeValidationRules.Fields.QuickValidationBar];
    						break;
    					case ValidatorsMode.ValidateButton:
    						validators = globalFieldRule.Fields[Templates.FieldTypeValidationRules.Fields.ValidateButton];
    						break;
    					case ValidatorsMode.ValidatorBar:
    						validators = globalFieldRule.Fields[Templates.FieldTypeValidationRules.Fields.ValidatorBar];
    						break;
    					case ValidatorsMode.Workflow:
    						validators = globalFieldRule.Fields[Templates.FieldTypeValidationRules.Fields.Workflow];
    						break;
    				}
    				foreach (var validator in validators?.GetItems() ?? Enumerable.Empty<Item>())
    				{
    					var baseValidator = BuildValidator(validator, item);
    					baseValidator.FieldID = item.Fields[globalFieldRule[Templates.GlobalFieldRule.Fields.Field]].ID;
    					yield return baseValidator;
    				}
    			}
    		}
    
    		private IEnumerable<Item> GetGlobalFieldRules(Item globalFieldRulesFolder)
    		{
    			return globalFieldRulesFolder.Axes.GetDescendants().Where(x => x.TemplateID.ToString() == Templates.GlobalFieldRule.Id);
    		}
    
    		private bool FieldRuleAppliesToItem(Item item, Item globalFieldRule, string template, HashSet baseTemplates)
    		{
    			var useInheritedTemplates = ((CheckboxField)globalFieldRule.Fields[Templates.GlobalFieldRule.Fields.ApplyToInheritedTemplates]).Checked;
    			return item.TemplateID.ToString() == template || useInheritedTemplates && baseTemplates.Contains(template);
    		}
    	}
    }
    

    The end result is that we get to define validation rules on whatever template we wish, even if that template doesn’t directly define the field but only inherits it.

    Ensure all projects use the same NuGet versions

    If you’re using NuGet packages in a Sitecore Helix solution, it’s very easy to inadvertently use different versions of a NuGet package between separate Helix layers. This can cause very strange and hard-to-diagnose issues. However, with a little validation we can avoid this problem entirely before it hits your webroot. By applying a small PowerShell validation step to your publish process, we get clear and concise output describing the problem.

    function NuGetPackageValidation {
    	param(
    		[string]$solutionPath
    	)
    	write-host "Beginning Nuget validation."
    	$tracker = @{}
    	Get-ChildItem (split-path $solutionPath) -recurse packages.config | ForEach-Object {
    	  $fullFileName = $_.FullName
    	  $csProjName = Split-Path (Resolve-Path "$(Split-Path $fullFileName)\*.csproj").Path -Leaf
    	  [xml]$curConfigFile = Get-Content $fullFilename
    	  $curConfigFile.packages.package | ForEach-Object {
    		  if ($null -eq $tracker[$_.id]){
    			$tracker[$_.id] = @{
    				id = $_.id
    				versions = @{}
    				}
    			$tracker[$_.id].versions[$_.version] = @{
    				version = $_.version
    				project = @($csProjName)
    				}
    			}
    			elseif ($null -eq $tracker[$_.id].versions[$_.version]){
    				$tracker[$_.id].versions[$_.version] = @{
    					version = $_.version
    					project = @($csProjName)
    					}
    			}else{
    				$tracker[$_.id].versions[$_.version].project += $csProjName
    			}
    	  }
    	}
    	$ret = $true
    	$tracker.Keys | ForEach-Object {
    		if ($tracker[$_].versions.Count -gt 1){
    			if ($ret){
    				Write-Host "Problems found with Nuget packages, ensure that the same Nuget package versions are used across projects."
    			}
    			$ret = $false
    			Write-Host @"
    ----------------------------------
      $_
    ----------------------------------
    "@
    			$versions = $tracker[$_].versions
    			$versions.Keys | ForEach-Object{
    				Write-Host $versions[$_].version
    				$versions[$_].project | ForEach-Object {
    					Write-Host "      $_"
    				}
    				Write-Host ""
    			}
    		}
    	}
    	return $ret
      }
    

    You can then take the result of this function and halt the build, with detailed output showing which projects are involved in the mismatch. You can expect to see results like this:

    [Example output from the NuGet validation function]

    Using this data, you can easily track down and fix anomalies before they become hard-to-diagnose problems.
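
    Wiring it into a build or publish step might look like this (the solution path is a placeholder):

    if (-not (NuGetPackageValidation -solutionPath "C:\src\MySite\MySite.sln")){
        throw "NuGet package version mismatch detected, see the output above."
    }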

    Transform configs on Azure

    It can be a powerful tool to give your release process the ability to manipulate configuration files on the server that aren’t directly controlled by your source code.

    The primary use cases I’ve used this for are re-wiring connection strings on the fly at deployment time, and deploying the stock Sitecore web.config as-is while managing it through remote transforms in the release process. Whatever your devops constraints are, this technique could come in handy.

    The technique is fairly simple:

  • Connect to Azure, here using a service principal
  • Generate Kudu credentials from the publishing credentials
  • Download the xml file
  • Transform the file using XDTs and, optionally, tokens
  • Upload the xml file back to the app service

    Note: This requires loading Microsoft.Web.XmlTransform.dll, so make sure that DLL is available.

    param(
        [string]$KuduPath,
        [string[]]$XDTs,
        [string]$TenantId,
        [string]$SubscriptionId,
        [string]$ResourceGroupName,
        [string]$WebAppServiceName,
        [string]$SlotName = "",
        [string]$ServicePrincipalID,
        [string]$ServicePrincipalKey,
        [hashtable]$Tokens
    )
    
    function Get-AzureRmWebAppPublishingCredentials($resourceGroupName, $webAppName, $slotName = $null){
    	if ([string]::IsNullOrWhiteSpace($slotName)){
    		$resourceType = "Microsoft.Web/sites/config"
    		$resourceName = "$webAppName/publishingcredentials"
    	}
    	else{
    		$resourceType = "Microsoft.Web/sites/slots/config"
    		$resourceName = "$webAppName/$slotName/publishingcredentials"
    	}
    	$publishingCredentials = Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroupName -ResourceType $resourceType -ResourceName $resourceName -Action list -ApiVersion 2015-08-01 -Force
        	return $publishingCredentials
    }
    
    function Get-KuduApiAuthorisationHeaderValue($resourceGroupName, $webAppName, $slotName = $null){
        $publishingCredentials = Get-AzureRmWebAppPublishingCredentials $resourceGroupName $webAppName $slotName
        $ret = @{}
        $ret.header = ("Basic {0}" -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $publishingCredentials.Properties.PublishingUserName, $publishingCredentials.Properties.PublishingPassword))))
        $ret.url = $publishingCredentials.Properties.scmUri
        return $ret
    }
    
    function Get-FileFromWebApp($resourceGroupName, $webAppName, $slotName = "", $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"
    
        Write-Host " Downloading File from WebApp. Source: '$kuduApiUrl'." -ForegroundColor DarkGray
        $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xml"
        $null = Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method GET `
                            -ContentType "multipart/form-data" `
                            -OutFile $tmpPath
        $ret = Get-Content $tmpPath | Out-String
        Remove-Item $tmpPath -Force
        return $ret
    }
    function Write-FileToWebApp($resourceGroupName, $webAppName, $slotName = "", $fileContent, $kuduPath){
        $KuduAuth = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName $slotName
        $kuduApiAuthorisationToken = $KuduAuth.header
        $kuduApiUrl = $KuduAuth.url + "/api/vfs/site/wwwroot/$kuduPath"
    
        Write-Host " Writing File to WebApp. Destination: '$kuduApiUrl'." -ForegroundColor DarkGray
    
        Invoke-RestMethod -Uri $kuduApiUrl `
                            -Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
                            -Method Put `
                            -ContentType "multipart/form-data"`
                            -Body $fileContent
    }
    function Update-XmlDocTransform($xml, $xdt, $tokens)
    {
        Add-Type -LiteralPath "$PSScriptRoot\Microsoft.Web.XmlTransform.dll"
    
        $xmldoc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument;
        $xmldoc.PreserveWhitespace = $true
        $xmldoc.LoadXml($xml);
        $useTokens = $false
        if ($tokens -ne $null -and $tokens.Count -gt 0){
            $useTokens = $true
            $sb = [System.Text.StringBuilder]::new((Get-Content -Path $xdt))
            $tmpPath = "$($env:TEMP)\$([guid]::NewGuid()).xdt"
            $tokens.Keys | ForEach-Object{
                $null = $sb.Replace($_, $tokens[$_])
            }
            Set-Content -Path $tmpPath -Value $sb.ToString()
            $xdt = $tmpPath
        }
        
        $transf = New-Object Microsoft.Web.XmlTransform.XmlTransformation($xdt);
        if ($transf.Apply($xmldoc) -eq $false)
        {
            throw "Transformation failed."
        }
        if ($useTokens){
            Remove-Item -Path $xdt -Force
        }
        return $xmldoc.OuterXml
    }
    $Credential = New-Object -TypeName PSCredential($ServicePrincipalID, (ConvertTo-SecureString -String $ServicePrincipalKey -AsPlainText -Force))
    
    
    # Connect to Azure using SP
    $connectParameters = @{
        Credential     = $Credential
        TenantId       = $TenantId
        SubscriptionId = $SubscriptionId
    }
    
    Write-Host 'Connecting to Azure.'
    
    $null = Add-AzureRmAccount @connectParameters -ServicePrincipal
    
    $contents = Get-FileFromWebApp `
        -resourceGroupName $ResourceGroupName `
        -webAppName $WebAppServiceName `
        -slotName $SlotName `
        -kuduPath $KuduPath
    $XDTs | Foreach-Object{
        $contents = Update-XmlDocTransform -xml $contents -xdt "$PSScriptRoot\XDT\$_.xdt" -tokens $Tokens
    }
    Write-FileToWebApp `
        -resourceGroupName $ResourceGroupName `
        -webAppName $WebAppServiceName `
        -slotName $SlotName `
        -kuduPath $KuduPath `
        -fileContent $contents
    

    Additionally, with some simple adaptations of this code, you can use Kudu to perform whatever kind of file manipulation you want using a similar technique.
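
    An example invocation (the script file name and all parameter values are placeholders) might look like this:

    .\Invoke-RemoteConfigTransform.ps1 `
        -KuduPath "App_Config/ConnectionStrings.config" `
        -XDTs @("ConnectionStrings.Prod") `
        -TenantId "<tenant-guid>" `
        -SubscriptionId "<subscription-guid>" `
        -ResourceGroupName "my-rg" `
        -WebAppServiceName "my-cd-app" `
        -SlotName "Staging" `
        -ServicePrincipalID "<app-id>" `
        -ServicePrincipalKey "<client-secret>" `
        -Tokens @{ "{{SQL_PASSWORD}}" = "s3cret" }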