Introduction
Say you have a VM and an Application Insights resource that you're actively using for development purposes. You also want to keep an eye on the bill, though, because running a VM 24/7 when it sits idle 66% of the time is throwing money out the window. So you decide to automate the creation and deletion of the VM and the Application Insights resource (because, for example, you have everything in one template). Our starting situation looks like this:
The main problem with this setup is that the Application Insights resource gets destroyed, which means the telemetry key will be a different key each day. Depending on the preferences of the tech team and support team, the telemetry key gets injected into the application:
- either by reading a config file,
- or by getting it at runtime
Getting the key at runtime is, in my opinion, the preferred option: on startup the application connects to Azure and retrieves the telemetry key from the Application Insights resource using its own credentials. But since most teams still use config files for configuration settings, I wanted to explore option #1. In this development setup that means someone has to run these steps every day:
- The developer first goes online (Azure portal or another means) and gets the telemetry key from App Insights.
- The dev then opens their favorite editor for config files, changes the setting, and commits it to git so that the application can get deployed by an automation tool.
Automating the dev workflow
What does the XML file look like? In this case it's a Service Fabric ApplicationParameters configuration file.
<?xml version="1.0" encoding="utf-8"?>
<Application xmlns:xsd="http://www.w3.org/2001/XMLSchema"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             Name="fabric:/MyApp"
             xmlns="http://schemas.microsoft.com/2011/01/fabric">
  <Parameters>
    <Parameter Name="ApplicationInsights.InstrumentationKey" Value="a" />
  </Parameters>
</Application>
Step 1: writing the script
Using PowerShell, the script that fetches the up-to-date instrumentation key, puts it in the config file and then commits it to git could look like this:
$subscriptionId = ""
$appInsightsName = "myAppInsights"
$appInsightsResourceGroup = "myResourceGroup"
# Login - uncomment if you're not already logged in...
# login-AzAccount
# Select-AzSubscription -SubscriptionId $subscriptionId
# Getting the instrumentation key
Write-Host "Getting the instrumentation key"
$instrumentationKey = (Get-AzApplicationInsights -ResourceGroupName $appInsightsResourceGroup -Name $appInsightsName).InstrumentationKey
# Replacing the key in the XML files
Write-Host "Listing files, setting up directory"
$filePath = (Get-Location).Path + "\cloud.xml"
$content = New-Object -TypeName XML
$content.Load($filePath)
($content.Application.Parameters.Parameter | `
Where-Object { $_.Name -eq "ApplicationInsights.InstrumentationKey" } | `
Select-Object).Value = $instrumentationKey
$content.Save($filePath)
# Commit
git add *
git commit -m "Daily update instrumentation keys"
This is what you would be doing on a daily basis, and the script doesn't even handle errors properly or log to a central system. Diving into PowerShell logging is a subject for a later post.
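To give an idea, a minimal sketch of my own (not part of the original script) of guarding the key lookup with a try/catch, so a failed Azure call doesn't silently produce an empty key:
try {
    # Fail the script if the App Insights resource can't be found
    $instrumentationKey = (Get-AzApplicationInsights -ResourceGroupName $appInsightsResourceGroup -Name $appInsightsName -ErrorAction Stop).InstrumentationKey
    if ([string]::IsNullOrEmpty($instrumentationKey)) { throw "Instrumentation key is empty" }
}
catch {
    Write-Error "Could not retrieve the instrumentation key: $_"
    exit 1
}
In any case, we have now automated updating the key on the developer machine. The cycle becomes: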
Automating - without human intervention
Using Azure DevOps release pipelines we can easily execute PowerShell scripts. If you check out the marketplace you will find tasks that replace tokens or strings in a text file (JSON/XML), but none of those work on your local machine, and that is where we started. For the purposes of this blog we'll stick with the script that transforms the file in PowerShell.
- Create a release pipeline
- Set the git repo you want to commit to as an artifact
- Select an empty template
- Add an Azure PowerShell script task
Why an Azure PowerShell task? Because, as you saw in the script, we first need a Login-AzAccount to get connectivity to our Azure environment. When you use the Azure PowerShell task in Azure DevOps, you configure it to run as a service principal through a service connection:
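Under the hood that boils down to something like the following sketch (the subscription, tenant ID, app ID and client secret here are placeholders; in a release pipeline the service connection supplies them and you never handle them yourself):
# Placeholder values - in a release pipeline the service connection provides these
$subscriptionId = "<subscription-guid>"
$tenantId = "<tenant-guid>"
$appId = "<service-principal-application-id>"
$secret = ConvertTo-SecureString "<client-secret>" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($appId, $secret)
# Log in as the service principal and select the subscription
Connect-AzAccount -ServicePrincipal -Tenant $tenantId -Credential $credential
Select-AzSubscription -SubscriptionId $subscriptionId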
- Copy the script
- Create the pipeline variables for this script:
- appInsightsName
- appInsightsResourceGroup
- targetBranch
- Change the script to:
# Setting git info
git config --global user.email "your email"
git config --global user.name "your name"
# Getting the instrumentation key
Write-Host "Getting the instrumentation key"
$instrumentationKey = (Get-AzApplicationInsights -ResourceGroupName $(appInsightsResourceGroup) -Name $(appInsightsName)).InstrumentationKey
# Locating the config file in the pipeline's working directory
Write-Host "Listing files, setting up directory"
$file = "\cloud.xml"
Write-Host $(System.DefaultWorkingDirectory)
$path = "$(System.DefaultWorkingDirectory)\_yourGitFolder\YourSubFolder\"
Write-Host $path
cd $path
$filePath = $path + $file
# Replacing the key in the XML file
$content = New-Object -TypeName XML
$content.Load($filePath)
($content.Application.Parameters.Parameter | `
Where-Object { $_.Name -eq "ApplicationInsights.InstrumentationKey" } | `
Select-Object).Value = $instrumentationKey
$content.Save($filePath)
# Uncomment if you want to show the contents of the file in the Azure Release pipeline log window
# Get-Content $filePath
# This next line is necessary because git uses STDERR in a somewhat unusual way.
# It makes sure STDERR output is written to the standard output stream (so DevOps/PowerShell doesn't interpret it as an exception/fault code).
# https://github.com/git/git/blob/b2f55717c7f9b335b7ac2e3358b0498116b94a5d/Documentation/git.txt#L712-L723
# Another solution is to wrap this in a try / catch block in powershell, as suggested in this stackoverflow answer.
# https://stackoverflow.com/questions/34820975/git-clone-redirect-stderr-to-stdout-but-keep-errors-being-written-to-stderr
$env:GIT_REDIRECT_STDERR = '2>&1'
# Commit
git add *
git commit -m "Daily update instrumentation keys [skip ci]"
# Merge
git branch tmpMerge
git checkout $(targetBranch)
git merge tmpMerge
git push
# Delete
git branch -d tmpMerge
The interesting bits
Git redirect to standard output
When you perform a git command like git checkout, chances are some messages get returned on a non-standard output. Most of the time this doesn't cause many problems, but in the case of PowerShell and Azure DevOps release pipelines it does, as you can see in the following screenshot:
What happens is that a perfectly successful git command writes its response to the non-standard output channel, which makes Azure DevOps think an error occurred. As you can see on this GitHub page and this Stack Overflow page, there are ways around the issue. The most straightforward is to use this line in the script:
$env:GIT_REDIRECT_STDERR = '2>&1'
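The script comments above also mention the alternative from the Stack Overflow answer: wrapping the git calls in a try/catch and trusting the exit code rather than STDERR. A rough sketch of what that could look like, assuming the same $(targetBranch) pipeline variable:
try {
    # The 2>&1 merges git's STDERR chatter into the regular output stream
    git checkout $(targetBranch) 2>&1 | Write-Host
}
catch {
    # Anything that still surfaces as a PowerShell error ends up here
    Write-Host $_
}
# Rely on the exit code, not on STDERR, to decide whether git actually failed
if ($LASTEXITCODE -ne 0) { throw "git checkout failed with exit code $LASTEXITCODE" }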
Git detached head
This is something that is easy to overlook, since most devs never push anything back to the repo from a CI/CD pipeline. When an artifact is checked out, the git repo is not checked out by branch name; the checkout is done on a specific commit hash.
Because it checks out a specific hash and not a branch name, the checkout leaves the git repo in a detached HEAD state, disconnected from its branch.
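If you want to see it for yourself, a quick illustrative check (my addition, not something the pipeline needs):
# Prints the literal string "HEAD" when the checkout is detached,
# and the branch name when it is not
git rev-parse --abbrev-ref HEAD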
One way to fix this problem is by using the following approach:
- commit changes
- make a new temporary branch
- checkout the branch you want to merge into
- merge the branch
# Commit
git add *
git commit -m "Daily update instrumentation keys [skip ci]"
# Merge
git branch tmpMerge
git checkout $(targetBranch)
git merge tmpMerge
git push
# Delete - if you want to or need to for subsequent steps; optional, as it's a local branch that will get deleted after the pipeline finishes
# git branch -d tmpMerge
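As an aside, and not part of the approach above, you could also skip the temporary branch and push the commit from the detached HEAD straight to the target branch:
# Push the current (detached) HEAD to the remote target branch - assumes the remote is called origin
git push origin HEAD:$(targetBranch)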
Disable the continuous integration on the daily update commit
Depending on your scenario, you might not want this daily commit to trigger the continuous build/integration process; that is why the commit messages above contain [skip ci]. On this Azure docs page you can find more info about how that works.
Final thoughts
I am aware that some of the code is a little rough around the edges. The goal is to give you an overview of the things you might run into when you first explore committing to git repos from inside an Azure DevOps release pipeline.
Hope this helps!