<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="https://blog.aabech.no/rss/xslt"?>
<rss xmlns:a10="http://www.w3.org/2005/Atom" version="2.0">
  <channel>
    <title>Lars-Erik's blog</title>
    <link>https://blog.aabech.no/</link>
    <description>Ramblings about Umbraco, .net and JavaScript development. With a sprinkle of other stuff.</description>
    <generator>Articulate, blogging built on Umbraco</generator>
    <item>
      <guid isPermaLink="false">1204</guid>
      <link>https://blog.aabech.no/archive/analyzing-w3c-logs-with-excel-and-powerquery/</link>
      <category>powerquery</category>
      <title>Analyzing W3C logs with Excel and PowerQuery</title>
      <description>&lt;p&gt;15 Years ago I wrote &lt;a href="https://sourceforge.net/projects/iislogviewer/"&gt;a .net WinForms tool called &amp;quot;IIS Log Viewer&amp;quot;&lt;/a&gt; to parse and visualize W3C log files in a nice color coded grid. Of course it topples over as soon as you analyze a file of some respectable size, so I'm sorry to say I abandoned maintenance of the program. Only today I realize I should have called it &amp;quot;W3C Log Viewer&amp;quot;, but here's to hindsight. 🤣🍻&lt;/p&gt;
&lt;p&gt;Today I use a different approach with Excel and PowerQuery instead. In this article I'll show how you can analyze a fair amount of traffic (at least a million requests) from several log files, possibly spanning your entire retention period. I'm a Microsoft fanboy, so I'll show how to do it with Azure App Services logging to Azure Storage accounts, but &lt;a href="https://support.microsoft.com/en-us/office/combine-files-in-a-folder-with-combine-binaries-power-query-94b8023c-2e66-4f6b-8c78-6a00041c90e4"&gt;you can point to filesystem folders as well&lt;/a&gt;.&lt;/p&gt;
&lt;h2&gt;Setting up W3C logging on an Azure App Service&lt;/h2&gt;
&lt;p&gt;The nicest way I know of to do HTTP logging on Azure is logging to a storage account. The logs will persist across slots when you swap, and you won't have to download them from the App Service to analyze them.&lt;/p&gt;
&lt;p&gt;You'll find the settings you want under &lt;code&gt;Monitoring\App Service Logs&lt;/code&gt;:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1065/appservice-storage-logging.png" alt="Web Server logging set to storage with retention period" /&gt;&lt;/p&gt;
&lt;p&gt;Now your production slot will happily rotate n days of logs in your storage account. They'll be stored in a hierarchy by slot name (?), year, month, date and finally hour. Totally horrible to analyze without a tool, but we've got just the one! It's also too difficult to screenshot, so you'll have to imagine it. 😇&lt;/p&gt;
&lt;h2&gt;Getting data into Excel&lt;/h2&gt;
&lt;p&gt;I believe PowerQuery really made its break into Excel in the 2016 version. We'd had some &amp;quot;Power&amp;quot; tools (pivot?) from 2013, but this version really hit the target! I can't get over the fact that I waited until 2020 to get more acquainted with it.&lt;/p&gt;
&lt;p&gt;In the old versions, we more or less only had the horrible text import / split column wizard. In newer versions of Excel we've got a super rich &amp;quot;Get &amp;amp; Transform Data&amp;quot; section in the Data ribbon, not to speak of (almost) the full power of PowerQuery behind the scenes.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1046/excel-data-bar.png" alt="The Excel data bar with new options" /&gt;&lt;br /&gt;
&lt;em&gt;The &amp;quot;new&amp;quot; Excel data ribbon&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;I particularly like the &amp;quot;From Web&amp;quot; one, since it'll handle most modern formats such as JSON and XML. (Even CSV)&lt;/p&gt;
&lt;p&gt;However, for our purposes, we'll find our source of choice by drilling into the &amp;quot;Get Data&amp;quot; dropdown:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1066/blob-storage-source.png" alt="Excel Get Data From Azure Blob Storage menu" /&gt;&lt;/p&gt;
&lt;p&gt;We go on to enter our storage account name:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1045/entering-storage-account.png" alt="Entering storage account name" /&gt;&lt;/p&gt;
&lt;p&gt;Then you'll need to &lt;a href="https://docs.microsoft.com/en-us/azure/storage/common/storage-account-keys-manage?tabs=azure-portal#view-account-access-keys"&gt;find your account key&lt;/a&gt; and enter it in order to authorize with Azure:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1044/entering-account-key.png" alt="Entering storage account key" /&gt;&lt;/p&gt;
&lt;p&gt;Mind the &lt;a href="https://docs.microsoft.com/en-us/power-query/dataprivacyfirewall"&gt;security concerns and the data firewall&lt;/a&gt; when authenticating sources and sharing connected Excel files.&lt;/p&gt;
&lt;p&gt;The next &amp;quot;wizard&amp;quot; we're faced with is a blob storage browser. We'll select the &amp;quot;logs&amp;quot; container, but leave it at that. We want to &amp;quot;load and &lt;em&gt;transform&lt;/em&gt;&amp;quot; all the blobs we see on the right hand side.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1050/navigating-to-all-logs.png" alt="Viewing the logs container in the PowerQuery navigator" /&gt;&lt;/p&gt;
&lt;p&gt;So the final step is to resist the urge to click &amp;quot;Load&amp;quot;, but rather click &amp;quot;Transform Data&amp;quot;. This will launch the most powerful data transformation tool I know to date, the aptly named &amp;quot;PowerQuery&amp;quot; GUI.&lt;/p&gt;
&lt;h2&gt;Getting the contents of all the log files&lt;/h2&gt;
&lt;p&gt;Excel will immediately pop a new (sadly modal) window with a new Query aptly called &amp;quot;logs&amp;quot;. The smiley up there is for sending feedback, but I like to think PowerQuery is just a happy place altogether.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1060/viewing-blobs-in-powerquery.png" alt="The PowerQuery GUI while viewing log blobs in PowerQuery" /&gt;&lt;/p&gt;
&lt;p&gt;As you can see there's a bunch of options up top, but we're actually after a slightly more inconspicuous one down in the &amp;quot;Content&amp;quot; column header called &amp;quot;Combine files&amp;quot;:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1042/combine-files-button.png" alt="The inconspicuous combine files button circled" /&gt;&lt;/p&gt;
&lt;p&gt;This will pop another dialog asking us about encoding and whether we want to skip files that bug out. We can also choose which file to use as our sample for further transformation:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1043/combine-files-dialog.png" alt="The combine files dialog" /&gt;&lt;/p&gt;
&lt;p&gt;We'll go ahead and use the first one, since the order of the files doesn't matter. They're all in the same format. What follows is this:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1059/viewing-blob-content.png" alt="Blob content now visible in the query results" /&gt;&lt;/p&gt;
&lt;p&gt;We're now faced with a fairly more complex set of queries and steps in our &amp;quot;logs&amp;quot; query. The inner workings of those are out of the scope of this post, but I promise there's magic to be had if you'll dive deeper. The .net peeps reading this will also appreciate the familiar form of static method calling in the &amp;quot;formula bar&amp;quot;. PowerQuery is a delightful blend of familiar syntaxes. 😇&lt;/p&gt;
&lt;p&gt;The more interesting thing however is that (very quickly) the blob file contents are shown in a new column called &amp;quot;Column1&amp;quot;. This is our next target.&lt;/p&gt;
&lt;h2&gt;Transforming the data and preparing for analysis&lt;/h2&gt;
&lt;p&gt;Before we can make anything meaningful out of the actual log data, we have to get rid of some junk and figure out which fields we've logged. The junk is the leading comment lines of each log file: the first one is pure garbage, while the &amp;quot;#Fields&amp;quot; one holds our column names. So we'll employ our first secret weapon, &amp;quot;Remove Top Rows&amp;quot; (it's on the Home ribbon):&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1052/remove-top-rows.png" alt="The remove top rows menu item" /&gt;&lt;/p&gt;
&lt;p&gt;You'll be prompted with a dialog asking how many rows to remove. We want to remove the leading hash lines except the &amp;quot;#Fields&amp;quot; one. In my case, this is just the &amp;quot;#Software&amp;quot; row, so I remove 1 row.&lt;/p&gt;
&lt;p&gt;The next step is to remove the leading &amp;quot;#Fields: &amp;quot; text of the column header specification. We'll drag out our second weapon of choice, &amp;quot;Replace Values&amp;quot;. For this step we need to select &amp;quot;Column1&amp;quot; and move over to the &amp;quot;Transform&amp;quot; ribbon. Here we'll find the &amp;quot;Replace Values&amp;quot; dropdown where we select the values option:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1055/replace-values-menu.png" alt="GUI when replacing values" /&gt;&lt;/p&gt;
&lt;p&gt;We're faced with (yet) another dialog in which we specify that we want to get rid of the &amp;quot;#Fields: &amp;quot; string: (make sure to include the space)&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1054/replace-values-dialog.png" alt="The replace values dialog" /&gt;&lt;/p&gt;
&lt;h2&gt;Splitting columns&lt;/h2&gt;
&lt;p&gt;We're here! The good old &amp;quot;split column&amp;quot; phase is up!&lt;/p&gt;
&lt;p&gt;Back at the home ribbon, we find our old friend &amp;quot;Split Column&amp;quot;. For W3C log files we're dealing with space delimited files, so we go with the &amp;quot;by delimiter&amp;quot; option:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1056/split-column-by-delimiter.png" alt="The split column menu" /&gt;&lt;/p&gt;
&lt;p&gt;Yet another dialog where we specify space as the delimiter:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1057/split-column-dialog.png" alt="The split column by delimiter dialog" /&gt;&lt;/p&gt;
&lt;p&gt;Now we're definitely getting somewhere:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1063/after-split-columns.png" alt="First view with actual split log columns" /&gt;&lt;/p&gt;
&lt;p&gt;Notice how we didn't have to specify any other options than how to split the data? No data types, no headers, nothing.&lt;/p&gt;
&lt;p&gt;We do need to get those headers as column names, though. We're already on the home ribbon and if you've been looking closely you've likely noticed both &amp;quot;Remove Columns&amp;quot; and &amp;quot;Use First Row as Headers&amp;quot;. We'll remove the now useless &amp;quot;Source.Name&amp;quot; column, and then use the first row as our headers:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1067/remove-source-column.png" alt="Removing the source column" /&gt;&lt;/p&gt;
&lt;p&gt;Then we hit the &amp;quot;Use First Row as Headers&amp;quot; button, et voilà:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1062/after-promote-headers.png" alt="Log files properly parsed" /&gt;&lt;/p&gt;
&lt;p&gt;The utter sweetness (if you look in the bottom right corner) is that PowerQuery is awesome at guessing data types. Granted, you have to nudge it a bit every now and then, but the dates, times and numeric data are already properly typed.&lt;/p&gt;
&lt;p&gt;Which leads us to the last step necessary for this to work flawlessly. PowerQuery has only sampled 1000 rows from the first file. It's not aware that several more lines starting with a hash are waiting further down the road. That's gonna mess up its date conversion for the first column. Here's our final trick, &amp;quot;Remove Errors&amp;quot;. It's an option under our old friend &amp;quot;Remove Rows&amp;quot; on the Home ribbon. Make sure to select the date column before you hit it:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1053/removing-hash-row-errors.png" alt="Removing hash row errors" /&gt;&lt;/p&gt;
&lt;p&gt;And that should be it! We're ready to hit that captivating &amp;quot;Close &amp;amp; Load&amp;quot; button up in the left corner of the home bar. Go ahead! Hit it!&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1041/close-and-load.png" alt="The close and load button" /&gt;&lt;/p&gt;
&lt;p&gt;Now, depending on whether you've got less than a million lines total in your log files, this will either succeed or fail. Despair not though, there are ways to get around it. I'll add some tips at the bottom.&lt;/p&gt;
&lt;p&gt;If you're lucky enough to escape the Excel row limit trap (unlike some unfortunate Brits recently), you'll be faced with a fancy green table after a while. The &amp;quot;Queries &amp;amp; Connections&amp;quot; sidebar will appear and show how many rows have been loaded. (Including errors, if any)&lt;br /&gt;
For me it takes a few minutes to load half a million rows. (Granted, I'm on a 1 Gbit internet connection yielding 0.5 Gbit to my box, and it's a few hundred MB to download.)&lt;br /&gt;
Here's how it looks:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1048/log-files-in-sheet.png" alt="W3C log files loaded to an Excel sheet" /&gt;&lt;/p&gt;
&lt;p&gt;So we've got half a million log rows loaded to Excel. I can't be bothered to investigate the 680 errors, but if you click the link you'll get a special query showing those and their causes. (❤)&lt;/p&gt;
&lt;p&gt;However, this is fairly hard to get value out of in its present state. We need to pivot. If you've got less than a million rows, you can go to the &amp;quot;Insert&amp;quot; ribbon and click &amp;quot;Pivot table&amp;quot;. Make sure you've selected a cell in the log table first. Just accept the default options and skip to the next section.&lt;/p&gt;
&lt;h2&gt;Pivoting more than a million rows&lt;/h2&gt;
&lt;p&gt;Incidentally, pivoting is also the trick (untested) if you've got more than a million rows. You'll have to right-click the &amp;quot;logs&amp;quot; query in the &amp;quot;Queries &amp;amp; Connections&amp;quot; sidebar on the right and select &amp;quot;Load to&amp;quot;. You can then change to a &amp;quot;Pivot Table&amp;quot; instead of a table and skip loading all the details. I believe there are also options to build cached data models, but that's outside of my skillset and the scope of this post as well. :)&lt;/p&gt;
&lt;h2&gt;Pivoting W3C logs&lt;/h2&gt;
&lt;p&gt;Whichever way you opt to pivot the log data, you'll end up with a worksheet like this:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1040/clean-pivot-view.png" alt="A fresh pivot table in Excel" /&gt;&lt;/p&gt;
&lt;p&gt;I've got two favorite setups for analyzing logs. First of all, I'd like to examine whether a spike is due to a new bot of some sort. We'll just block it if it's something we're not interested in. That can save an amazing amount of processing you possibly weren't aware you were spending.&lt;/p&gt;
&lt;p&gt;Though we imported all the log files, for this you'd likely want to look at a specific day. Dragging &amp;quot;date&amp;quot; to &amp;quot;Filters&amp;quot; lets you select a date in B2. (Full screenshot further down)&lt;/p&gt;
&lt;p&gt;Next we want to look at agents, so we drag &amp;quot;cs(User-Agent)&amp;quot; to &amp;quot;Rows&amp;quot;. (They'll all appear instantly)&lt;/p&gt;
&lt;p&gt;Finally we want to see how many hits we've got from each agent and then sort. We can actually drag any field, but let's go with &amp;quot;time&amp;quot; for counting. The final dragging step is to drag &amp;quot;time&amp;quot; into &amp;quot;Values&amp;quot;.&lt;/p&gt;
&lt;p&gt;We should be left with something like this:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1058/unsorted-pivot.png" alt="Unsorted list of web agent hits" /&gt;&lt;/p&gt;
&lt;p&gt;As you can see we still have quite a few agents to try to rid ourselves of, but the ones we see here have request rates too modest to bother with. The ones we'd be interested in would have thousands in the &amp;quot;Count of time&amp;quot; column. To find them easily we have to sort, and this is where Excel can be a bit quirky. In order to sort the pivot by &amp;quot;Count of time&amp;quot; we have to open the dropdown menu on &amp;quot;Row Labels&amp;quot; and select &amp;quot;More Sort Options...&amp;quot;:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1049/more-sort-options-menu.png" alt="The more sort options menu for Excel pivot sorting" /&gt;&lt;/p&gt;
&lt;p&gt;What follows is a rather quirky dialog, but we're after &amp;quot;Descending (Z to A) by: Count of time&amp;quot;:&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1047/excel-pivot-sort-dialog.png" alt="Excel Pivot sort dialog" /&gt;&lt;/p&gt;
&lt;p&gt;We're now left with a table showing that today's winner was (bloody) iPhones!&lt;/p&gt;
&lt;p&gt;&lt;img src="https://blog.aabech.no/media/1061/w3c-pivot-sorted-by-agent.png" alt="Aggregated W3C logs sorted by agent hit count" /&gt;&lt;/p&gt;
&lt;h2&gt;Discovering facts&lt;/h2&gt;
&lt;p&gt;If we'd had an excessive agent, likely with &amp;quot;bot&amp;quot; in its name, the next step would be to add &amp;quot;c-ip&amp;quot; as a secondary row dimension under &amp;quot;cs(User-Agent)&amp;quot;. We could then drill down into the excessive agent and likely discover IPs to block or even report.&lt;/p&gt;
&lt;p&gt;Today however, this process led me to replace &amp;quot;cs(User-Agent)&amp;quot; as my rows field with &amp;quot;cs-uri-stem&amp;quot;. Repeating the sorting trick showed me that the customer obviously had had a campaign of some sort leading to the sudden spike at three o'clock. Luckily our newly upgraded S3 plan managed to take it, only dropping to 2-second response times across 29 Umbraco sites, but I'm about to have a serious talk with this particular customer - leading to a slightly more expensive hosting fee for an individual plan. 😇&lt;/p&gt;
&lt;p&gt;Another favorite setup of mine shows which URIs (or other interesting dimensions) have errors, redirects or excessive &amp;quot;access denieds&amp;quot;.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
Rows
&lt;ul&gt;
&lt;li&gt;cs-uri-stem&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
Columns
&lt;ul&gt;
&lt;li&gt;sc-status&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
Values
&lt;ul&gt;
&lt;li&gt;time&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
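&lt;p&gt;In code terms that layout is a little cross tabulation: count requests per URI per status code. A minimal Python sketch, again with made-up records:&lt;/p&gt;

```python
from collections import defaultdict

# Hypothetical parsed log records; cs-uri-stem becomes the rows,
# sc-status the columns, and the count of hits the values
records = [
    {"cs-uri-stem": "/", "sc-status": "200"},
    {"cs-uri-stem": "/", "sc-status": "200"},
    {"cs-uri-stem": "/admin", "sc-status": "401"},
]
table = defaultdict(lambda: defaultdict(int))
for r in records:
    table[r["cs-uri-stem"]][r["sc-status"]] += 1
```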
&lt;h2&gt;Conclusion&lt;/h2&gt;
&lt;p&gt;Using custom tools for file-based logs seems to be a thing of the past. PowerQuery can handle most popular serialization formats, and within Excel's limits it's awesome combined with pivot tables. If the performance isn't good enough, the same techniques can be used with SQL Server, Analysis Services and Power BI, though I guess W3C log format parsing is unnecessary in those kinds of setups.
&lt;/p&gt;
&lt;p&gt;I find myself using PowerQuery for more and more these days, so don't be surprised if another PowerQuery post sees the light of day this year. Hope you enjoyed following along and maybe even learned something. 🤓&lt;/p&gt;
</description>
      <pubDate>Wed, 06 Jan 2021 00:34:42 Z</pubDate>
      <a10:updated>2021-01-06T00:34:42Z</a10:updated>
    </item>
    <item>
      <guid isPermaLink="false">1162</guid>
      <link>https://blog.aabech.no/archive/armlinker-100-released/</link>
      <category>automation</category>
      <category>azure</category>
      <title>ARMLinker 1.0.0 released</title>
      <description>&lt;h2&gt;ARM What?&lt;/h2&gt;
&lt;p&gt;I've been having fun with ARM Templates the last couple of months.
It's a wonderful way to keep your Azure Resource definitions in source control.
Not to mention being able to parameterize deployment to different environments,
and not least keeping your secrets neatly tucked away in a vault.&lt;/p&gt;
&lt;p&gt;However, compiling a set of resources from multiple files currently requires
you to put your templates online. I want to keep most of our customer products'
templates private, and to do that one has to &lt;a href="https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/linked-templates#linked-template"&gt;jump through hoops&lt;/a&gt; to copy the
files over to a storage account and link to the given URLs.
It kind of defeats the whole purpose for me.&lt;/p&gt;
&lt;p&gt;So I went and created a small tool to be able to link templates locally.&lt;/p&gt;
&lt;h2&gt;How to use it&lt;/h2&gt;
&lt;p&gt;There's an installable project type for Visual Studio called &lt;a href="https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/create-visual-studio-deployment-project"&gt;&amp;quot;Azure Resource Group&amp;quot;&lt;/a&gt;.
When you create one you get a few files:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Deploy-AzureResourceGroup.ps1&lt;/li&gt;
&lt;li&gt;azuredeploy.json&lt;/li&gt;
&lt;li&gt;azuredeploy.parameters.json&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;You can stuff all of the resources you require in the azuredeploy.json file, and finally deploy them using a wizard, or run the PowerShell script in a CD pipeline.&lt;/p&gt;
&lt;p&gt;By installing ARMLinker you can start running the tool to link other JSON files
into the main azuredeploy.json file.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;Install-Module ARMLinker
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Let's say we have a Logic App (what I've been doing).&lt;br /&gt;
To deploy it and its connections and other needed resources, we often want
a bunch of secret keys for different APIs and such.&lt;/p&gt;
&lt;p&gt;Here's a trimmed down sample of a Logic App that runs a SQL command:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;{
    &amp;quot;$schema&amp;quot;: &amp;quot;https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#&amp;quot;,
    &amp;quot;contentVersion&amp;quot;: &amp;quot;1.0.0.0&amp;quot;,
    &amp;quot;parameters&amp;quot;: {
        &amp;quot;Tags&amp;quot;: {
            &amp;quot;type&amp;quot;: &amp;quot;object&amp;quot;,
            &amp;quot;defaultValue&amp;quot;: {
                &amp;quot;Customer&amp;quot;: &amp;quot;My customer&amp;quot;,
                &amp;quot;Product&amp;quot;: &amp;quot;Their Logic App&amp;quot;,
                &amp;quot;Environment&amp;quot;: &amp;quot;Beta&amp;quot;
            }
        },
        &amp;quot;SQL-Server&amp;quot;: {
            &amp;quot;defaultValue&amp;quot;: &amp;quot;some.database.windows.net&amp;quot;,
            &amp;quot;type&amp;quot;: &amp;quot;string&amp;quot;
        },
        &amp;quot;SQL-User&amp;quot;: {
            &amp;quot;defaultValue&amp;quot;: &amp;quot;appuser&amp;quot;,
            &amp;quot;type&amp;quot;: &amp;quot;string&amp;quot;
        },
        &amp;quot;SQL-Password&amp;quot;: {
            &amp;quot;defaultValue&amp;quot;: &amp;quot;&amp;quot;,
            &amp;quot;type&amp;quot;: &amp;quot;securestring&amp;quot;
        },
        &amp;quot;SQL-Database-Name&amp;quot;: {
            &amp;quot;defaultValue&amp;quot;: &amp;quot;beta-database&amp;quot;,
            &amp;quot;type&amp;quot;: &amp;quot;string&amp;quot;
        }
    },
    &amp;quot;variables&amp;quot;: {
        &amp;quot;ConnectionName&amp;quot;: &amp;quot;[replace(concat(parameters('Tags').Customer, '-', parameters('Tags').Product, '-SQLConnection-', parameters('Tags').Environment), ' ', '')]&amp;quot;,
        &amp;quot;LogicAppName&amp;quot;: &amp;quot;[replace(concat(parameters('Tags').Customer, '-', parameters('Tags').Product, '-', parameters('Tags').Environment), ' ', '')]&amp;quot;
    },
    &amp;quot;resources&amp;quot;: [
        {
            &amp;quot;type&amp;quot;: &amp;quot;Microsoft.Web/connections&amp;quot;,
            &amp;quot;apiVersion&amp;quot;: &amp;quot;2016-06-01&amp;quot;,
            &amp;quot;location&amp;quot;: &amp;quot;westeurope&amp;quot;,
            &amp;quot;name&amp;quot;: &amp;quot;[variables('ConnectionName')]&amp;quot;,
            &amp;quot;properties&amp;quot;: {
                &amp;quot;api&amp;quot;: {
                    &amp;quot;id&amp;quot;: &amp;quot;[concat(subscription().id,'/providers/Microsoft.Web/locations/westeurope/managedApis/sql')]&amp;quot;
                },
                &amp;quot;displayName&amp;quot;: &amp;quot;sql_connection&amp;quot;,
                &amp;quot;parameterValues&amp;quot;: {
                    &amp;quot;server&amp;quot;: &amp;quot;[parameters('SQL-Server')]&amp;quot;,
                    &amp;quot;database&amp;quot;: &amp;quot;[parameters('SQL-Database-Name')]&amp;quot;,
                    &amp;quot;authType&amp;quot;: &amp;quot;windows&amp;quot;,
                    &amp;quot;username&amp;quot;: &amp;quot;[parameters('SQL-User')]&amp;quot;,
                    &amp;quot;password&amp;quot;: &amp;quot;[parameters('SQL-Password')]&amp;quot;
                }
            }
        }, 
        {
            &amp;quot;type&amp;quot;: &amp;quot;Microsoft.Logic/workflows&amp;quot;,
            &amp;quot;apiVersion&amp;quot;: &amp;quot;2017-07-01&amp;quot;,
            &amp;quot;name&amp;quot;: &amp;quot;[variables('LogicAppName')]&amp;quot;,
            &amp;quot;dependsOn&amp;quot;: [ &amp;quot;[resourceId('Microsoft.Web/connections', variables('ConnectionName'))]&amp;quot; ], 
            &amp;quot;location&amp;quot;: &amp;quot;westeurope&amp;quot;,
            &amp;quot;properties&amp;quot;: {
                &amp;quot;state&amp;quot;: &amp;quot;Enabled&amp;quot;,
                &amp;quot;definition&amp;quot;: {
                    &amp;quot;$schema&amp;quot;: &amp;quot;https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#&amp;quot;,
                    &amp;quot;contentVersion&amp;quot;: &amp;quot;1.0.0.0&amp;quot;,
                    &amp;quot;parameters&amp;quot;: {
                        &amp;quot;$connections&amp;quot;: {
                            &amp;quot;defaultValue&amp;quot;: {},
                            &amp;quot;type&amp;quot;: &amp;quot;Object&amp;quot;
                        },
                        &amp;quot;SQL-Server&amp;quot;: {
                            &amp;quot;defaultValue&amp;quot;: &amp;quot;&amp;quot;,
                            &amp;quot;type&amp;quot;: &amp;quot;string&amp;quot;
                        },
                        &amp;quot;SQL-Database-Name&amp;quot;: {
                            &amp;quot;defaultValue&amp;quot;: &amp;quot;&amp;quot;,
                            &amp;quot;type&amp;quot;: &amp;quot;string&amp;quot; 
                        }
                    },
                    &amp;quot;triggers&amp;quot;: {
                        &amp;quot;Recurrence&amp;quot;: {
                            &amp;quot;recurrence&amp;quot;: {
                                &amp;quot;frequency&amp;quot;: &amp;quot;Day&amp;quot;,
                                &amp;quot;interval&amp;quot;: 1
                            },
                            &amp;quot;type&amp;quot;: &amp;quot;Recurrence&amp;quot;
                        }
                    },
                    &amp;quot;actions&amp;quot;: {
                        &amp;quot;Execute_a_SQL_query_(V2)&amp;quot;: {
                            &amp;quot;runAfter&amp;quot;: {},
                            &amp;quot;type&amp;quot;: &amp;quot;ApiConnection&amp;quot;,
                            &amp;quot;inputs&amp;quot;: {
                                &amp;quot;body&amp;quot;: {
                                    &amp;quot;query&amp;quot;: &amp;quot;select 'do something really useful' as task&amp;quot;
                                },
                                &amp;quot;host&amp;quot;: {
                                    &amp;quot;connection&amp;quot;: {
                                        &amp;quot;name&amp;quot;: &amp;quot;@parameters('$connections')['sql']['connectionId']&amp;quot;
                                    }
                                },
                                &amp;quot;method&amp;quot;: &amp;quot;post&amp;quot;,
                                &amp;quot;path&amp;quot;: &amp;quot;/v2/datasets/@{encodeURIComponent(encodeURIComponent(parameters('SQL-Server')))},@{encodeURIComponent(encodeURIComponent(parameters('SQL-Database-Name')))}/query/sql&amp;quot;
                            }
                        }
                    },
                    &amp;quot;outputs&amp;quot;: {}
                },
                &amp;quot;parameters&amp;quot;: {
                    &amp;quot;$connections&amp;quot;: {
                        &amp;quot;value&amp;quot;: {
                            &amp;quot;sql&amp;quot;: {
                                &amp;quot;connectionId&amp;quot;: &amp;quot;[resourceId('Microsoft.Web/connections', variables('ConnectionName'))]&amp;quot;,
                                &amp;quot;connectionName&amp;quot;: &amp;quot;[variables('ConnectionName')]&amp;quot;,
                                &amp;quot;id&amp;quot;: &amp;quot;/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.Web/locations/westeurope/managedApis/sql&amp;quot;
                            }
                        }
                    },
                    &amp;quot;SQL-Server&amp;quot;: {
                        &amp;quot;value&amp;quot;: &amp;quot;[parameters('SQL-Server')]&amp;quot;
                    },
                    &amp;quot;SQL-Database-Name&amp;quot;: {
                        &amp;quot;value&amp;quot;: &amp;quot;[parameters('SQL-Database-Name')]&amp;quot;
                    }
                }
            }
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;The parameters here are ARM template parameters. The most interesting one is the secret password for the database server. It's secret, so it's not supposed to live in our parameter file or source control. We've also got the ID of the connection, which will be the &lt;em&gt;real&lt;/em&gt; ID in the actual deployed Logic App.&lt;/p&gt;
&lt;p&gt;There's a fancy way to go about keeping the password in a key vault on Azure, and the Visual Studio Wizard is really helpful with putting it into a vault.&lt;/p&gt;
&lt;p&gt;When we're done and ready for production, a parameter file may look like this:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;{
    &amp;quot;$schema&amp;quot;: &amp;quot;https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#&amp;quot;,
    &amp;quot;contentVersion&amp;quot;: &amp;quot;1.0.0.0&amp;quot;,
    &amp;quot;parameters&amp;quot;: {
        &amp;quot;Tags&amp;quot;: {
            &amp;quot;value&amp;quot;: {
                &amp;quot;Customer&amp;quot;: &amp;quot;My customer&amp;quot;,
                &amp;quot;Product&amp;quot;: &amp;quot;Their Logic App&amp;quot;,
                &amp;quot;Environment&amp;quot;: &amp;quot;Production&amp;quot;
            }
        },
        &amp;quot;SQL-Database-Name&amp;quot;: {
            &amp;quot;value&amp;quot;: &amp;quot;production-database&amp;quot;
        },
        &amp;quot;SQL-Password&amp;quot;: {
            &amp;quot;reference&amp;quot;: {
                &amp;quot;keyVault&amp;quot;: {
                    &amp;quot;id&amp;quot;: &amp;quot;/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/Vault-Group/providers/Microsoft.KeyVault/vaults/OurKeyVault&amp;quot;
                },
                &amp;quot;secretName&amp;quot;: &amp;quot;CustomerSQLPassword&amp;quot;
            }
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;One of the beauties of using Logic Apps is that they have a nice GUI to work with in the portal. There's also an extension that lets you edit them within Visual Studio.&lt;/p&gt;
&lt;p&gt;However, the definition will look like this when viewed with the code editor. (I removed the bulk of it, but notice the parameters) &lt;/p&gt;
&lt;pre&gt;&lt;code&gt;{
    &amp;quot;definition&amp;quot;: {
        &amp;quot;$schema&amp;quot;: &amp;quot;https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#&amp;quot;,
        &amp;quot;actions&amp;quot;: {
            &amp;quot;Execute_a_SQL_query_(V2)&amp;quot;: {
                &amp;quot;inputs&amp;quot;: {
                    &amp;quot;body&amp;quot;: {
                        &amp;quot;query&amp;quot;: &amp;quot;select 'do something really useful' as task&amp;quot;
                    },
                    &amp;quot;host&amp;quot;: {
                        &amp;quot;...&amp;quot;
                    },
                    &amp;quot;...&amp;quot;
                },
                &amp;quot;runAfter&amp;quot;: {},
                &amp;quot;type&amp;quot;: &amp;quot;ApiConnection&amp;quot;
            }
        },
        &amp;quot;...&amp;quot;,
        &amp;quot;parameters&amp;quot;: {
            &amp;quot;$connections&amp;quot;: {
                &amp;quot;defaultValue&amp;quot;: {},
                &amp;quot;type&amp;quot;: &amp;quot;Object&amp;quot;
            },
            &amp;quot;...&amp;quot;
        },
        &amp;quot;triggers&amp;quot;: {
            &amp;quot;...&amp;quot;
        }
    },
    &amp;quot;parameters&amp;quot;: {
        &amp;quot;$connections&amp;quot;: {
            &amp;quot;value&amp;quot;: {
                &amp;quot;sql&amp;quot;: {
                    &amp;quot;connectionId&amp;quot;: &amp;quot;/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/CustomerResourceGroup/providers/Microsoft.Web/connections/MyCustomer-TheirProduct-SQLConnection-Prod&amp;quot;,
                    &amp;quot;connectionName&amp;quot;: &amp;quot;MyCustomer-TheirProduct-SQLConnection-Prod&amp;quot;,
                    &amp;quot;id&amp;quot;: &amp;quot;/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.Web/locations/westeurope/managedApis/sql&amp;quot;
                }
            }
        },
        &amp;quot;SQL-Database-Name&amp;quot;: {
            &amp;quot;value&amp;quot;: &amp;quot;production-database&amp;quot;
        },
        &amp;quot;SQL-Server&amp;quot;: {
            &amp;quot;value&amp;quot;: &amp;quot;some.database.windows.net&amp;quot;
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Notice that the parameters are all filled out. We can't copy this straight into our ARM template, since it contains the real resource ID references.&lt;/p&gt;
&lt;p&gt;There's another way to get only the definition. We can use the &lt;a href="https://docs.microsoft.com/en-us/powershell/module/az.logicapp"&gt;Az.LogicApp&lt;/a&gt; powershell module:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;(get-azlogicapp -resourcegroupname CustomerResourceGroup -name mycustomer-theirproduct-prod).definition.ToString()
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;It will give us only the &lt;code&gt;definition&lt;/code&gt; part of the template.&lt;/p&gt;
&lt;p&gt;Both give us a means to put &lt;em&gt;only&lt;/em&gt; the &lt;em&gt;definition&lt;/em&gt; of the logic app into a file in our local project.&lt;/p&gt;
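&lt;p&gt;For instance, piping the PowerShell result straight to a file (group and app names from the example above):&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# Export only the workflow definition to a local file
(Get-AzLogicApp -ResourceGroupName CustomerResourceGroup -Name mycustomer-theirproduct-prod).Definition.ToString() |
    Set-Content logicapp.json
&lt;/code&gt;&lt;/pre&gt;
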
&lt;p&gt;Now we can go back to the ARM template and replace the definition with a simple link to the file.
Say we &lt;code&gt;Set-Content&lt;/code&gt; the result of the statement above into a file called &amp;quot;logicapp.json&amp;quot;. We can modify the ARM template as such:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;{
    &amp;quot;$schema&amp;quot;: &amp;quot;https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#&amp;quot;,
    &amp;quot;contentVersion&amp;quot;: &amp;quot;1.0.0.0&amp;quot;,
    &amp;quot;parameters&amp;quot;: {
        &amp;quot;...&amp;quot;
    },
    &amp;quot;variables&amp;quot;: {
        &amp;quot;...&amp;quot;
    },
    &amp;quot;resources&amp;quot;: [
        {
            &amp;quot;type&amp;quot;: &amp;quot;Microsoft.Web/connections&amp;quot;,
            &amp;quot;...&amp;quot;
        }, 
        {
            &amp;quot;type&amp;quot;: &amp;quot;Microsoft.Logic/workflows&amp;quot;,
            &amp;quot;apiVersion&amp;quot;: &amp;quot;2017-07-01&amp;quot;,
            &amp;quot;name&amp;quot;: &amp;quot;[variables('LogicAppName')]&amp;quot;,
            &amp;quot;dependsOn&amp;quot;: [ &amp;quot;[resourceId('Microsoft.Web/connections', variables('ConnectionName'))]&amp;quot; ], 
            &amp;quot;location&amp;quot;: &amp;quot;westeurope&amp;quot;,
            &amp;quot;properties&amp;quot;: {
                &amp;quot;state&amp;quot;: &amp;quot;Enabled&amp;quot;,
                &amp;quot;definition&amp;quot;: {
                    &amp;quot;templateLink&amp;quot;: {
                        &amp;quot;uri&amp;quot;: &amp;quot;./logicapp.json&amp;quot;
                    }
                },
                &amp;quot;parameters&amp;quot;: {
                    &amp;quot;$connections&amp;quot;: {
                        &amp;quot;value&amp;quot;: {
                            &amp;quot;sql&amp;quot;: {
                                &amp;quot;connectionId&amp;quot;: &amp;quot;[resourceId('Microsoft.Web/connections', variables('ConnectionName'))]&amp;quot;,
                                &amp;quot;connectionName&amp;quot;: &amp;quot;[variables('ConnectionName')]&amp;quot;,
                                &amp;quot;id&amp;quot;: &amp;quot;/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.Web/locations/westeurope/managedApis/sql&amp;quot;
                            }
                        }
                    },
                    &amp;quot;...&amp;quot;
                }
            }
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;By running &lt;code&gt;ARMLinker&lt;/code&gt; we get the same generated file we started with,
while keeping the logic app GUI available so we can easily fetch its updated JSON.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;Convert-TemplateLinks azuredeploy.json azuredeploy.linked.json
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;For now, I've actually turned those around and put the &amp;quot;linked&amp;quot; template in a file called azuredeploy.linked.json in order to generate the &amp;quot;conventional&amp;quot; azuredeploy.json file.&lt;/p&gt;
&lt;h2&gt;More options&lt;/h2&gt;
&lt;p&gt;When using the &amp;quot;copy content from the editor&amp;quot; method mentioned above, we have to make sure to copy &lt;em&gt;only&lt;/em&gt; the definition object. Otherwise we'll bring the concrete parameters into the local file.&lt;/p&gt;
&lt;p&gt;Do not despair!&lt;/p&gt;
&lt;p&gt;There's another option that doesn't match the official schema for &amp;quot;templateLink&amp;quot;.
By adding a property called &amp;quot;jsonPath&amp;quot; we can point to an object deeper in the linked file.
Say we copy the content from the online editor.&lt;/p&gt;
&lt;p&gt;We can modify the linked template as such:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;&amp;quot;definition&amp;quot;: {
    &amp;quot;templateLink&amp;quot;: {
        &amp;quot;uri&amp;quot;: &amp;quot;./logicapp.json&amp;quot;,
        &amp;quot;jsonPath&amp;quot;: &amp;quot;definition&amp;quot;
    }
},
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;It will now only merge the definition part from the logicapp.json file.&lt;/p&gt;
&lt;p&gt;I've only implemented dot-separated paths for now, so exotic paths into arrays or paths with special characters won't work.&lt;/p&gt;
&lt;p&gt;E.g. &lt;code&gt;resources[0]['very fancy'].thing&lt;/code&gt; won't work, but &lt;code&gt;things.with.dots&lt;/code&gt; will.&lt;/p&gt;
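&lt;p&gt;For the curious, resolving such a dot-separated path is essentially just splitting on dots and walking the object graph one property at a time. A rough PowerShell sketch of the idea, not ARMLinker's actual code:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;function Select-JsonPath($object, $path) {
    # Follow one property per dot-separated segment
    foreach ($segment in $path -split '\.') {
        $object = $object.$segment
    }
    $object
}

# Grab only the definition object from the exported file
$json = Get-Content logicapp.json -Raw | ConvertFrom-Json
Select-JsonPath $json 'definition'
&lt;/code&gt;&lt;/pre&gt;
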
&lt;h2&gt;Plans and dreams&lt;/h2&gt;
&lt;p&gt;This is pretty much only a workaround while waiting for Microsoft to realise this is totally useful and obvious.&lt;/p&gt;
&lt;p&gt;I originally intended it to be a Custom Tool for Visual Studio, but I could not for the life of me figure out how to enable Custom Tools in projects not of the C# or Visual Basic archetypes.&lt;/p&gt;
&lt;p&gt;If anyone picks up on it, I'll happily discuss new features and even more happily receive meaningful pull requests.&lt;/p&gt;
&lt;p&gt;Other than that, I believe it does the job properly. It can be used in CD pipelines. It should even work for any JSON, not just ARM templates.&lt;/p&gt;
&lt;p&gt;I would really appreciate your feedback, and hope you like it!&lt;/p&gt;
&lt;p&gt;Now go commit and deploy something automagically while fetching coffee! 🤘😁🦄&lt;/p&gt;
&lt;h2&gt;Code and gallery links&lt;/h2&gt;
&lt;p&gt;&lt;a href="https://github.com/lars-erik/ARMLinker"&gt;Github repository&lt;/a&gt;&lt;br /&gt;
&lt;a href="https://www.powershellgallery.com/packages/ARMLinker/1.0.1"&gt;PowerShell gallery&lt;/a&gt;&lt;/p&gt;
</description>
      <pubDate>Wed, 22 Jan 2020 23:22:41 Z</pubDate>
      <a10:updated>2020-01-22T23:22:41Z</a10:updated>
    </item>
  </channel>
</rss>