Steve Spencer's Blog

Blogging on Azure Stuff

Exporting Logs from Application Insights using Continuous Export

This is the fifth post in my series about Log Analytics and Application Insights. The previous post talked about adding custom logging to your code using Application Insights. Now that your logging is in Application Insights you can run Log Analytics queries and build dashboards, alerts and so on. Sometimes, though, you want to use this data in other systems, and it would be useful if you could export the data and use it elsewhere. This post will show you how to regularly export the data from Application Insights into Azure Storage. Once it is in Storage it can easily be moved into other systems or used elsewhere, such as in Power BI. This can be achieved using the Continuous Export feature of Application Insights.
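
As a reminder of what is being exported, the custom event that appears in the exported JSON later in this post is the kind of thing you get from a TelemetryClient.TrackEvent call. The following is only a minimal sketch of such a call (not the exact code from the previous post); the event name and property names are the ones that show up in the export below.

using System;
using System.Collections.Generic;
using Microsoft.ApplicationInsights;

public static class OrderTelemetry
{
    private static readonly TelemetryClient telemetry = new TelemetryClient();

    public static void LogOrderCompleted(Guid customerId, Guid orderId, Guid invoiceId)
    {
        // TrackEvent produces a Custom Event; the properties end up as
        // custom dimensions in the exported JSON.
        telemetry.TrackEvent("Some Important Work Completed",
            new Dictionary<string, string>
            {
                { "CustomerID", customerId.ToString() },
                { "OrderID", orderId.ToString() },
                { "InvoiceID", invoiceId.ToString() }
            });

        // Telemetry is batched, so flush if you need it sent promptly.
        telemetry.Flush();
    }
}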

To enable Continuous Export, log in to the Azure management portal and navigate to Application Insights. Click on the instance you want Continuous Export enabled for, then scroll down the options on the left until you find the Configure section and click on Continuous Export.

image

To use Continuous Export you will need to configure a storage account. Click Add:

image

Click Data types to export:

image

I was only interested in the logs I emitted from my custom logging, so I selected Custom Event, Exception and Trace, then clicked OK.

image

Next pick a storage location. Make sure you use the same region as your Application Insights instance, otherwise you will be charged egress fees to move the data to a different datacentre.

image

You can now pick an existing storage account or create a new one. Upon selecting a storage account you can pick an existing container or create a new one. Once a blob container is selected, click OK.

Continuous Export is now configured. You will not see anything in the storage container until the next set of logs are sent to Application Insights.

My log analytics query shows the following logs have been generated:

image

If you look at the Continuous Export configuration page you will see that the last updated date has changed.

image

Now look in blob storage. You should see a folder named <ApplicationInsights_ServiceName>_<ApplicationInsights_InstrumentationKey>.

image

Click through and you will see a number of folders, one for each of the data types I enabled when setting up Continuous Export. Click through one of them and you will see a folder for the date, then a folder for the hour, and finally a file containing the logs for that hour.

image
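
If you want to read these exports programmatically rather than browse them in the portal, that folder structure simply becomes a blob name prefix. Below is a minimal sketch using the Azure.Storage.Blobs SDK to list one hour of exported Custom Events; the container name, service name and folder names are placeholders, so check the exact layout in your own container first.

using System;
using Azure.Storage.Blobs;

class ListExportedBlobs
{
    static void Main()
    {
        var container = new BlobContainerClient(
            Environment.GetEnvironmentVariable("StorageConnectionString"), "logs");

        // <ServiceName>_<InstrumentationKey>/<DataType>/<date>/<hour>/ - placeholder values.
        string prefix = "MyService_00000000000000000000000000000000/Event/2020-03-22/18/";

        foreach (var blobItem in container.GetBlobs(prefix: prefix))
        {
            Console.WriteLine(blobItem.Name);
        }
    }
}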

You will get a row of JSON data for each row of output from the log query. Note that the full set of logs emitted will be found across these folders.

e.g.

{
     "event": [
         {
             "name": "Some Important Work Completed",
             "count": 1
         }
     ],
     "internal": {
         "data": {
             "id": "a guid is here",
             "documentVersion": "1.61"
         }
     },
     "context": {
         "data": {
             "eventTime": "2020-03-22T18:12:30.2553417Z",
             "isSynthetic": false,
             "samplingRate": 100.0
         },
         "cloud": {},
         "device": {
             "type": "PC",
             "roleInstance": "yourcomputer",
             "screenResolution": {}
         },
         "session": {
             "isFirst": false
         },
         "operation": {},
         "location": {
             "clientip": "0.0.0.0",
             "continent": "Europe",
             "country": "United Kingdom",
             "province": "Nottinghamshire",
             "city": "Nottingham"
         },
         "custom": {
             "dimensions": [
                 {
                     "CustomerID": "4df16004-2f1b-48c0-87d3-c1251a5db3f6"
                 },
                 {
                     "OrderID": "5440d1cf-5d06-4b0e-bffb-fad522af4ad1"
                 },
                 {
                     "InvoiceID": "a7d5a8fb-2a2e-4697-8ab4-f7bf8b8dbe18"
                 }
             ]
         }
     }
}
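
Since each line in the file is a self-contained JSON document, pulling out the interesting fields is straightforward. Here is a minimal sketch using Newtonsoft.Json, assuming the shape shown above; the property paths are taken from that example and may vary for other data types.

using System;
using Newtonsoft.Json.Linq;

public static class ExportLineParser
{
    public static void Dump(string line)
    {
        JObject doc = JObject.Parse(line);

        // "event" is an array; take the name of the first entry if there is one.
        string eventName = (string)doc.SelectToken("event[0].name");
        string eventTime = (string)doc.SelectToken("context.data.eventTime");
        Console.WriteLine($"{eventTime}: {eventName}");

        // Custom dimensions are an array of single-property objects.
        var dimensions = doc.SelectToken("context.custom.dimensions") as JArray ?? new JArray();
        foreach (JObject dimension in dimensions)
        {
            foreach (var property in dimension.Properties())
            {
                Console.WriteLine($"  {property.Name} = {property.Value}");
            }
        }
    }
}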

As the data is now out of Application Insights you can move it wherever you need it. You will also need to manage the data in blob storage, otherwise you will end up with the logs stored in two places and the storage costs will be doubled.
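
One simple way to keep those costs under control is to delete exported blobs after a retention period, either with a storage lifecycle management rule or with a small scheduled job. A rough sketch of the scheduled-job approach follows; the container name and the 30-day retention period are just assumptions.

using System;
using Azure.Storage.Blobs;

class ExportCleanup
{
    static void Main()
    {
        var container = new BlobContainerClient(
            Environment.GetEnvironmentVariable("StorageConnectionString"), "logs");

        // Delete any exported blob that has not been modified in the last 30 days.
        DateTimeOffset cutoff = DateTimeOffset.UtcNow.AddDays(-30);

        foreach (var blobItem in container.GetBlobs())
        {
            if (blobItem.Properties.LastModified < cutoff)
            {
                container.DeleteBlobIfExists(blobItem.Name);
            }
        }
    }
}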

One example of subsequent usage is exporting the data to Event Hub. As the data is in blob storage you can use an Azure Function with a blob trigger to read the blob a row at a time and publish each row to Event Hub:

[FunctionName("ContinuousExport")]
public static async Task Run(
    [BlobTrigger("logs/{name}", Connection = "ContinuousExportBlobSetting")] Stream myBlob, string name,
    [EventHub("logging", Connection = "EventHubConnectionAppSetting")] IAsyncCollector<string> outputEvents,
    TraceWriter log)
{
    log.Info($"C# Blob trigger function processed blob\n Name: {name}\n Size: {myBlob.Length} bytes");

    // Each line in the exported blob is a single JSON document, so forward them one at a time.
    using (var textReader = new StreamReader(myBlob))
    {
        while (!textReader.EndOfStream)
        {
            string line = await textReader.ReadLineAsync();
            log.Info(line);
            await outputEvents.AddAsync(line);
        }
    }
}

Note: This is just an example, so it will need additional code to make sure that you don’t exceed the Event Hub maximum message size.
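
A rough sketch of such a guard is shown below; it could slot into the read loop of the function above. The 256 KB limit used here is a conservative assumption, so check the actual limit for your Event Hubs tier.

using System.Text;

internal static class EventHubLimits
{
    // Conservative assumption; verify against your Event Hubs tier and
    // leave headroom for the event envelope.
    private const int MaxEventBytes = 256 * 1024;

    public static bool FitsInEventHubMessage(string line) =>
        Encoding.UTF8.GetByteCount(line) <= MaxEventBytes;
}

// Inside the read loop of the function above:
//     if (EventHubLimits.FitsInEventHubMessage(line))
//         await outputEvents.AddAsync(line);
//     else
//         log.Info("Skipping oversized log line");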

So with Continuous Export you can extract your log data from Application Insights and move it to other systems for processing.


Processing a flat file with Azure Logic Apps

[Update: 5th Aug 2018 – This post is still relevant, especially the integration account, schemas and maps, and I have written a new blog that builds on this one and integrates into SQL – Using Azure Logic Apps to Import CSV to SQL Server]

A lot of companies require the transfer of files in order to transact business, and there is always a need to translate these files from one format to another. Logic Apps provides a straightforward way to build serverless components that provide the integration points into your systems. This post is going to look at Logic Apps enterprise integration to convert a multi-record CSV file into an XML format. Most of the understanding for this came from the following post:

https://seroter.wordpress.com/2016/09/09/trying-out-standard-and-enterprise-templates-in-azure-logic-apps/

Logic Apps can be created in Visual Studio or directly in the Azure Portal using the browser. Navigate to the Azure portal at https://portal.azure.com, click the plus button at the top of the right hand column, then Web + Mobile, then Logic App.

clip_image002

Complete the form and click Create

clip_image003

This will take a short while to complete. Once complete you can select the logic app from your resource list to start to use it.

If you look at my recent list

clip_image005

You can see the logic app I’ve just created, but you will also see my previous logic app, an integration account and an Azure function. The integration account and the function are both required in order to create and use the schemas and maps that translate the CSV file to XML.

The integration account stores the schemas and maps and the Azure function provides some code that is used to translate the CSV to XML.

An integration account is created in the same way as a logic app. The easiest way is to click on the plus symbol and then search for integration.

clip_image007

Click on Integration Account then Create

clip_image009

Complete the form

clip_image010

Then click Create. Once created you can start to add your schemas and maps.

clip_image012

You will now need to jump into Visual Studio to create your maps and schemas. You will need to install the Logic Apps Integration Tools for Visual Studio

You will need to create a schema for the CSV file and a schema for the XML file. These two blog posts walk you through creating a flat file schema for a CSV file and also a positional file

I created the following two schemas

clip_image014

clip_image016

Once you have created the two schemas you will need to create a map, which allows you to map the fields from one schema to the fields in the other schema.

clip_image018

In order to upload the map you will need to build the project in Visual Studio to build the xslt file.

The schemas and map file project can be found in my repository on GitHub

To upload the files to the integration account, go back to the Azure portal where you previously selected the integration account, click Schemas then Add

clip_image020

Complete the form, select the schema file from your Visual Studio project and click OK. Repeat this for both schema files. Do the same for the map file; you will need to navigate to your bin/Debug (or Release) folder to find the xslt file that was built. Your integration account should now show your schemas and maps as uploaded.

clip_image021

There’s one more thing to do before you can create your logic app. In order to process the transformation some code is required in an Azure Function. This is standard code and can be created by clicking the highlighted link on this page. Note: If you haven’t used Azure Functions before then you will also need to click the other link first.

clip_image023

clip_image025

This creates a function containing the code required to perform the transformation.

clip_image027
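
This is not the exact generated code, but conceptually the function is just running an XSLT transform of the incoming XML using the uploaded map. A minimal sketch of that idea, using XslCompiledTransform:

using System.IO;
using System.Xml;
using System.Xml.Xsl;

public static class MapRunner
{
    // Conceptual sketch only: apply an XSLT map to an XML document and return the result.
    public static string TransformXml(string inputXml, string mapXslt)
    {
        var transform = new XslCompiledTransform();
        using (var xsltReader = XmlReader.Create(new StringReader(mapXslt)))
        {
            transform.Load(xsltReader);
        }

        using (var inputReader = XmlReader.Create(new StringReader(inputXml)))
        using (var output = new StringWriter())
        using (var writer = XmlWriter.Create(output, transform.OutputSettings))
        {
            transform.Transform(inputReader, writer);
            writer.Flush();
            return output.ToString();
        }
    }
}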

You are now ready to start your logic app. Click on the Logic App you created earlier. This will display a page where you can select a template with which to create your app.

clip_image029

Close this down as you need to link your integration account to your logic app.

Click on Settings, then Integration Account and pick the integration Account where you previously uploaded the Schemas and Map files. Save this and return to the logic app template screen.

Select VETER Pipeline

clip_image031

Then “Use This Template”. This is the basis for your transformation logic. All you need to do now is to complete each box.

clip_image032

clip_image034

In Flat File Decoding & XML Validation, pick the CSV schema

clip_image035

In the transform XML

clip_image036

Select the function container, the function and the map file

All we need to do now is return the transformed XML in the response message. Click “Add an Action” on Transform XML and search for Response.

clip_image038

Pick the Transformed XML content as the body of the response. Click Save and the URL for the logic app will be populated in the Request trigger.

clip_image039

We now have a request that takes the CSV in the body and returns the transformed XML in the body of the response. You can test this using a tool like Postman or Fiddler to send a request to the URL above.

There is also a test CSV file in my repository on GitHub which can be used to test this.
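
If you would rather script the test than use Postman or Fiddler, a quick sketch with HttpClient is below. The URL shown is a placeholder for the one generated for your Request trigger, and the content type may need adjusting to whatever your trigger expects.

using System;
using System.IO;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class LogicAppTest
{
    static async Task Main()
    {
        // Placeholders: use the URL from your Request trigger and your own test CSV file.
        string logicAppUrl = "https://prod-00.northeurope.logic.azure.com/workflows/...";
        string csv = File.ReadAllText("test.csv");

        using (var client = new HttpClient())
        {
            var content = new StringContent(csv, Encoding.UTF8, "text/plain");
            HttpResponseMessage response = await client.PostAsync(logicAppUrl, content);
            string xml = await response.Content.ReadAsStringAsync();
            Console.WriteLine($"{(int)response.StatusCode}: {xml}");
        }
    }
}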

My next post covers how I diagnosed a fault with this Logic App