Steve Spencer's Blog

Blogging on Azure Stuff

Testing connectivity to your backend service with Hybrid Connection in App Service

I have a backend service that is connected to my web site, which is hosted in Azure App Service, using a Hybrid Connection. When setting up a Hybrid Connection it is useful to be able to test connectivity to your backend service. I’ve previously posted a video showing you how to access the Kudu control panel so that you can look at the files in the hosted site. We’ll use the Kudu control panel to test connectivity too, only this time we’ll use the PowerShell debug console.

In the Azure Portal, click on your app service, go to Advanced Tools and select Go.

image

This will open the Kudu console in a new tab.

Click on Debug console, then PowerShell.

image

This opens the debug console in PowerShell and allows you to run PowerShell commands on the App Service. As you are running on the App Service you will have access to the backend service that is connected via the Hybrid Connection.

image

My service has a Get endpoint that I can call to test connectivity. There are a number of commands we could run but I’ll use Invoke-WebRequest.

For this example I am using www.bing.com, but you should use the URL of the backend service that you have configured in your Hybrid Connection.

Invoke-WebRequest -Uri http://www.bing.com

image

This returns the following error

The response content cannot be parsed because the Internet Explorer engine is not available, or Internet Explorer's first-launch configuration is not complete. Specify the UseBasicParsing parameter and try again.

The error tells you what you need to do: add the -UseBasicParsing parameter to the command.

Invoke-WebRequest -Uri http://www.bing.com -UseBasicParsing

image

We now get the error:

Win32 internal error "The handle is invalid" 0x6 occurred while reading the console output buffer

To fix this we need to tell PowerShell to suppress its progress output by setting the progress preference to silently continue:

$progressPreference = "silentlyContinue"

Then call your web request
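For reference, the full sequence is (still using www.bing.com as a stand-in for your backend service URL):

$progressPreference = "SilentlyContinue"
Invoke-WebRequest -Uri http://www.bing.com -UseBasicParsing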

image

I’ve now got the output from the web request with a status code of 200 showing I’ve got connectivity.

Using Graph API to automate Azure AD

In my previous posts I discussed how you can manage access to applications (part 1) using Azure AD and also how you can add users from outside of your organisation (part 2). Now we will look at how you can automate this using Graph API.

“The Microsoft Graph API offers a single endpoint, https://graph.microsoft.com, to provide access to rich, people-centric data and insights exposed as resources of Microsoft 365 services. You can use REST APIs or SDKs to access the endpoint and build apps that support scenarios spanning across productivity, collaboration, education, security, identity, access, device management, and much more.” - https://docs.microsoft.com/en-us/graph/overview

From the overview you can see that Graph API covers a large area of Microsoft 365 services. One of the services it covers is Azure AD. What I’ll show you today is how to invite users and then add/remove them to/from groups using Graph API.

There are two ways to access Graph API: a user-centric approach (Delegated) that requires a user account, and an application-centric approach that uses an application ID and secret. Accessing Azure AD for user invites and group management utilises the application-centric approach. In order to get an application ID and secret you will need to create an application in Azure AD. The first post in the series talks about how to create an App Registration.

Once you have created your application, there are a couple of bits of information you require in order to get started. These are the tenantId and clientId. These can be found in the Azure portal. Navigate to your App Registration and the details can be found in the Overview blade.

image

If you hover over each of the Guids a copy icon appears to allow you to easily copy these values.

Next you will need to generate a key. For this, click on the Certificates and secrets blade.

image

Then click “New client secret” and populate the form and click “Add”

image

Your key will now appear.

image

Make sure you copy this now, as it is not visible again once you navigate away and you would need to generate a new one.

image

We are now ready to start looking at Graph API. There is good documentation for each of the functions in Graph API, including the permissions required to access them and code samples in a variety of languages. If we look at the List users function:

https://docs.microsoft.com/en-us/graph/api/user-list?view=graph-rest-1.0&tabs=http

image

You can see the permissions needed to access this function. As we are using an Application permission type we need to set one of the permissions: User.Read.All, User.ReadWrite.All, Directory.Read.All or Directory.ReadWrite.All.

You can set the permissions required by going to your App Registration and clicking on the “API permissions”

image

The application by default requires a user login that can read their own user profile. We need to add some additional permissions to allow our application to list the users in AD. Click on “Add permission”.

This shows the list of built-in APIs that you can access. We are only looking at Microsoft Graph today.

image

Click “Microsoft Graph”

image

Then “Application permissions” and scroll to the User section

image

To list users we need the User.Read.All permission, but we’ll also add User.Invite.All so that we can invite B2B users. Click “Add permissions”.

image

Although you have added the permissions you cannot access the Graph API yet, as admin consent needs to be granted first. If we had added a Delegated permission then the user could try to access the Graph API, but admin consent would still be required to stop anyone from accessing certain features. This can be done in a workflow, with selected admins being notified and needing to approve each access before the user can proceed. That process will not work for our application as it is an unattended application using the application permission type. We can, however, grant consent for this application by clicking the “Grant admin consent …” button and clicking Yes to the message box that pops up.

image

Clicking the button adds admin consent to all permissions. If you want to remove it from any, click the ellipsis (…) at the end and click “Revoke admin consent”

image

You can also remove permissions from this menu.

Your app registration is now ready to go. I’m using the C# SDK, which is available as a NuGet package.

Once the NuGet package is installed, you will need to create an instance of the Graph API client:

// Usings needed at the top of the file:
// using System.Configuration;
// using Microsoft.Graph;
// using Microsoft.Graph.Auth;
// using Microsoft.Identity.Client;

ConfidentialClientApplicationOptions _applicationOptions = new ConfidentialClientApplicationOptions
{
    ClientId = ConfigurationManager.AppSettings["ClientId"],
    TenantId = ConfigurationManager.AppSettings["TenantId"],
    ClientSecret = ConfigurationManager.AppSettings["AppSecret"]
};

// Build a client application.
IConfidentialClientApplication confidentialClientApplication = ConfidentialClientApplicationBuilder
    .CreateWithApplicationOptions(_applicationOptions)
    .Build();

// Create an authentication provider by passing in a client application and graph scopes.
ClientCredentialProvider authProvider = new ClientCredentialProvider(confidentialClientApplication);

// Create a new instance of GraphServiceClient with the authentication provider.
GraphServiceClient graphClient = new GraphServiceClient(authProvider);

You will need the ClientId, TenantId and Secret you copied earlier. Looking at the Graph API documentation there are examples of how to use each of the functions.

image

We want to see if a user already exists in our Azure AD before we invite them, so we will use the filter option as above.

var user = (await graphClient.Users
                 .Request()
                 .Filter($"mail eq '{testUserEmail}'")
                 .GetAsync()).FirstOrDefault();

Console.WriteLine($"{testUserEmail} {user != null} {user?.Id} [{user?.DisplayName}] [{user?.Mail}]");

If user is null then it does not exist in your Azure AD tenant. Assuming that this is an external user, you will need to invite the user to be able to access your application. I created a method for this:

private async Task<Invitation> InviteUser(IGraphServiceClient graphClient, string displayName, string emailAddress, string redirectUrl, bool wantCustomEmailMessage, string emailMessage)
{
    // Needs: User.Invite.All
    var invite = await graphClient.Invitations
                     .Request().AddAsync(new Invitation
                     {
                         InvitedUserDisplayName = displayName,
                         InvitedUserEmailAddress = emailAddress,
                         SendInvitationMessage = wantCustomEmailMessage,
                         InviteRedirectUrl = redirectUrl,
                         InvitedUserMessageInfo = wantCustomEmailMessage ? new InvitedUserMessageInfo
                         {
                             CustomizedMessageBody = emailMessage,
                         } : null
                     });

    return invite;
}
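A rough sketch of how this might be called, following on from the user lookup above (the display name, redirect URL and message text are placeholder values, not anything from a real tenant):

if (user == null)
{
    // user not found in the tenant, so invite them as a B2B guest
    var invite = await InviteUser(graphClient, "Test User", testUserEmail,
        "https://myapp.azurewebsites.net", true, "You have been invited to access My App");
    Console.WriteLine($"Invite sent to {invite.InvitedUserEmailAddress}, status: {invite.Status}");
}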

Now you’ve just invited a B2B user into your Azure AD tenant. At the moment they do not have access to anything as you’ve not assigned them to any application. The Graph API for assigning users to applications uses the delegated permissions model which means you need to use an actual user account. The Graph API with the application permission model does not support adding users to applications. In order to use the same application client you used for inviting users, you could assign a group to your application and then use the Graph API to add/remove users to/from that group.

Adding or removing a user to or from a group requires one of the following permissions: GroupMember.ReadWrite.All, Group.ReadWrite.All or Directory.ReadWrite.All. This is set in the same way as for the user permissions in the App Registration/API permissions section mentioned earlier. Admin consent will also need to be granted for these permissions.

The code to add & remove users is below:

// find group
var groupFound = (await graphClient.Groups
                                         .Request()
                                         .Filter($"displayName eq '{groupName}'")
                                         .Expand("members")
                                         .GetAsync()).FirstOrDefault();

Console.WriteLine($"{groupName} {groupFound != null} [{groupFound?.Id}] [{groupFound?.DisplayName}] [{groupFound?.Members?.Count}]");

if (groupFound != null)
{
    // check if the user we retrieved earlier is already in the group
    var memberFound = (from u in groupFound.Members
                       where u.Id == user.Id
                       select u).FirstOrDefault();
    Console.WriteLine($"user found {memberFound != null}");

    if (memberFound != null)
    {
        Console.WriteLine($"removing user {user.Id}");
        // remove from group
        await graphClient.Groups[groupFound.Id].Members[user.Id].Reference
                                    .Request()
                                    .DeleteAsync();
    }
    else
    {
        Console.WriteLine($"adding user {user.Id}");
        // add to group
        await graphClient.Groups[groupFound.Id].Members.References
            .Request()
            .AddAsync(new DirectoryObject
            {
                Id = user.Id
            });
    }
}

In the code above I wanted the Graph API to return me the list of users in the group. By default you do not see this data when retrieving group information. Adding the Expand method tells Graph API to extend the query and return the additional data. This is something to bear in mind when using Graph API: just because a property is null does not mean that there is no data; you might need to expand the data set returned.

I hope you found this a useful introduction to Graph API. I will be posting more on Azure AD in the future, including more on Graph API.

Managing Application Access with Azure AD–Part 2

In my previous post I showed you how to set up an application in Azure AD and allow Azure AD users to access it. In this post I will show how you can give access to these applications to users outside of your organisation using B2B (Business to Business) as guest users.

B2B is a feature of Azure AD that allows you to easily add two types of user to your applications.

  1. Users who are part of another Azure AD tenant
  2. Users who are not.

If your new user is part of another Azure AD tenant, then when we add them as a guest user to your application they will use the credentials provided by their own organisation. This means they do not have to remember a new username and password when they want to access your application. It is also useful as they will be managed by their own organisation, so you will not be responsible for resetting their passwords, for example. Another advantage of using their own Azure AD credentials is that they will lose the ability to sign in to your application when their accounts are disabled or removed from your customer’s tenant. They will however still exist as a guest user in your application, but they will no longer be able to sign in.

If your new user is not part of another Azure AD tenant, then they will automatically have a Microsoft account created for them. They will also be prompted to enter a new password. Again this is not managed by you but by Microsoft this time, so password resets are handled by a link provided by them.

To assign a guest user to your application you will need to invite them to use your application. They will then receive an invitation via email that they will need to redeem in order to access your application.

So, go back to the Azure AD blade of the Azure portal and click on Users:

image (Azure AD Users blade, showing the “New guest user” option)

Then click on “New guest user”.

image (New user blade, showing the “Invite user” option and the invitation form with name, email address and personal message)

Fill in the form, enter your own personal message and click “Invite”. You need to enter a valid email address, otherwise the user will not be able to receive the invite, as seen below:

image

The text highlighted inside the red box was the custom message I entered in the invitation process. It is possible to change the branding of this email but it is an Azure AD premium feature.

The invite process proves that the user has access to the mailbox linked to the email address used. Also, if they are using their organisation's Azure AD email address then they must also sign in with their own username and password, so you can be confident that the user is who they say they are. This example shows the flow when a user is part of another Azure AD tenant. If the user is not part of another tenant then there will be additional screens for setting up their new Microsoft account and password.

When the user clicks the Accept invitation link they will be redirected to a consent page which is asking for permissions to read their user profile from their Azure AD tenant.

image

Accepting the permissions will then redirect the user to the application portal, where they can access the applications they have been assigned. As we have not allocated any applications to this user yet, they will not see anything.

image

To assign applications to the users, go back to the Azure AD blade in the Azure portal, click on Users, then click on the one you have just added to view their profile:

image

You can see, in this example, in the red box that this is a Guest user who has accepted the invitation.

Click on Applications in the left-hand menu bar and you will see that there are none assigned. To assign this user to an application, navigate back to the Azure AD main blade and click Enterprise applications, then select the application you wish to assign this user to.

image

Click “Assign users and groups”, then Add User

image

Click “None Selected” then search for your new user, select them and click Select.

image

Now click Assign

image

The new user is now assigned. Go back to the Application screen the user viewed after they signed in and refresh the page.

image

The assigned application should now be visible and clicking the application will redirect the user to that application's web site.

Using Azure AD it is now easy to invite users to use your applications, and when they are part of another Azure AD tenant, Azure AD takes all the pain out of federating with those users' tenants. Hopefully you have found that this is straightforward and that it has opened up access to your applications in a controlled way. My next post will look at how we can automate this using Graph API.

Introduction to Azure Role Based Access Control (RBAC)

Up until fairly recently I have been managing access to a number of Azure subscriptions, but as I’ve been working for smaller organisations the number of people who needed access was fairly small and easy to manage. It also meant that I generally gave users Owner or Contributor access to the subscriptions, as we were all managing everything so we needed access at that level. Now that I work for a large organisation there is a greater need to limit access to certain areas of Azure, and subscription-wide access is limited to a few key administrators. This means that I need to look at the minimum access that is required for each of the users who need access to the resources. First I’d like to talk about the scope within which permissions can be set within Azure. For most of the scenarios I’ve worked in I have visibility of a single subscription. For organisations with a large number of subscriptions there is a further level of scope, Management group, which I won’t be discussing.

image

Permissions can be set at the Subscription, Resource group or individual resource scope. Depending upon the level of access your user requires, there are three basic roles which you can use:

  • Owner
  • Contributor
  • Reader

Owner gives the user full access to everything within the scope and also allows them to assign roles to other users.

Contributor gives the user full access to everything within the scope except they are not able to assign roles to other users

Reader gives the user access to view the resources within the scope, but they are not able to change anything or assign roles.

So if you assign the user the Owner role at the Subscription level, the user can manage all resources within the subscription and assign roles to users. A user can be assigned multiple roles and Azure RBAC is additive, so if a user was assigned Contributor at the subscription scope but only Reader on one of the resource groups, the Contributor role would override the Reader role. It is also possible to have Deny assignments, where a user is explicitly denied specific permissions at a specific scope. Deny assignments take precedence over role assignments.

These roles plus the variety of scopes give some flexibility of access, but it is still a large surface area of access that is provided. Azure offers a large number of finer-grained roles to allow users to be given specific permissions to specific services. There are a large number of built-in roles, as can be seen here: https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles

These finer-grained roles allow you to give a specific user specific permissions within a specific scope. For example, if you wanted to give a user access to a blob store to upload files via the Azure portal, there are two roles that can be assigned: Reader and Data Access, and Storage Blob Data Contributor. If you assign these two roles to a user on the storage account, then the user is able to log in to the Azure portal, navigate to the storage account and access the blob store.

To do this, navigate to the storage account within which you want to assign a role and click the Access control (IAM) item.

image

Then click “Add role assignment”.

image

In the role drop down pick “Storage Blob Data Contributor”, select the user you want to assign the role to and click Save. Repeat this for the Reader and Data Access role. Your user now has access only to blob storage and has no access elsewhere in the resource group or subscription. I could have done the same thing by selecting the resource group and Access control and adding these roles there. This would have given the user access to all blob stores within the resource group.
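If you prefer to script this, the same assignment can be made with the Az PowerShell module; a minimal sketch, where the sign-in name, subscription, resource group and storage account names are all placeholders:

New-AzRoleAssignment -SignInName "user@contoso.com" `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"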

Another example is that you may want to give someone access to your app service so that they can configure and deploy. So navigate to your App Service and click “Access control”, then select the role “Website Contributor”. See https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#website-contributor for more details. This lets them manage the selected website but not app service plans or other web sites. If you want them to manage other app services then you could add the same role at the resource group level.

Managing Application Access with Azure AD – Part 1

In my next series of blog posts I want to talk about how to manage access to applications using Azure AD.

I’ve been looking at how I can set up access to my web based applications and I want to be able to:

  1. Have a single sign on with multiple applications
  2. Allow some users access to only some of the applications
  3. Be able to give access to users outside of my organisation
  4. Be able to control access via code

Part 1 will cover setting my applications up and then restricting access to the applications via Azure AD.

In order to test this I needed to have a number of applications that I could use. I used this example:

https://github.com/AzureADQuickStarts/AppModelv2-WebApp-OpenIDConnect-DotNet

It allows me to log in and see my claims. I deployed this into two different app services so I could navigate to them separately. I’m not going to talk about the code on the web side apart from the bits you need to configure the sample. This series of blogs is more about how to set up Azure AD and the path I went through to my end goal of configuring users programmatically.

In order to integrate with Azure AD we need to set up each of the applications. This will provide us with an ID which we can use to configure each of the applications.

In Azure Portal navigate to Azure Active Directory, or search for it in the search bar

image (Azure portal home page and the search results when searching for Azure Active Directory)

image (Azure Active Directory overview blade and the App registrations item)

In the menu bar on the left select App Registrations –> New registration and complete the form:

image (Register an application form, showing the name, supported account types and optional redirect URI)

I've picked single tenant as I want to invite users using B2B. Now click Register

You need to copy the IDs needed for your web app:

image (App registration overview, showing the Application (client) ID and Directory (tenant) ID)

Copy the Client ID and Tenant ID. Repeat this process for the next app. I've created two apps as I wanted to test limiting access to a single app and denying access to the second if the user has not been invited to it or added manually.

Now add these to the web.config in the sample app. There will be two settings, for ClientId and Tenant. Make sure that the redirect URL matches the URL of the application you registered, then redeploy. Repeat this for the second application.
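The exact key names depend on the version of the sample you pulled down, but in my copy the appSettings section in web.config looked something like this (the values shown are placeholders):

<appSettings>
  <add key="ClientId" value="[application (client) id you copied]" />
  <add key="Tenant" value="[directory (tenant) id you copied]" />
  <add key="RedirectUri" value="https://yourapp.azurewebsites.net/" />
</appSettings>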

If you navigate to the web apps and try to log in, you may get an error as we haven't set up any users, although any users currently in your Azure AD should be able to log in.

To give users access to your app, go back to Azure Active Directory, this time select Enterprise applications and click on the app you just created.

image (Enterprise application overview blade for “My New app”)

Click Users and groups

image (Users and groups blade for the enterprise application, showing the “Add user” button and no application assignments)

Click Add user

image (Add Assignment blade, with Users “None Selected” and the Default Access role)

Click None Selected, pick users from the list and click Select. These users have now been given access to your application. However, as I mentioned earlier, all users who are part of your Azure AD are currently able to log in to your web app, so we now need to configure the app so that only assigned users can access it.

Click Properties in your enterprise application, set “User assignment required?” to Yes and click Save (repeat this for your other application).

image (Enterprise application Properties blade, showing the “User assignment required?” setting)

Now only users who are assigned to your application can log in. You can test this now. Go to the first application URL and log in with one of the users you assigned. Then go to the second app (you shouldn't have assigned any users to it just yet) and log in. This time you will get an error.

You can now assign users to the second application and the error should go away when you attempt to login.

We’ve now set up our applications in Azure AD and limited access to each application. In my next post I’ll show you how you can then add users from outside of your organisation to these applications.

Exporting Logs from Application Insights using Continuous Export

This is the fifth post in my series about Log Analytics and Application Insights. The previous post talked about adding custom logging to your code using Application Insights. Now you’ve got your logging into Application Insights you can run Log Analytics queries and build dashboards, alerts etc. Sometimes though you want to use this data in other systems, and it would be useful if you could export the data and use it elsewhere. This post will show you how you can regularly export the data from Application Insights into Azure Storage. Once it is in Storage it can easily be moved into other systems or used elsewhere, such as in PowerBI. This can be achieved using the Continuous Export feature of Application Insights.

To enable Continuous Export, log in to the Azure management portal and navigate to Application Insights. Click on the instance you want Continuous Export enabled for, then scroll down the options on the left until you find the Configure section and click on Continuous Export.

image

To use Continuous Export you will need to configure a storage account. Click Add:

image

Click Data types to export:

image

I was only interested in the logs I emitted from my custom logging, so selected Custom Event, Exception and Trace then clicked OK

image

Next pick a storage location. Make sure you use the region where your Application Insights instance is located, otherwise you will be charged egress fees to move the data to a different datacentre.

image

You can now pick an existing storage account or create a new one. Upon selecting a storage account you can pick an existing container or create a new one. Once a blob container is selected click OK.

Continuous Export is now configured. You will not see anything in the storage container until the next set of logs are sent to Application Insights.

My log analytics query shows the following logs have been generated:

image

If you look at the Continuous Export configuration page you will see that the last updated date has changed.

image

Now look in blob storage. You should see a folder named <ApplicationInsights_ServiceName>_<ApplicationInsights_InstrumentationKey>.

image

Click through and you will see a number of folders, one for each of the data types that I enabled when setting up Continuous Export. Click through one of them and you will see a folder for the date, then a folder for the hour of the logs, and then a file containing the logs for that hour.

image

You will get a row of JSON data for each row output by the log query. Note that the complete set of emitted logs will be spread across each of these data type folders.

e.g.

{
     "event": [
         {
             "name": "Some Important Work Completed",
             "count": 1
         }
     ],
     "internal": {
         "data": {
             "id": "a guid is here",
             "documentVersion": "1.61"
         }
     },
     "context": {
         "data": {
             "eventTime": "2020-03-22T18:12:30.2553417Z",
             "isSynthetic": false,
             "samplingRate": 100.0
         },
         "cloud": {},
         "device": {
             "type": "PC",
             "roleInstance": "yourcomputer",
             "screenResolution": {}
         },
         "session": {
             "isFirst": false
         },
         "operation": {},
         "location": {
             "clientip": "0.0.0.0",
             "continent": "Europe",
             "country": "United Kingdom",
             "province": "Nottinghamshire",
             "city": "Nottingham"
         },
         "custom": {
             "dimensions": [
                 {
                     "CustomerID": "4df16004-2f1b-48c0-87d3-c1251a5db3f6"
                 },
                 {
                     "OrderID": "5440d1cf-5d06-4b0e-bffb-fad522af4ad1"
                 },
                 {
                     "InvoiceID": "a7d5a8fb-2a2e-4697-8ab4-f7bf8b8dbe18"
                 }
             ]
         }
     }
}

As the data is now out of Application Insights you can move it wherever you need it. You will also need to manage the blob storage data, otherwise you will end up with the logs stored in two places and the storage costs will be doubled.

One example of subsequent usage is exporting the data to Event Hub. As the data is in blob storage you can use a function with a blob trigger to read the blob in, a row at a time, and publish the data onto Event Hub:

// Requires: using System.IO; using System.Threading.Tasks; using Microsoft.Azure.WebJobs; using Microsoft.Azure.WebJobs.Host;
[FunctionName("ContinuousExport")]
public static async Task Run([BlobTrigger("logs/{name}", Connection = "ContinuousExportBlobSetting")]Stream myBlob, string name,
     [EventHub("logging", Connection = "EventHubConnectionAppSetting")] IAsyncCollector<string> outputEvents, TraceWriter log)
{
     log.Info($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
     StreamReader textReader = new StreamReader(myBlob);
     while (!textReader.EndOfStream)
     {
         // each line in the exported blob is a single JSON log entry
         string line = textReader.ReadLine();
         log.Info(line);
         await outputEvents.AddAsync(line);
     }
}

Note: This is an example, so will need additional code to make sure that you don’t exceed the Event Hub maximum message size

So with Continuous Export you can extract your log data from Application insights and move it to other systems for processing.

Processing data from IoT Hub in Azure Functions

If you have been following my previous posts (Part 1, part 2, part 3) you will know that I’m using an ESP 8266 to send data to the Azure IoT hub. This post will show you how to receive that data and store it in Azure Storage, and also show how you can forward the data onto the Azure Service Bus.

I’m going to use Visual Studio and C# to write my function. If you are unfamiliar with Azure Functions, you can set up bindings to a variety of Azure resources. These bindings make it easy to interface with the resources without needing to write a lot of boilerplate code. Trigger bindings allow your function to be triggered when something happens on a resource, and output bindings let you write data to these resources. For example, there are bindings for Blob and Table storage, Service Bus, Timers etc. We’re interested in the IoT hub binding. The IoT hub trigger will be fired when an event is sent to the underlying Event hub. You can also use an output binding to put messages into the IoT hub event stream. We’re going to use the Table storage and Service Bus output bindings.

To get started you need to create a new Function project in Visual Studio.

image

Select IoT hub trigger and browse to a storage account you wish to use (for logging) plus add in the setting name you want to use to store the IoT hub connection string.

image

This will generate your empty function with your preconfigured IoT hub trigger.

You need to add your IoT hub connection string to your settings file. Open local.settings.json and add a new line below the AzureWebJobs settings with the same name you entered in the dialog (ConnectionStringSetting in my example). Your connection string can be found in the Azure Portal.

Navigate to your IoT hub, then click Shared Access Policies

image

Select the user you want to use to access the IoT hub and click the copy icon next to the primary key connection string.

image
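Pasted into local.settings.json, the settings end up looking something like this (the connection string values are placeholders):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "[storage account connection string]",
    "ConnectionStringSetting": "[IoT hub connection string copied above]"
  }
}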

You can run this in the Visual Studio debugger and when messages are sent to your IoT hub you should see a log appearing in the output window.

What I want to do is to receive the temperature and humidity readings from my ESP 8266 and store the data in Azure storage so that we can process it later.

For that I need to use the Table storage output binding. Add the binding attribute to your function below the FunctionName binding.

[return: Table("MyTable", Connection = "StorageConnectionAppSetting")]

Again, you will need to add the storage setting into your config file. Find your storage account in the Azure portal, click Access keys then copy the key1 connection string and paste it in your config file

image

To use the Azure Table storage output binding you will need to create a class that represents the columns in your table.

image

I included a device id so that I can identify which device the reading was associated with. You will need to change the return type of your function to TempHumidityIoTTableEntity, then add the code to extract the data from the message.
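The screenshot above shows the entity class; a minimal sketch of it, with the property names inferred from the function code further down, might look like this (it derives from TableEntity, which supplies the PartitionKey and RowKey properties):

public class TempHumidityIoTTableEntity : TableEntity
{
    public string DeviceId { get; set; }
    public string Temperature { get; set; }
    public string Humidity { get; set; }
    public string DateMeasured { get; set; }
}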

Firstly, I changed the Python code on my ESP8266 to send the data as JSON so we can process it more easily. I’ve also added a message identifier so that we can send different messages from the ESP8266 and be able to process them differently.

sensor.measure()

dataDict = {'partitionKey': 'r',
            'rowkey': 'recneptiot' + str(utime.ticks_ms()),
            'message': 'temphumidity',
            'temperature': str(sensor.temperature()),
            'humidity': str(sensor.humidity())}

mqtt.publish(sendTopic, ujson.dumps(dataDict), True)

That means we can deserialise the IoT Hub message into something we can easily access. The whole function is below:

[FunctionName("Function1")]
[return: Table("yourtablename", Connection = "StorageConnectionAppSetting")]
public static TempHumidityIoTTableEntity Run([IoTHubTrigger("messages/events", Connection = "ConnectionStringSetting")]EventData message, TraceWriter log)
{
     var messageAsJson = Encoding.UTF8.GetString(message.GetBytes());
     log.Info($"C# IoT Hub trigger function processed a message: {messageAsJson}");

    var data = JsonConvert.DeserializeObject<Dictionary<string, string>>(messageAsJson);

    var deviceid = message.SystemProperties["iothub-connection-device-id"];

    return new TempHumidityIoTTableEntity
     {
         PartitionKey = deviceid.ToString(),
         RowKey = $"{deviceid}{message.EnqueuedTimeUtc.Ticks}",
         DeviceId = deviceid.ToString(),
         Humidity = data.ContainsKey("humidity") ? data["humidity"] : "",
         Temperature = data.ContainsKey("temperature") ? data["temperature"] : "",
         DateMeasured = message.EnqueuedTimeUtc.ToString("O")
     };

}

Providing your config is correct you should be able to run this in the Visual Studio debugger and view your data in Table Storage:

image

I mentioned at the start that I wanted to pass some messages onto the Azure Service Bus. For example, we may want to do something if the humidity goes above 60 percent. In this example we could add a HighHumidity message to Service Bus for some other service or function to respond to. We’ll send the message as a JSON string so that we can action it later in a different service. You can easily add a Service Bus output binding to your function; however, the binding documentation shows it as another return value. There is an alternative binding that allows you to set a message string out parameter with the message contents. This can be used as follows:

    [FunctionName("Function1")]
     [return: Table("yourtablename", Connection = "StorageConnectionAppSetting")]
     public static TempHumidityIoTTableEntity Run([IoTHubTrigger("messages/events", Connection = "ConnectionStringSetting")]EventData message,
         [ServiceBus("yourQueueOrTopicName", Connection = "ServiceBusConnectionSetting", EntityType = EntityType.Topic)]out string queueMessage,
         TraceWriter log)
     {
         var messageAsJson = Encoding.UTF8.GetString(message.GetBytes());
         log.Info($"C# IoT Hub trigger function processed a message: {messageAsJson}");

        var data = JsonConvert.DeserializeObject<Dictionary<string, string>>(messageAsJson);

        var deviceid = message.SystemProperties["iothub-connection-device-id"];

        queueMessage = null;
         if (data.ContainsKey("humidity"))
         {
             int humidity = int.Parse(data["humidity"]);

            if (humidity > 60)
             {
                 Dictionary<string, string> overHumidityThresholdMessage = new Dictionary<string, string>
                 {      
                     { "deviceId",deviceid.ToString()},
                     { "humidity", humidity.ToString()},
                     {"message", "HighHumidityThreshold" }
                 };
                 queueMessage = JsonConvert.SerializeObject(overHumidityThresholdMessage);
             }
         }

        return new TempHumidityIoTTableEntity
         {
             PartitionKey = deviceid.ToString(),
             RowKey = $"{deviceid}{message.EnqueuedTimeUtc.Ticks}",
             DeviceId = deviceid.ToString(),
             Humidity = data.ContainsKey("humidity") ? data["humidity"] : "",
             Temperature = data.ContainsKey("temperature") ? data["temperature"] : "",
             DateMeasured = message.EnqueuedTimeUtc.ToString("O")
         };

    }

We now have a function that reads the device temperature and humidity readings into table storage and then sends a message to a Service Bus Topic if the humidity goes above a threshold value.

Generating your IoT Hub Shared Access Signature for your ESP 8266 using Azure Functions

In my last two posts I showed how you can connect your ESP 8266 to the IoT hub to receive messages from the hub and also to send messages. One of the issues I had was generating the Shared Access Signature (SAS) which is required to connect to the IoT hub. I was unable to generate this on the device so I decided to use Azure Functions. The code required is straightforward and can be found here: https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-security#security-tokens

To create an Azure Function, go to the Azure management portal, click the menu icon in the top left and select “Create a Resource”.

image

Search for “Function”

image

and select “Function App” and click Create

image

Complete the form

image

Click Review and Create to accept the defaults, or click Next and work through the wizard if you want to change the default values.

image

Click Create to kick off the deployment of your new Azure Function. Once the deployment is complete, navigate to the Function by clicking “Go To Resource”. You now need to create your function.

Click the + sign next to “Functions”. I used the In-portal editor as it was the easiest to use at the time as I already had most of the code copied from the site mentioned above.

image

Click In-Portal, then Continue and choose the Webhook + API template and click Create

image

Your function is now ready for editing. It will have some default code in there to give you an idea how to start

image


We’re going to use the previous SAS code in here and modify it to accept a json payload with the parameters you need for the SAS to be created.

The json we’ll use is as follows:

{
     "resourceUri":"[Your IOT Hub Name].azure-devices.net/devices/[Your DeviceId]",
     "expiryInSeconds":86400,
     "key":"[SAS Key from IoT hub]"
}

You can get your SAS key from the IoT hub in the Azure Portal in the devices section. Click on the device.

image

Then copy the Primary or Secondary key.

Back to the function. In the editor Paste the following code:

C# function

#r "Newtonsoft.Json"

using System;
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
using System.Globalization;
using System.Net.Http;
using System.Security.Cryptography;
using System.Text;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
     log.LogInformation("C# HTTP trigger function processed a request.");

     string token = "";

     try
     {
          string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
          dynamic data = JsonConvert.DeserializeObject(requestBody);
          int expiryInSeconds = (int)data?.expiryInSeconds;
          string resourceUri = data?.resourceUri;
          string key = data?.key;
          string policyName = data?.policyName;

          TimeSpan fromEpochStart = DateTime.UtcNow - new DateTime(1970, 1, 1);
          string expiry = Convert.ToString((int)fromEpochStart.TotalSeconds + expiryInSeconds);

          string stringToSign = WebUtility.UrlEncode(resourceUri) + "\n" + expiry;

          HMACSHA256 hmac = new HMACSHA256(Convert.FromBase64String(key));
          string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

          token = String.Format(CultureInfo.InvariantCulture, "SharedAccessSignature sr={0}&sig={1}&se={2}", WebUtility.UrlEncode(resourceUri), WebUtility.UrlEncode(signature), expiry);

          if (!String.IsNullOrEmpty(policyName))
          {
               token += "&skn=" + policyName;
          }
     }
     catch (Exception ex)
     {
          return (ActionResult)new OkObjectResult($"{ex.Message}");
     }

     return (ActionResult)new OkObjectResult($"{token}");
}

Click Save and Run and make sure that there are no compilation errors. To use the function you need to post the json payload to the following address:

https://[your Function Name].azurewebsites.net/api/HttpTrigger1?code=[your function access key]

To retrieve your function access key, click Manage and copy your key from the Function Keys section

image
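You can test the function from PowerShell before wiring it into the device; a quick sketch (the hub name, device id, key, function name and access key are all placeholders):

$body = '{"resourceUri":"[Your IoT Hub Name].azure-devices.net/devices/[Your DeviceId]","expiryInSeconds":86400,"key":"[SAS Key from IoT hub]"}'
Invoke-RestMethod -Method Post -Uri "https://[your Function Name].azurewebsites.net/api/HttpTrigger1?code=[your function access key]" -Body $body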

We’re now ready to use this in MicroPython on your ESP 8266. I created a function to retrieve the SAS:

def getsas(hubname, deviceid, key):
    import urequests
    import ujson

    dict = {}
    dict["resourceUri"] = hubname + '.azure-devices.net/devices/' + deviceid
    dict["key"] = key
    dict["expiryInSeconds"] = 86400
    payload = ujson.dumps(dict)

    response = urequests.post('https://[your function name].azurewebsites.net/api/HttpTrigger1?code=[your function access key]', data=payload)
    return response.text

In my connectMQTT() function from the first post I replaced the hard coded SAS string with a call to the getsas function. The function returns a SAS which is valid for 24 hours so you will need to retrieve a new SAS once 24 hours has elapsed.
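In other words, the hard-coded Password line in connectMQTT() becomes something like this (the hub name, device id and key are placeholders):

Password = getsas('yourIotHub', 'esp8266', '[device primary key]')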


I can now run my ESP 8266 code without modifying it to give it a new SAS each time I want to use it. I always forgot and wondered why it never worked the next time I used it. I can now both send and receive data from/to the ESP 8266 and also generate a SAS to access the IoT hub. The next step is to use the data received by the hub in an application and send action messages back to the ESP 8266 if changes are made. I look forward to letting you know how I got on with that in a future post.

Sending data from the ESP 8266 to the Azure IoT hub using MQTT and MicroPython

In my previous post I showed you how to connect your ESP 8266 to the Azure IoT hub and be able to receive messages from the IoT hub to turn on an LED. In this post I'll show you how to send data to the IoT hub. For this I need to use a sensor that I will read at regular intervals and then send the data back to the IoT hub. I picked a temperature and humidity sensor I had from the kit of sensors I bought.

image

This sensor is compatible with the DHT MicroPython library. In order to connect to the IoT hub, use the same connect code that is in my previous post. The difference with sending is that you need an endpoint for MQTT to send your temperature and humidity data to. The topic to send to is as follows:

devices/<your deviceId>/messages/events/

So using the same device id as in the last post then my send topic would be devices/esp8266/messages/events/

To send a message to the IoT hub use the publish method. This needs the topic plus the message you want to send. I concatenated the temperature and humidity and separated them with a comma for simplicity

import dht
import machine
import time

sensor = dht.DHT11(machine.Pin(16))

mqtt = connectMQTT()

sendTopic = 'devices/<your deviceId>/messages/events/'

while True:
    sensor.measure()
    mqtt.publish(sendTopic, str(sensor.temperature()) + ',' + str(sensor.humidity()), True)
    time.sleep(1)

The code above is all that is required to read the sensor every second and send the data to the IoT hub.

In Visual Studio Code with the Azure IoT Hub Toolkit extension installed, you can monitor the messages that are sent to your IoT hub. In the devices view, right click on the device that has sent the data and select “Start Monitoring Built-in Event Endpoint”

image (Azure IoT Hub device context menu in Visual Studio Code, showing “Start Monitoring Built-in Event Endpoint”)

This then displays the messages that are received by your IoT hub in the output window

image (Visual Studio Code output window showing messages received by the IoT hub, each with a body such as "23,54" and the mqtt-retain application property set to true)

You can see in the body of the received message the temperature and humidity values that were sent.

I still need to sort out generating the Shared Access Signature and also programmatically access the data I send to the IoT hub. I hope to have blog posts for these soon.

Connecting the ESP 8266 to Azure IoT Hub using MQTT and MicroPython

Recently I was introduced to the ESP 8266 processor, which is a low cost IoT device with built-in Wi-Fi, costing around £3 - £4 for a development board. The thing that interested me (apart from price) was that the device is Arduino compatible and will also run MicroPython. The version I purchased from Amazon was the NodeMcu variant with built-in power and serial port via a microUSB port, so it makes an ideal board to start with as there are no additional components required.

clip_image001

This board however did not have MicroPython installed and that required a firmware change. The instructions were fairly straight forward and I followed this tutorial.

After installing MicroPython you can connect to the device using a terminal emulator via the USB serial port. Check in Device Manager to find the COM port number; the default baud rate is 115200. I used the Arduino Serial Monitor tool. In the terminal emulator you can press enter and you should get back the Python REPL prompt. If not, then you have the COM port or baud rate wrong.

image

You can write your Python directly in here, but it's easier to write the Python on your PC and then run it on the device. For this I use ampy.

In Command Prompt install ampy using:

pip install adafruit-ampy

This allows you to connect to your device. Close the terminal emulator to free up the COM port then type the following to list the files on your device:

ampy --port COM4 --baud 115200 ls

The MicroPython Quick Ref will summarise how to access the GPIO ports etc but in order to connect to the IoT hub you will need to configure the Wi-Fi on the device. This can be done using the network module.
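A minimal connection script using the network module might look like this (the SSID and password are placeholders):

import network

# connect to Wi-Fi in station mode and wait until we have an IP address
sta = network.WLAN(network.STA_IF)
sta.active(True)
sta.connect('your-ssid', 'your-password')
while not sta.isconnected():
    pass
print('connected:', sta.ifconfig())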

So create a new text file on your PC and write the code to connect to your Wi-Fi. To test this you can use ampy to run the python on the device:

ampy --port COM4 --baud 115200 run networking.py

It's a good idea to use print statements to help debug, as once the run has completed the output will be reflected back in your Command Prompt.

Now you are connected to Wi-Fi we can start to look at connecting to the IoT hub. I am assuming that you already have your IoT hub set up. We now need to configure your new device. Navigate to the IoT hub in your Azure Portal. In Explorers click IoT devices, then New.

image

Enter your device id, the name your device will be known as. All your devices need a name that is unique to your IoT hub. Then click Save. This will auto generate the keys needed to generate the shared access signature needed to access the IoT hub later.

image

Once created you may need to click refresh in the devices list to see your new device. Click the device and copy the primary key; you will need this later to generate the Shared Access Signature used in the connection string. In order to generate a new Shared Access Token you can use Visual Studio Code with the Azure IoT Hub Toolkit extension installed. This puts a list of devices and endpoints in the explorer view and allows you to create a new Shared Access Token. Find your device in the Devices list, right click and select Generate SAS Token For Device.

image

You will be prompted to enter the number of hours the token is valid for and the new SAS token will appear in the output window:

image

SharedAccessSignature sr=[your iothub name].azure-devices.net%2Fdevices%2Fesp8266&sig=bSpX6UMM5hdUKXHfTagZF7cNKDwKnp7I3Oi9LWTZpXI%3D&se=1574590568

The shared access signature is made up of the full address of your device, a timestamp indicating how long the signature is valid for, and the whole thing is signed. You can take this and use it to test your access to the IoT hub, so make sure you make the time long enough to allow you to test. The ESP8266 doesn't have a clock that can be used to generate the correct time, so you will need to create the SAS off board. I’m going to use an Azure Function with the code here to generate it.

Back to Python now. In order to connect to the IoT hub you will need to use the MQTT protocol. MicroPython uses umqtt.simple.

There are a few things required before you can connect.

Firstly the Shared Access Signature that you created above.

Next you will need to get the DigiCert Baltimore Root certificate that IoT Hub uses for SSL. This can be found here. Copy the text from -----BEGIN CERTIFICATE----- to -----END CERTIFICATE-----, including both the Begin and End lines. Remove the quotes and replace the \r\n with real new lines in your text editor, then save the file as something like baltimore.cer.

Next you will need a ClientId. For IoT hub the ClientId is the name of your device in IoT Hub. In this example it is esp8266

Next you will need a Username. For IoT hub, this is the full hostname of your IoT Hub with your client id and an API version, e.g. [your iothub name].azure-devices.net/esp8266/?api-version=2018-06-30

The following code should allow you to connect to the IoT Hub:

def connectMQTT():
    from umqtt.simple import MQTTClient

    CERT_PATH = "baltimore.cer"
    print('getting cert')
    with open(CERT_PATH, 'r') as f:
        cert = f.read()
    print('got cert')
    sslparams = {'cert': cert}

    CLIENT_ID = 'esp8266'
    Username = 'yourIotHub.azure-devices.net/esp8266/?api-version=2018-06-30'
    Password = 'SharedAccessSignature sr=yourIotHub.azure-devices.net%2Fdevices%2Fesp8266&sig=bSpX6UMM5hdUKXHfTagZF7cNKDwKnp7I3Oi9LWTZpXI%3D&se=1574590568'

    mqtt = MQTTClient(client_id=CLIENT_ID, server='yourIotHub.azure-devices.net', port=8883, user=Username, password=Password, keepalive=4000, ssl=True, ssl_params=sslparams)

    mqtt.set_callback(lightLed)
    mqtt.connect(False)

    mqtt.subscribe('devices/esp8266/messages/devicebound/#')
    flashled(4, 0.1, blueled)

    return mqtt

set_callback requires a function which will be called when there is a device message sent from the IoT Hub. Mine just turns an LED on or off:

def lightLed(topic, msg):
     if msg== b'on':
         statusled.on()
     else:
         statusled.off()

connect(False) means that the topic this device subscribes to will persist after the device disconnects.

I’ve also subscribed the device to its device-bound topic so that any message sent to the device will call the callback function.

Now we need to have a processing loop so that we can receive the messages. The ESP8266 does not seem to run async code, so we need to call the wait_msg function to get any message back from the IoT hub.

mqtt = connectMQTT()
print('connected...')
while True:
    mqtt.wait_msg()

Save your Python as networking.py (and make sure that all the code you wrote initially to connect to Wi-Fi is included), then run ampy again:

ampy --port COM4 --baud 115200 run networking.py

Your device should now run. I’ve used LED flashes to show progress through connecting to Wi-Fi, connecting to IoT Hub and receiving a message. There is a blue LED on the board which I’ve been using, as well as a standard LED which is turned on/off based upon the device message received from the IoT Hub. The blue LED is GPIO 2.

To send a message from the IoT Hub to your device you can do this from the Azure Portal in the devices view. Click on the device, then click Message To Device.

image

Enter the Message Body (on or off) and click Send Message

image

Alternatively you can do this in Visual Studio Code by right clicking the device and selecting Send C2D Message To Device and enter the message in the box that pops up

image

In my example the LED lights when I enter on and turns off when I enter off. ampy is likely to time out during this process, but that’s OK: as we’ve put the message retrieval inside a loop the board will continue to run. To stop it running you will need to reset the board by pressing the reset button.

My next step is to sort out automatically generating the Shared Access Signature, and then I’ll look at sending data to the IoT Hub.

Azure Key Vault Logging and Events with Log Analytics

Following on from my previous blog post (http://blogs.recneps.net/post/Setting-up-Azure-Key-Vault-with-Audit-logging), which explains how to set up Azure Key Vault with logging enabled, this post explains how to access the details of these logs and also how to create an alert so you can see if, for example, someone is accessing the key vault from an unknown IP address.

Open the Azure portal and navigate to the Resource Groups section and pick the resource group that we configured last time which contains the key vault and log analytics resources

image

Click your log analytics item, to open Log Analytics.

You can then select Log Search

image

This screen allows you to create your own query or select from existing ones.

image

Selecting “All Collected Logs” will show you the logs for the last day. I’ve highlighted the areas where you can change the time period, see the query and also click on Advanced Analytics to give a richer environment for analysing your logs.

image

If you want to query just for the Key Vault Audit logs then you can use the following query:

search * | where Category=="AuditEvent"

image

This will default to a list view, but clicking the Table button will format the data in an easier to read table.

image

You can sort and filter on the column headers. This can also be achieved using the order by clause as follows:

search * |where Category=="AuditEvent"  | order by TimeGenerated desc

A blog post discussing the query language can be found here

We are interested in all calls where someone has tried to access a Secret from the key vault. For that we are looking for an AuditEvent with an OperationName of SecretGet. If we also want to restrict the columns we retrieve then you can use “project” e.g.

search * | where Category=="AuditEvent"  and OperationName == "SecretGet"
| order by TimeGenerated desc
| project TimeGenerated, OperationName, CallerIPAddress, ResultSignature, requestUri_s

image

Now we are familiar with writing queries we can look at alerting. I’d like to set up an alert when the key vault is accessed from an IP address other than the one where my application is running. This can be done as follows:

search * | where Category=="AuditEvent" and CallerIPAddress != "51.140.184.51"

This IP address is actually the Azure Portal and is shown when you view the resource group that contains the key vault. I’m using this IP address so that I will actually get an alert (at the wrong time) when my application runs.
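If you are not sure which IP addresses normally call your key vault, a quick variation on the earlier queries will summarise the audit events by caller so you can pick the addresses to exclude:

search * | where Category=="AuditEvent" | summarize count() by CallerIPAddress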

Click New Alert Rule

image

The following screen should appear

image

The Alert Target should be the Log Analytics we’ve been using and the Target Criteria (when clicked) should show the query we’ve just written

image

We need to configure the rule for when this alert should be triggered. I’m interested when at least 1 attempt has been made in the last 5 minutes to access the Key Vault from an unknown location, so I set the threshold to be zero and click Done. We’ve now configured the logic to determine when the event is fired. Now we need to say what we want to happen when it fires. Firstly we need to give the alert a name and description.

image

Now we need to configure how we are alerted. For this you need to create an action group. An action group allows you to define a collection of activities that will happen when the alert is fired. Click New Action Group

image

Action Types can be any of the following:

image

An action group can have multiple actions and you can select both email and SMS in a single action. Once you have created your Action Group you need to select it, then click “Create alert rule”.

image

Your alert is now set up and running. You can view/edit alerts by selecting Monitor in the Azure Portal

image

then click Alerts (preview) and you will be able to see the alerts that have fired.

image

Click Manage Rules to edit the alert.

When the alert is fired I will get an email containing the details of the alert.

Log Analytics is a powerful tool and whilst this series of posts has been related to auditing of Key Vault, we can use Log Analytics for a wide variety of log sources such as Application Insights. We can also use the same mechanism for alerting on these other log sources.

The next post is a video that shows you how to connect existing log files to Log Analytics.

Setting up Azure Key Vault with Audit logging

Azure Key Vault is a good way to share secrets with your partners in a way that allows you to have control over the access to each of the assets in Azure. We also need to know who is accessing the resources and from where so that we can monitor for suspicious activity. This post will talk through setting up the key vault and then configuring logging to keep track of the audit information for your certificates, keys and secrets. For each application that you want to access your resources you will need to create some credentials that the application can use.

To allow an application to access key vault an App Registration needs to be added to Azure Active Directory (AAD). This effectively sets up a username and password that the application can use for credentials.

Open the Azure portal (http://portal.azure.com) and navigate to Azure Active Directory.

Click "App registrations"

clip_image002

Then "New application registration"

clip_image001

The name needs to be unique within your AD. Select Web app / API and enter a sign-on URL. If you are not building a website then enter anything here; it might be useful to use a URL related to your existing domain with the application name appended. It doesn’t need to be a valid URL. Then click “Create”.

Once created copy the Application ID as this is equivalent to a username to be used when calling the Key Vault in code. You now need to create the password.

Click Settings then Keys

clip_image002

clip_image004

clip_image006

Enter a name in the description field and select a duration, then click Save. The new key value will be displayed. You will need to copy this as it will not be visible again once you leave this page. This will be used as the password.

clip_image002[5]

Now create the Key Vault. To do that it is a good idea to put it in a specific resource group, especially if you are creating a set of resources that the key vault is going to access or if you are going to setup third party access. Once the Resource Group has been created, select it and add a Key Vault. When the Create Key Vault panel appears, click Access Policies, click "Add new"

clip_image004[5]

Pick the application you just created in AAD and select Get in Secret permissions, click Save, then go back to the main Key Vault pane and click Create.

You have just given the application we created earlier access to just retrieving secrets. As you can see from the access policy you can give the application permissions to access a combination of Keys, Secrets and Certificates with the minimum access of Get. The Key Vault security is at the vault level and you cannot protect individual secrets at the user level. By granting only Get access on the Secret the application will not be able to list the Secrets available and will only be able to retrieve secrets it knows the names of.

Now the Key vault is set up and can be accessed, we want to know who is accessing the vault and from where. Out of the box this is not enabled and requires additional configuration and resources to allow us to be able to retrieve this audit information. This is achieved by enabling diagnostic logs in the Key Vault.

Before you can enable this you need to create a new storage account in this resource group to store the logs, then add Application Insights to the resource group

clip_image002[7]

Once these have been provisioned, navigate to the Key Vault you just created & click Diagnostic logs

clip_image004[7]

Click "Turn on diagnostics"

clip_image006[6]

Select “Archive to Storage Account” and Pick the storage account you’ve just created

Select “Send to Log Analytics” and Create a new OMS workspace in your resource group

clip_image008

Once created select this for Log Analytics

clip_image009

Select the AuditEvent log and click Save.

Now any changes to the Key Vault plus any access from your application will be logged and visible via log analytics. There’s a 10 – 15 minute delay between accessing the Key Vault and the log appearing.

To add a Secret to the vault, navigate to the vault, click Secrets then Add.

clip_image010

Select Manual from the Upload options, enter a name and the secret

clip_image011

Remember the name you gave the Secret as you will need this in your code when accessing the key vault. This secret will now have a unique identifier that you will use. The one I’ve just created is:

https://recneps-vault.vault.azure.net/secrets/recnepssvsb-key

You should see in the logs this secret being created and also when it gets accessed.

Accessing the KeyVault in C# can be seen here: https://docs.microsoft.com/en-us/azure/key-vault/key-vault-use-from-web-application

The application in the example uses settings as defined below:

ClientID is the Application ID we created in the application registration in AD

ClientSecret is the key you created (that you had to save as it wasn’t visible again) as part of creating the application registration in AD.

Each Key, Secret and Certificate has a unique url which is used as the SecretURI e.g. https://recneps-vault.vault.azure.net/secrets/recnepssvsb-key
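If you want to sanity check these settings before writing any C#, something like the following PowerShell sketch should work. It calls the REST endpoints directly rather than using the library from the article; the tenant id and api-version are assumptions you will need to adjust for your environment:

# Sketch: acquire a token with the client credentials flow and read a secret
$tenantId     = "<your AAD tenant id>"                       # assumption - your directory id
$clientId     = "<Application ID from the app registration>"
$clientSecret = "<the key you saved earlier>"
$secretUri    = "https://recneps-vault.vault.azure.net/secrets/recnepssvsb-key"

$body = @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    resource      = "https://vault.azure.net"
}
$token = (Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" -Body $body).access_token

# The api-version may differ for your vault
$secret = Invoke-RestMethod -Uri "${secretUri}?api-version=2016-10-01" -Headers @{ Authorization = "Bearer $token" }
$secret.value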

You now have your key vault set up with audit logging and are able to access it. My next blog post will talk you through how to access the logs and also how to set up alerting.

Adding Rigour to an AzureML Web Service Deployment

As a developer I want an automated mechanism to build, test and deploy everything I do, so when my team and I came to implementing something with AzureML it was the first thing I was challenged on. We have a group of analysts who want to use AzureML to build part of our system without having to translate their requirements into a language that we developers can use to build a system in code. I’d been playing around with AzureML and deployed a few services as part of a proof of concept but hadn’t looked at how we could automate the deployment process and add some control. We didn’t want a mechanism that would allow the analysts to build and deploy a new model to production without some way to check whether what they had built was fit for purpose. After a bit of research I found that we could export both the experiment and the web service as JSON from AzureML. Exporting both the experiment and the web service definitions would allow us to version control the source. We could also import these definitions to allow us to move the experiment and web service to different subscriptions. Exporting and importing the experiment was relatively straightforward using PowerShell.

Export-AmlExperimentGraph & Import-AmlExperimentGraph

The full PowerShell to export and import the experiment is:

Get-AmlWorkspace -ConfigFile .\config-source.json
$exp = Get-AmlExperiment -ConfigFile .\config-source.json | where Description -eq '<ExperimentName>'
Export-AmlExperimentGraph -ExperimentId $exp.ExperimentId -OutputFile 'C:\experiment.json' -ConfigFile .\config-source.json

Get-AmlWorkspace -ConfigFile .\config-dest.json
Import-AmlExperimentGraph -InputFile 'C:\experiment.json' -ConfigFile .\config-dest.json

Note: This relies on two config files, one to identify the source workspace and the other to identify the destination workspace. These are in the following formats:

{
  "Location": "West Europe",
  "WorkspaceId": "<WorkspaceId>",
  "AuthorizationToken": "<AuthorizationToken>"
}

You can find the workspace id and authorization tokens by opening your AzureML workspace then clicking settings.

clip_image002

clip_image004

Exporting the experiment as JSON allows you to add it to source control and version the experiment.

Importing the experiment into a new Workspace will allow you to open the workspace and view/edit the experiment and also to manually deploy the web service. This on its own will give you some control over the web service and allow you to control how and when it gets deployed and will stop the analytics team from accidentally deploying something to production and potentially breaking the system.

It is also possible to export a deployed web service and then import it into a new subscription.

When we came to trying to export the web service we ran into a few issues. Firstly there seemed to be a number of ways to export the web service definition and they seemed to produce different JSON.

https://github.com/ritwik20/AzureML-WebServices

https://github.com/hning86/azuremlps#export-amlwebservicedefinitionfromexperiment

https://docs.microsoft.com/en-us/powershell/module/azurerm.machinelearning/export-azurermmlwebservice?view=azurermps-2.2.0

We settled for the last link but also took some missing information from the first link. The process for exporting is a little more complicated.

Firstly, the web service definition needs to be exported as a JSON file. The web service export/import uses Resource Manager and requires a different login mechanism from the experiment export/import. I used:

Login-AzureRmAccount -SubscriptionId <My Subscription ID>

This uses the interactive login, so if you want to automate it you need to use a service principal.
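For reference, a non-interactive login looks roughly like this, assuming you have already created a service principal (the ids and key below are placeholders):

$secureKey  = ConvertTo-SecureString "<service principal key>" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential("<application id>", $secureKey)
Login-AzureRmAccount -ServicePrincipal -Credential $credential -TenantId "<tenant id>"
Select-AzureRmSubscription -SubscriptionId "<My Subscription ID>"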

To export the web service:

$webService = Get-AzureRmMlWebService -Name "Source Service Name" -ResourceGroupName "Source Resource Group Name" 
Export-AzureRmMlWebService -WebService $webService -OutputFile 'C:\wsexport.json' 

The JSON then needs editing to add in the new storage account and commitment plan.

The JSON can then be imported into the new subscription using New-AzureRmMlWebService.

Changing the JSON to add in the commitment plan id seemed to cause problems and I kept getting the error “Commitment Plan ID must be provided”. This error was confusing as I was including the commitment plan id in the configuration and I thought that I had it in the correct place. If you open the JSON that you export and find the storage account node then you will need to overwrite this with:

"storageAccount": {
    "name": "<StorageAccountName>",
    "key": "<StorageAccountKey>"
},
"commitmentPlan": {
    "id": "/subscriptions/<Subscription-ID>/resourceGroups/<ResourceGroupName>/providers/Microsoft.MachineLearning/commitmentPlans/<CommitmentPlanName>"
},

My issue was that I had the commitment id wrong rather than in the wrong place and I found the correct id using:

Get-AzureRmMlCommitmentPlan

Once I had this configured correctly the import worked without any errors using:

New-AzureRmMlWebService -Force -ResourceGroupName "New Resource Group Name" -Name "Web Service Name" -Location "West Europe" -DefinitionFile "C:\wsexport.json" 

As we’ve used PowerShell to automate the export and import we could then easily script the config file edits and wire this in to an automated test and deploy process. We can write tests that check the web service parameters have not been changed by the analytics team and that the returned data is in the correct format, so we can ensure that the deployed service will at least function correctly. It’s more difficult to check whether the actual machine learning process is correct, but we will know when the interfaces are broken. We have also version controlled both the experiment and web service JSON so we can easily roll back if necessary. We decided that we needed both: we didn’t need to copy the experiment to a new workspace and automate a web service deploy from it, we just needed the web service in the new subscription, but we wanted the version control for the experiment too.

Using Bots For Form Filling

After watching James Mann’s talk on the bot framework at DDD I decided to look at whether the Bot framework can offer an alternative to web pages for filling in forms. The Microsoft Bot Framework is a framework that helps you build bots that can run in a variety of messaging apps such as Skype, Skype For Business, Facebook and Slack. There are a number of ways to help you build the bot and you can interface with the Cognitive Services to add intelligence to your bot using services such as LUIS.

The simplest mechanism for retrieving data from a user is to use FormFlow within the BotBuilder SDK. This streamlines the creation of a bot to collect data from a user. In order to start building Bots you will need Visual Studio 2017 installed along with the Bot Builder extensions. This quick start will help you get the prerequisites sorted. The FormFlow example creates a Bot that will walk you through ordering a sandwich, and the advanced example adds optional ingredients along with some simple terms you can use to make it easier to pick multiple items from a list.

In order to debug and test your Bot there is a local emulator that allows you to run your Bots in a local test messaging app. You will need to download and install this emulator prior to testing.

I followed the examples through and found that there is a compilation error with the sample code in the FormBuilder section:

image

This conversation on stackoverflow seems to resolve the issue.

image

When you run the Bot in the Visual Studio debugger, it creates a web service and you need to use the url of this service in the Bot emulator which is run separately.

image

I need to take the URL http://localhost:3979/, add /api/messages and use it to configure the Bot emulator

image

Now click Connect. Your Bot is now ready to test. To start the Bot you need to type a message. It doesn’t matter what you type, but “Hello” is probably a polite way to start.

image

To interact with the Bot you can either enter the number of the sandwich you want or type something and the Bot will try to make sense of what you type. If you type some text that does not appear in the list, like “chickn”, then the Bot won’t understand, but typing “chicken” will give you all the matching answers to reduce your options. You also need to type whole words, so “Chick” won’t match whereas “Chicken” will.

image

The FormBuilder example allows you to add validation and also custom code. The custom code allows you to ask for specific toppings and also say “everything except olives pickles”

image

There is also validation included in the FormBuilder example and it allows you to check to make sure that the address starts with a number

image

image

Optional stages can be added. For example if you pick Foot Long then you get a free cookie or drink

image

FormFlow is a good way to start with your Bot if you want to capture user input. The next stage for me is to work out how this can be applied to a more complex form filling scenario where there are dynamic lists which change based upon the input answers from previous questions.

Creating a Scheduled Web Job in Azure

It’s been a while since I’ve talked about web jobs, but they are still around. I needed to modify one of mine recently and configure it as a scheduled web job.

You can deploy your web job from the Azure Portal. Web jobs are part of App Services and are deployed by selecting your app service you want and clicking the web jobs service.

image

Click the Add button.

image

Enter a name for your web job and browse to a zip or exe containing your web job.

Select Triggered from the Type drop down. This changes the UI to allow you to select Scheduled.

image

The Schedule is triggered using a CRON expression.
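The expression has six fields, including seconds, so for example 0 */15 * * * * fires every 15 minutes and 0 0 9 * * 1-5 fires at 09:00 on weekdays.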

Alternatively, configure the web job as a manual trigger, then use the Azure Scheduler to trigger the web job. When you have configured your web job, click on its properties in the Azure Portal and you should see a webhook URL:

https://yourwebapp.scm.azurewebsites.net/api/triggeredwebjobs/yourwebjob/run
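If you want to test the webhook before wiring it into the scheduler, you can call it from PowerShell. This is a rough sketch; the user name and password come from the web app's publish profile and the values below are placeholders:

$user = '$yourwebapp'                                     # deployment user from the publish profile
$pass = '<deployment password from the publish profile>'
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("${user}:${pass}"))
Invoke-RestMethod -Method Post -Uri "https://yourwebapp.scm.azurewebsites.net/api/triggeredwebjobs/yourwebjob/run" -Headers @{ Authorization = "Basic $auth" }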

Now create a new scheduler Job

image

Select the method as Post and paste in the webhook URL. Once you've completed this configuration you can then configure the schedule.

image

Using the scheduler allows you to configure retry policies and also error actions

Azure Relay Hybrid Connections

If you are using the Azure App Service to host your web site and you want to connect to an on-premises server, then there are a number of ways you can do this. One of the simplest is to use a hybrid connection. Hybrid connections have had a bit of a revamp lately: they used to require a BizTalk service to be created, now you just need a Service Bus Relay. You can generally use the hybrid connection to communicate with your back end server over TCP, and you will need to install an agent called the Hybrid Connection Manager (HCM) on your server (or a server that can reach the one you want to connect to). HCM makes an outbound connection to the Service Bus Relay over ports 80 and 443, so you are unlikely to need firewall ports changing.

Hybrid connections are limited to a specific server name and port and your code in the Azure App Service will address the service as if it was in your local network, but will only be able to connect to the machine and port configured in the Hybrid Connection. Instructions for configuring your hybrid connection and HCM are here.

I have setup a number of the old BizTalk style hybrid connections and the new way is a lot easier to do. I ran into a few connectivity issues when I first created the Relay hybrid connection and there were a few things I found that helped me to find out where the issues were. Firstly the link I provided to configure the hybrid connection has a troubleshooting section which talks about tcpping. You can run this in the debug console in Azure and it will check to see if your HCM is talking to the same relay as the one in your app service. To get to the debug console, log in to your azure portal, select the app service you want to diagnose. Scroll down to Advanced Tools and click Go.

image

This will take you to your Kudu dashboard where you can do a lot of nice things, such as process explorer, diagnostic dumps, log streaming and debug console

The address will be https://[your namespace].scm.azurewebsites.net/

The debug console will allow you to browse and edit files directly in your application without the need to ftp. This is really useful when trying to check configuration issues.
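Once you are in the debug console, the tcpping check mentioned above looks something like this (the server name and port are placeholders for your backend service):

tcpping yourbackendserver:8080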

If you want to check connectivity from your server machine to the Azure Relay then you can use telnet. You might need to add the telnet feature to Windows by using:

dism /online /Enable-Feature /FeatureName:TelnetClient (From https://www.rootusers.com/how-to-enable-the-telnet-client-in-windows-10/)

In a command prompt type:

telnet [your relay namespace].servicebus.windows.net 80 or

telnet [your relay namespace].servicebus.windows.net 443

Then a blank screen denotes successful connectivity (from: https://social.technet.microsoft.com/wiki/contents/articles/2055.troubleshooting-connectivity-issues-in-the-azure-appfabric-service-bus.aspx)

You can also use PowerShell to check:

Test-Netconnection -ComputerName [your relay namespace].servicebus.windows.net -Port 443

This all checks that you are connected to the relay. The final thing you need to check is whether you can actually resolve the DNS name of the target service from the server where HCM is running. This needs to be the host name of the server and not the fully qualified name, and it also needs to match the machine name you configured in the hybrid connection. The easiest way to do this for me was to put the address of the WCF service I wanted to connect to into a browser on the machine running HCM.
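You can also check the name resolution from a PowerShell prompt on the machine running HCM, for example (the host name here is a placeholder):

Resolve-DnsName yourbackendserver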

Hopefully I’ve given you a few pointers to help identify why your hybrid connection does not connect.

Custom ASP.NET MVC app running in a Container on Service Fabric

In an earlier post, I talked about how to create a Docker container on Windows that housed a custom ASP.Net MVC app. What I want to show now is how you can get this container running in Service Fabric.

I created 3 identical virtual machines all capable of running Docker as in my earlier post. Now I needed to make my three VMs into a Service fabric cluster. These two posts explain how:

https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-get-started

https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-cluster-standalone-deployment-preparation

My 3 VMs are called sf0, sf1 & sf2 and I needed to  put these into my cluster config. I picked the ClusterConfig.Unsecure.MultiMachine config file that comes with the Service Fabric files and changed it to include my 3 VMs, so my nodes look like this:

"nodes": [
{
      "nodeName": "sf0",
      "iPAddress": "sf0",
      "nodeTypeRef": "NodeType0",
      "faultDomain": "fd:/dc1/r0",
      "upgradeDomain": "UD0"
},
{
      "nodeName": "sf1",
      "iPAddress": "sf1",
      "nodeTypeRef": "NodeType0",
      "faultDomain": "fd:/dc2/r0",
      "upgradeDomain": "UD1"
},
{
      "nodeName": "sf2",
      "iPAddress": "sf2",
      "nodeTypeRef": "NodeType0",
      "faultDomain": "fd:/dc3/r0",
      "upgradeDomain": "UD2"
}
],

I then remoted onto one of the machines and ran the following PowerShell:

.\TestConfiguration.ps1 -ClusterConfigFilePath .\ClusterConfig.json

This will check all the machines in the ClusterConfig.json file to see if they are configured correctly and report any errors. I got the following error:

Machine 'sf2' is not reachable on port 445. Check connectivity/open ports. Error: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond 192.168.1.222:445

This meant I needed to open the correct firewall ports on my VM. I got this error for all the machines in the cluster. Once I fixed this and reran the PowerShell, the tests passed which meant I could install Service Fabric on each of the machines as follows:

.\CreateServiceFabricCluster.ps1 -ClusterConfigFilePath .\ClusterConfig.json –AcceptEULA

When this completes successfully you should see something like this:

Your cluster is successfully created! You can connect and manage your cluster using Microsoft Azure Service Fabric Explorer or Powershell. To connect through Powershell, run 'Connect-ServiceFabricCluster

I could connect to Service Fabric Explorer using: http://sf0:19080
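The PowerShell equivalent is something like this, assuming the default client connection port of 19000:

Connect-ServiceFabricCluster -ConnectionEndpoint sf0:19000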

Now I have my cluster running I needed to create a Service Fabric App and deploy it to the cluster. Make sure that you have installed the Service Fabric SDK, then run Visual Studio. Create a new Service Fabric project. When the project is created, right click on the services node, Select Add->New Service Fabric Service

clip_image001[4]

Then pick Guest Container and enter the name in your Docker Hub repository where your Docker image resides.

clip_image001

This will add in the necessary files to your service fabric project. If you remember from my earlier post, the website was hosted on port 8000 of the container. We need to tell service fabric about this and also we may want to map this to a different port.

If you open the container's ServiceManifest file

clip_image001[8]

Add an endpoint with the port you want Service Fabric to use to publish the website.

clip_image001[10]

In this example I’m using the same port. If you want to map the port to a different one then change this to something else, e.g. if I wanted to use http://sf0:8080 as the website then I would change the Service Manifest to this:

image

You also need to tell Service Fabric about the container port that is published. This is done in the application manifest file:

image

This is set to 8000 as that is the port exposed by the Docker container
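As the screenshots don't reproduce well, here is a rough sketch of what the two manifest entries look like; the endpoint name is an assumption, so use whatever name your project generated:

<!-- ServiceManifest.xml: the endpoint Service Fabric publishes the site on -->
<Resources>
  <Endpoints>
    <Endpoint Name="sdswebTypeEndpoint" Protocol="http" UriScheme="http" Port="8080" />
  </Endpoints>
</Resources>

<!-- ApplicationManifest.xml: bind the container's port 8000 to that endpoint -->
<Policies>
  <ContainerHostPolicies CodePackageRef="Code">
    <PortBinding ContainerPort="8000" EndpointRef="sdswebTypeEndpoint" />
  </ContainerHostPolicies>
</Policies>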

Now deploy your application to service fabric. It may take a while to initialise your container as it will need to be downloaded from Docker Hub before it will run. Once it is running you should see it as Ready in the Service Fabric Explorer

image

Error updating SSL certificates in Azure App Services

I was asked to update the SSL certificates on a website that was hosted in Azure Web Apps. No problem I thought.

Go to the Azure Portal,

Select the website you want to update.

When the blade appears scroll down the left panel and select SSL Certificates

image

image

 

Remove the binding by clicking … at the end of the binding row and selecting Delete.

image

Now remove the certificate by clicking .. at the end of the certificate row and select Delete

This is where I got an error

image

It took a short while to resolve this.

I tried a few things like restarting the site & checking the staging slot, but I still got the error. Finally, I checked other sites in the same app service plan and I had the same certificate used for another Web App (both using the same domain url). Once I removed the binding from that site, I could delete the certificate and upload a new one. I had to then add the new bindings to both sites.

Custom ASP.NET MVC app running in a Windows Container

With the introduction of Windows Containers on Windows Server 2016 and the ability to run containers in Service Fabric, I thought it was time to investigate Windows Containers and I wanted to know how to build one that will run a web site using IIS.

As I’m new to containers, although I’ve done a very similar exercise with Docker on Linux, I decided to follow the Windows Quick Start Guide. I hit a few problems early on so I’ve put the steps I followed here:

After opening a PowerShell window as administrator I ran the following commands:

Install-Module -Name DockerMsftProvider -Repository PSGallery -Force – Ran OK
Install-Package -Name docker -ProviderName DockerMsftProvider – Had an error
WARNING: Cannot verify the file SHA256. Deleting the file.
WARNING: C:\Users\ADMINI~1.DEV\AppData\Local\Temp\DockerMsftProvider\Docker-1-12-2-cs2-ws-beta.zip does not exist
Install-Package : Cannot find path 'C:\Users\ADMINI~1.DEV\AppData\Local\Temp\DockerMsftProvider\Docker-1-12-2-cs2-ws-beta.zip' because it does not exist.

 

Not sure what was causing this to fail but I followed the instructions to manually install (from https://github.com/OneGet/MicrosoftDockerProvider/issues/15)

Start-BitsTransfer -Source https://dockermsft.blob.core.windows.net/dockercontainer/docker-1-12-2-cs2-ws-beta.zip -Destination /docker.zip
Get-FileHash -Path /docker.zip -Algorithm SHA256
mkdir C:\Users\Administrator\AppData\Local\Temp\DockerMsftProvider\
cp .\docker.zip C:\Users\Administrator\AppData\Local\Temp\DockerMsftProvider\
cd C:\Users\Administrator\AppData\Local\Temp\DockerMsftProvider\
cp .\docker.zip Docker-1-12-2-cs2-ws-beta.zip
Install-Package -Name docker -ProviderName DockerMsftProvider -Verbose
Restart-Computer -Force

After Rebooting I tried to download and run a sample container

docker run microsoft/dotnet-samples:dotnetapp-nanoserver

but I got the following error

docker : C:\Program Files\Docker\docker.exe: error during connect: Post http://%2F%2F.%2Fpipe%2Fdocker_engine/v1.25/containers/create: open //./pipe/docker_engine: The

system cannot find the file specified..

At line:1 char:1

+ docker run microsoft/dotnet-samples:dotnetapp-nanoserver

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

+ CategoryInfo : NotSpecified: (C:\Program File...ile specified..:String) [], RemoteException

+ FullyQualifiedErrorId : NativeCommandError

It turns out that the docker service wasn’t running after the reboot, so open services.msc and find the docker service to start it.

Running the same command again will download the image from docker hub, create a container from it and then run it. As this is a visual container it runs once and then stops.

clip_image001

Every time I do docker run microsoft/dotnet-samples:dotnetapp-nanoserver it creates a new container. What I want to do is to run one that is stopped and view the output on the screen.

For this I needed to start the container I had already created. If you run

docker ps -a

This will list all the containers that are both running and stopped, and you can see from the image below that I had run docker run a number of times. Each time it tried to download the image (which was already downloaded) and then created a new container from it.

clip_image001[6]

docker container start -a 12d382ae0bd6 (-a attaches STDOUT so you can see the output)

clip_image001[8]

Now I know how to create and start containers, I wanted to build one of my own. This is easier than I first thought as there are lots of base templates stored on Docker Hub and GitHub.

https://docs.microsoft.com/en-us/virtualization/windowscontainers/samples#Application-Frameworks

https://hub.docker.com/r/microsoft/

I picked one on docker hub that has IIS and ASP.Net installed already, so all I needed to do after was to add my own website and configure IIS correctly. Using docker pull,

docker pull microsoft/aspnet (see https://hub.docker.com/r/microsoft/aspnet/)

This retrieves the template from Docker Hub and I want to use that template to install my ASP.Net MVC site and configure IIS to serve the pages on port 8000. Following the instructions here (https://docs.microsoft.com/en-us/dotnet/articles/framework/docker/aspnetmvc), I published my MVC site and copied the publish folder to my Docker machine. Then I needed to create a Dockerfile recipe to instruct Docker what to install in my image, so I created a folder that contained the Dockerfile and also the published website, as below

clip_image001[10]

The contents of the Dockerfile are:

# The FROM instruction specifies the base image. You are
# extending the microsoft/aspnet image.
FROM microsoft/aspnet
# Next, this Dockerfile creates a directory for your application
RUN mkdir C:\sdsweb
# configure the new site in IIS.
RUN powershell -NoProfile -Command \
Import-module IISAdministration; \
New-IISSite -Name "sdsweb" -PhysicalPath C:\sdsweb -BindingInformation "*:8000:"
# This instruction tells the container to listen on port 8000.
EXPOSE 8000
# The final instruction copies the site you published earlier into the container.
ADD sdswebsource/ /sdsweb

Now I need to run this to create the image

In PowerShell, I changed directory to the folder containing my Dockerfile, then ran

docker build -t sdsweb .

This has created an image and we need to now get this running as a container

clip_image001[12]

using docker run again

docker run -d -p 8000:8000 --name sdsweb sdsweb

clip_image001[14]

My container is now running and I should be able to view the web pages in my browser on port 8000, but I need to know the IP address first

docker inspect -f "{{ .NetworkSettings.Networks.nat.IPAddress }}" sdsweb

clip_image001[16]

Now I can browse to http://172.17.97.235:8000

clip_image001[18]

I changed the default web page to show the machine name serving the pages under Getting Started. Listing the containers will show the container ID and this matches the machine name displayed on the web page

image

That’s it running in a container. There are a couple more things I’d like to do before I’ve finished. The first is to make sure that when my Windows Server restarts, my sdsweb container also starts. At the moment it will not start as I didn’t add a restart parameter when I called docker run. Adding --restart always will cause the container to restart when Windows restarts.

docker run -d -p 8000:8000 --name sdsweb --restart always sdsweb

The final thing I want to do is to be able to share this image, so I’d like to push it up to Docker Hub.

docker login - enter username and password
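Note that I built the image locally with the name sdsweb, so depending on how you built yours you may need to tag it with your Docker Hub repository name before the push will work:

docker tag sdsweb recneps/sdsweb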

docker push recneps/sdsweb

clip_image001[20]

Then to use it on another machine

docker pull recneps/sdsweb

clip_image002

In my next post I am going to look at how I can create a container that can be hosted in Service Fabric