Steve Spencer's Blog

Blogging on Azure Stuff

Custom Application Roles in Azure AD

Previously I’ve talked about how you can control access to your web applications in Azure AD (Part 1 & Part 2) and also how to use Role Based Access Control (RBAC) to manage access to resources in Azure. This post builds on those posts and shows you how to create your own custom roles for use in your web applications and how to use these roles to control access to parts of your application. I’ll use the same example used in Part 1.

Firstly you need to create the roles for your application to use, then assign the roles to users, and finally change your code to make it role aware.

To add roles to your application, navigate to the Azure portal and click on Azure Active Directory and App Registrations. Select the web app you created previously.

Click on “App roles | Preview” then “Create App Role”

Enter the role information and click Apply and repeat for all the roles you require.

You should now see your roles listed in the grid.

I’ve added a standard user role and a test administrator role.
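
Behind the portal UI, each role becomes an appRoles entry in the App Registration manifest. As a rough sketch, the test administrator role above would look something like this (the id is a generated GUID, and the description and display name will be whatever you entered):

{
  "allowedMemberTypes": [ "User" ],
  "description": "Test administrator role",
  "displayName": "Test Administrator",
  "id": "00000000-0000-0000-0000-000000000000",
  "isEnabled": true,
  "value": "Test.Admin"
}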

To assign these roles to users, navigate to the Enterprise Applications blade and click your application, then select “Users and groups”.

To add roles to an existing assigned user, tick the user and then click “Edit”.

Select the role and click “Select”

You should now see the role assigned to the user. Similarly, to add a role to a new user, click “Add user”.

This time you need to select the new user and then the role.

You can add multiple roles to a user by repeating the Add user process.

Here I’ve added both new roles to one user.

You are now ready to use these roles in your application.

The sample code already shows how to view the claims for a user.

When I sign in to the application with the user that has two roles, I see the corresponding entries in the claims table.

Adding the roles to the application and assigning them to a user is enough to make them appear as role claims in your application when the user signs in. Note that there is a limit to the number of roles an application can have: roles are stored in the manifest of the App Registration, and the manifest is limited to 1200 items in total, which includes all the configuration items, not just roles.
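
As a minimal sketch of what that looks like from code, the roles arrive as ordinary role claims on the signed-in user, so you can enumerate them like any other claim (this controller action is illustrative and not part of the sample):

using System.Linq;
using System.Security.Claims;
using System.Web.Mvc;

public class RolesController : Controller
{
    // Lists the role claim values that Azure AD issued for the signed-in user.
    public ActionResult Index()
    {
        var identity = (ClaimsIdentity)User.Identity;
        var roles = identity.Claims
                            .Where(c => c.Type == identity.RoleClaimType)
                            .Select(c => c.Value);
        return Content(string.Join(", ", roles));
    }
}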

There are a number of ways in which you can use roles in code. Firstly, in your views you can add conditional code to limit what a standard user can see:

@if (Request.IsAuthenticated && User.IsInRole("Test.Admin"))
{
    <h2>You Are An Administrator</h2>
    <br />
}

When you sign in with the Test.Admin role you will see this additional text, which is not visible for the User role.

You can also control access at the controller and controller action levels by adding the Authorize attribute on the controller or controller action:

[Authorize(Roles = "Test.Admin")]
public class ClaimsController : Controller
{

or for multiple roles

[Authorize(Roles = "Test.Admin,User")]
public class ClaimsController : Controller
{

or at the action level:

[Authorize(Roles = "Test.Admin")]
public async Task<ActionResult> Index()
{
}

or both

[Authorize(Roles = "Test.Admin,User")]
public class ClaimsController : Controller
{
    [Authorize(Roles = "Test.Admin")]
    public async Task<ActionResult> Index()
    {
    }

    [Authorize(Roles = "User")]
    public async Task<ActionResult> Index2()
    {
    }
}

In this example the user needs either the User or Test.Admin role to access the controller but only the Test.Admin role can access the Index action and the User role can access the Index2 action. This allows you to put controls in at multiple levels and provide a more custom experience for your users.

App Roles make it easy to add custom roles to your application. If you have an Azure AD Premium subscription you can also assign these roles to groups and assign the groups to the applications. This means that you can assign roles by adding users to groups rather than assigning them to each individual user: I can have a standard user group that has the User role assigned, and all users in that group will have the User role passed through to the application.

You now have Role Based Access control in your Azure AD application and can start to build your application features out based on the roles you define.

Custom ASP.NET MVC app running in a Windows Container

With the introduction of Windows Containers in Windows Server 2016 and the ability to run containers in Service Fabric, I thought it was time to investigate Windows Containers, and I wanted to know how to build one that will run a web site using IIS.

As I’m new to Windows containers, although I’ve done a very similar exercise with Docker on Linux, I decided to follow the Windows Quick Start Guide. I hit a few problems early on, so I’ve put the steps I followed here:

After opening a PowerShell window as administrator I ran the following commands:

The first command ran OK:

Install-Module -Name DockerMsftProvider -Repository PSGallery -Force

but the second one failed:

Install-Package -Name docker -ProviderName DockerMsftProvider

WARNING: Cannot verify the file SHA256. Deleting the file.
WARNING: C:\Users\ADMINI~1.DEV\AppData\Local\Temp\DockerMsftProvider\Docker-1-12-2-cs2-ws-beta.zip does not exist
Install-Package : Cannot find path 'C:\Users\ADMINI~1.DEV\AppData\Local\Temp\DockerMsftProvider\Docker-1-12-2-cs2-ws-beta.zip' because it does not exist.

I wasn’t sure what was causing this to fail, so I followed the instructions to install manually (from https://github.com/OneGet/MicrosoftDockerProvider/issues/15):

Start-BitsTransfer -Source https://dockermsft.blob.core.windows.net/dockercontainer/docker-1-12-2-cs2-ws-beta.zip -Destination /docker.zip
Get-FileHash -Path /docker.zip -Algorithm SHA256
mkdir C:\Users\Administrator\AppData\Local\Temp\DockerMsftProvider\
cp .\docker.zip C:\Users\Administrator\AppData\Local\Temp\DockerMsftProvider\
cd C:\Users\Administrator\AppData\Local\Temp\DockerMsftProvider\
cp .\docker.zip Docker-1-12-2-cs2-ws-beta.zip
Install-Package -Name docker -ProviderName DockerMsftProvider -Verbose
Restart-Computer -Force

After rebooting, I tried to download and run a sample container:

docker run microsoft/dotnet-samples:dotnetapp-nanoserver

but I got the following error:

docker : C:\Program Files\Docker\docker.exe: error during connect: Post http://%2F%2F.%2Fpipe%2Fdocker_engine/v1.25/containers/create: open //./pipe/docker_engine: The system cannot find the file specified..
At line:1 char:1
+ docker run microsoft/dotnet-samples:dotnetapp-nanoserver
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (C:\Program File...ile specified..:String) [], RemoteException
    + FullyQualifiedErrorId : NativeCommandError

It turned out that the Docker service wasn’t running after the reboot, so I opened services.msc, found the Docker service and started it.

Running the same command again will download the image from Docker Hub, create a container from it and then run it. As this sample container just writes its output to the console, it runs once and then stops.

Every time I do docker run microsoft/dotnet-samples:dotnetapp-nanoserver it creates a new container. What I want to do is to run one that is stopped and view the output on the screen.

For this I needed to start the container I had already created. If you run

docker ps -a

This lists all the containers, both running and stopped, and it showed that I had run docker run a number of times. Each run used the already-downloaded image but created a new container from it.

docker container start -a 12d382ae0bd6 (-a attaches STDOUT so you can see the output)

Now that I knew how to create and start containers, I wanted to build one of my own. This is easier than I first thought, as there are lots of base images available on Docker Hub and GitHub:

https://docs.microsoft.com/en-us/virtualization/windowscontainers/samples#Application-Frameworks

https://hub.docker.com/r/microsoft/

I picked one on Docker Hub that has IIS and ASP.NET installed already, so all I needed to do was add my own website and configure IIS correctly. Using docker pull:

docker pull microsoft/aspnet (see https://hub.docker.com/r/microsoft/aspnet/)

This retrieves the image from Docker Hub, and I wanted to use it to install my ASP.NET MVC site and configure IIS to serve the pages on port 8000, following the instructions here: https://docs.microsoft.com/en-us/dotnet/articles/framework/docker/aspnetmvc. I published my MVC site and copied the publish folder to my Docker machine. Then I needed to create a Dockerfile recipe to instruct Docker what to install in my image, so I created a folder containing the Dockerfile and the published website.

The contents of the Dockerfile are:

# The FROM instruction specifies the base image. You are
# extending the microsoft/aspnet image.
FROM microsoft/aspnet
# Next, this Dockerfile creates a directory for your application
RUN mkdir C:\sdsweb
# configure the new site in IIS.
RUN powershell -NoProfile -Command \
Import-module IISAdministration; \
New-IISSite -Name "sdsweb" -PhysicalPath C:\sdsweb -BindingInformation "*:8000:"
# This instruction tells the container to listen on port 8000.
EXPOSE 8000
# The final instruction copies the site you published earlier into the container.
ADD sdswebsource/ /sdsweb

Now I need to run this to create the image

In PowerShell, I changed directory to the folder containing my Dockerfile, then ran

docker build -t sdsweb .

This has created an image, and we now need to get it running as a container

using docker run again:

docker run -d -p 8000:8000 --name sdsweb sdsweb

My container is now running and I should be able to view the web pages in my browser on port 8000, but first I need to know the IP address:

docker inspect -f "{{ .NetworkSettings.Networks.nat.IPAddress }}" sdsweb

Now I can browse to http://172.17.97.235:8000

I changed the default web page to show the machine name serving the pages under “Getting Started”. Listing the containers shows the container ID, and this matches the machine name displayed on the web page.

That’s it running in a container. There are a couple more things I’d like to do before I’ve finished. The first is to make sure that when my Windows Server restarts, my sdsweb container also starts. At the moment it will not, as I didn’t add a restart parameter when I called docker run. Adding --restart always will cause the container to restart when Windows restarts:

docker run -d -p 8000:8000 --name sdsweb --restart always sdsweb

The final thing I want to do is share this image, so I’d like to push it up to Docker Hub:

docker login (enter your username and password)

docker push recneps/sdsweb
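
One thing to note: docker push expects the image to be tagged with the repository name, so as the image was built as plain sdsweb it needs tagging first; something like:

docker tag sdsweb recneps/sdsweb
docker push recneps/sdsweb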

Then to use it on another machine

docker pull recneps/sdsweb

In my next post I am going to look at how to create a container that can be hosted in Service Fabric.

System.Web.Mvc not found after deploying to Azure Web Apps using Release Manager

I’m currently evaluating Release Manager in Visual Studio Team Services, and I am using it to deploy websites to Azure Web Apps. I recently tried to deploy an ASP.NET MVC 4 application and ran into some issues.

I’ve created a build that packages and zips up my web application, which runs successfully. I’ve linked a release pipeline to this build and I can deploy to my test Azure site without any errors, but when I try to run the web application I get the following error:

Could not load file or assembly 'System.Web.Mvc, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.

I’m using Visual Studio 2013 with MVC as a NuGet package. Looking at the properties of System.Web.Mvc I can see that it is set to Copy Local = True.

I tried a few different things to get the assembly copied, like redoing the NuGet install, and eventually I toggled Copy Local to False, saved the project file and then set it back to True. When I looked at the diff of the project file I found an additional property.
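
For reference, toggling Copy Local normally serializes as a Private element on the assembly reference in the project file; the entry ends up looking something like this sketch (the HintPath shown is illustrative and will vary with your package version):

<Reference Include="System.Web.Mvc, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35">
  <HintPath>..\packages\Microsoft.AspNet.Mvc.4.0.20710.0\lib\net40\System.Web.Mvc.dll</HintPath>
  <Private>True</Private>
</Reference>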

This seems to fix the build. When I checked the change in and rebuilt, System.Web.Mvc appeared in the zip file. The build was then released to Azure and the web app worked correctly.

Changing Website Settings through the Azure Portal

When using configuration in Microsoft Azure websites, make sure you put configuration that you are likely to change often in AppSettings. This allows you to make configuration changes in the Azure management portal rather than having to edit the web.config file directly. Examples include settings that allow you to disable site features temporarily, such as during an upgrade or routine maintenance.

App settings in the web.config file are name/value pairs and are accessed as follows:

System.Configuration.ConfigurationManager.AppSettings["StevesSetting"]

Which can be seen in the web.config as follows:

<configuration>
  .
  <appSettings>
      <add key="StevesSetting" value="Webconfig setting" />
  </appSettings>
  .
</configuration>
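
A value set in the portal overrides the one in web.config at runtime, which is what makes this useful for quick changes. As an illustrative sketch (the MaintenanceMode setting name here is invented), a temporary feature switch could be read per request like this:

using System.Configuration;

// Hypothetical maintenance-mode toggle read from AppSettings; a portal
// change takes effect without editing or redeploying web.config.
bool maintenanceMode;
bool.TryParse(ConfigurationManager.AppSettings["MaintenanceMode"], out maintenanceMode);
if (maintenanceMode)
{
    // Show a holding page instead of the normal site content.
}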

In order to manage this configuration in the portal, navigate to your website and click the Configure tab (in the old portal)

and scroll down to the app settings section, then add the setting you wish to change.

Or, in the new portal, navigate to the website, click Settings then Application settings, and scroll down to the app settings section.

Azure Websites: Blocking access to the azurewebsites.net url

I’ve been setting up one of our services as the backend service for Azure API Management. As part of this process we mapped a DNS entry to point at the service. Because the service is hosted in Azure Websites, there are two URLs that can be used to access it. I wanted to stop users accessing the site via the azurewebsites.net URL so that it can only be reached through the mapped domain. This is easy to achieve and can be configured in the web.config file of the service.

In the <system.webServer> section, add the following configuration:

<rewrite>
    <rules>
        <rule name="Block traffic to the raw azurewebsites url"  patternSyntax="Wildcard" stopProcessing="true">
          <match url="*" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="*azurewebsites.net*" />
          </conditions>
          <action type="CustomResponse" statusCode="403" statusReason="Forbidden"
          statusDescription="Site is not accessible" />
        </rule>
    </rules>
</rewrite>

Now if I try to access my site through the azurewebsites.net URL I get a 403 error, but accessing it through the mapped domain works fine.

Windows Azure and SignalR with Gadgeteer

I’ve been playing with Gadgeteer (http://www.netmf.com/gadgeteer/) for a while now and I am a big fan of the simple way you can build embedded hardware applications with high functionality. We have a proof-of-concept device that includes a colour touch screen, an RFID reader and an Ethernet connection. This device can connect to a Web API REST service hosted in Windows Azure, which we use to retrieve data depending upon the RFID code that is read. This works well, but there are times when we would like to notify the device that something has changed. SignalR seemed the right technology for this, as it removes the need to write polling code in the application.

Gadgeteer uses the .NET Micro Framework, a cut-down .NET Framework that doesn’t support the ASP.NET SignalR libraries. As we could already call Web API from the Micro Framework using the WebRequest classes, I wondered what was involved in getting SignalR working on my Gadgeteer device.

The first problem was to work out the protocol used by SignalR. After a short while trawling the web for details of the protocol I gave up and got my old friend Fiddler out to see what was really happening.

After creating a SignalR service, I connected my working example to the SignalR hub running on my local IIS.

The first thing that pleased me was that the protocol looked fairly simple. It starts with a negotiate call, which returns a token that is needed for the actual connection:

GET /signalr/negotiate?_=1369908593886

Which returns some JSON:

{"Url":"/signalr","ConnectionToken":"xyxljdMWO9CZbAfoGRLxNu54GLHm7YBaSe5Ctv6RseIJpQPRJIquHQKF4heV4B_C2PbVab7OA2_8KA-AoowOEeWCqKljKr4pNSxuyxI0tLIZXqTFpeO7OrZJ4KSx12a30","ConnectionId":"9dbc33c2-0d5e-458f-9ca6-68e3f8ff423e","KeepAliveTimeout":20.0,"DisconnectTimeout":30.0,"TryWebSockets":true,"WebSocketServerUrl":null,"ProtocolVersion":"1.2"}

I used this JSON to pull out the connection id and connection token. This was the first tricky part with the .NET Micro Framework: there is not the same support for JSON serialisation that you get with the full framework, and the string functions are limited as well. So I used basic string functions, Substring and IndexOf, as follows:

int index = negJson.IndexOf("\""+token+"\":\"");
if (index != -1)
{
    // Extracts the exact JSON value for then name represented by token
    int startindex = index + token.Length + 4;
    int endindex = negJson.IndexOf("\"", startindex);
    if (endindex != -1)
    {
        int length = endindex - startindex;
        stringToExtract = negJson.Substring(startindex, length);
    }
}

With the correct token retrieved, Fiddler showed me the actual SignalR connection:

GET /signalr/connect?transport=webSockets&connectionToken=yourtoken&connectionData=%5B%7B%22name%22%3A%22chathub%22%7D%5D&tid=2 HTTP/1.1

Looking at this I could determine that I needed to pass in the token retrieved from negotiate, the transport type and the name of the hub I wanted to connect to. After a bit of investigation I settled on the longPolling transport.

Now that I thought I understood the protocol, I tried to implement it on the device. The first issue was what to send with the negotiate call. I figured the query parameter was some sort of id for the client that is trying to connect, so I decided to use the current tick count. This seemed to work, and as long as my devices don’t connect at exactly the same time SignalR should be happy; I’ve had no problems so far with this.
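
To make that concrete, here is a minimal sketch of the negotiate call using WebRequest (the host name is a placeholder and error handling is omitted; the tick count stands in for the client id):

using System;
using System.IO;
using System.Net;

string negotiateUrl = "http://myservice.example.com/signalr/negotiate?_=" + DateTime.Now.Ticks;
var request = (HttpWebRequest)WebRequest.Create(negotiateUrl);
using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    // negJson holds the JSON shown earlier; pull ConnectionToken and
    // ConnectionId out of it with the Substring/IndexOf routine above.
    string negJson = reader.ReadToEnd();
}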

Upon connecting to the hub I needed to create a separate thread to handle SignalR so that the main device code wouldn’t stop running while the connection to the hub was waiting for a response. Once a response is received, it contains a block of JSON data appropriate to the SignalR message being received; this needs to be decoded and passed on to the application, and then you reconnect to the hub. The period between receiving data and reconnecting needs to be small: while a message is being processed the client cannot receive any more messages and may miss some data. I therefore take the response stream and pass its processing to a separate thread so that I can reconnect to the hub as fast as possible.
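
A rough shape of that receive loop, with the names invented for illustration:

using System.Net;
using System.Threading;

// Simplified long-polling loop: hand each response stream to a worker
// thread and reconnect immediately so as little data as possible is missed.
while (connected)
{
    var request = (HttpWebRequest)WebRequest.Create(connectUrl);
    var response = (HttpWebResponse)request.GetResponse();
    var stream = response.GetResponseStream();
    new Thread(() => ProcessSignalRMessage(stream)).Start();
}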

This is not a full implementation of SignalR on the .NET Micro Framework, but it is a simple client that works fairly well on the Gadgeteer device. I still need to do a little more work to speed up the reconnections, as it is still possible to miss some data.

The SignalR hub is hosted on a Windows Azure website alongside the Web API service, which allows web, Windows 8 and Gadgeteer applications to work side by side.

Gadgeteer has opened up another avenue for development and helps us provide a greater variety of devices in a solution.

Windows Azure Websites, Web API and SignalR

One of our projects involves a web service that implements both SignalR and Web API, and we were looking at the quickest and most cost-effective way to deploy it so that one of our customers could run a Windows 8 application as a demo away from the office. The application works well internally, as we have the service deployed on one of our servers in IIS. The options we were considering were:

  1. Package the application up in an install package, ship this to our customer and then provide them with instructions and support to allow them to deploy and configure their application
  2. Deploy it on one of our servers and then publish the service through our firewall
  3. Deploy as a Cloud service in Windows Azure
  4. Deploy as a website in Windows Azure

We considered that the first option would take a fair amount of time: making a deployment package, testing it and providing enough documentation and support for our customer to deploy it on their servers. The other three options involved less work for us, and we could get everything working well before shipping the demo out. Option 2 would tie up our internal resources for something that would not be used often, and as we would not necessarily know whether or when it was being used, the resources would have to be kept running, limiting our capacity internally.

Windows Azure was a good fit for this application, and the choice was really between setting up a cloud service and using a web site. (I guess we could have set up a virtual machine hosted in Windows Azure, but that seemed excessive for a simple web service.) The cloud service would involve more work, as we would need to add a cloud service project and web role to the solution and then do a full PaaS deploy to Windows Azure. It would also utilise a whole virtual machine (although we would have used an Extra Small instance). Web sites seemed a sensible option, especially as we already have a number of them available for free. How easy was this going to be, and would both Web API and SignalR work with Windows Azure Websites, especially as we were using preview software? I was surprised by how easy it was, and I’ll walk through the process we went through.

Step 1: Make sure that the service runs locally.

Step 2: Our service uses Code First Entity Framework with a local SQL Server. Create a database using Windows Azure SQL Database via the Windows Azure Management portal (https://manage.windowsazure.com), then copy the ADO.NET connection string.

Paste this into the web.config file of the Web API service. You will need to make sure that the Windows Azure SQL Server firewall has your public IP address configured, and that your own firewall allows connections through port 1433. Now run your application and make sure that you can connect to the Windows Azure SQL database. As we are using Code First Entity Framework, the database tables were created for me so I didn’t need to do any database deployment. The only issue I had with this approach was that I had to create the database first in Windows Azure.
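
The pasted entry lives in the connectionStrings section of web.config; a sketch of its shape (the name, server and credentials here are placeholders):

<connectionStrings>
  <add name="DefaultConnection"
       connectionString="Server=tcp:yourserver.database.windows.net,1433;Database=yourdatabase;User ID=youruser@yourserver;Password=yourpassword;Encrypt=True;"
       providerName="System.Data.SqlClient" />
</connectionStrings>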

Step 3: With our service running locally but with the database in Windows Azure we are now ready to deploy to the cloud. In the Windows Azure Management portal, click the “New” button

Select “Quick Create”, enter the URL you want to use and click “Create Web Site”.

Step 4: We now need to deploy our service. In the Azure management portal, navigate to the web site you just created and click “Download Publishing Profile”. Save this to your computer.

In Visual Studio 2012, open your Web API project, right-click on the project in Solution Explorer and click Publish.

This will display the publish dialog.

Click the Import button and navigate to the folder where the publish profile was saved. This should then allow you to complete the wizard.

Click Next and check that the correct connection string is displayed, then click Next and Publish. This starts uploading your Web API project to the Windows Azure Website. The deploy should be relatively quick, nowhere near the time it takes to deploy a cloud service. When it completes, your deployed website should start in the browser and you can carry out whatever tests you need.

Step 5: With your website deployed, you should just need to change the URL of your service in the Windows 8 application.

This whole process took less than 10 minutes to set up and deploy. One of the nice features of using websites is that changes are quick to deploy.

We had a number of issues to get this all working fully:

  1. As I mentioned earlier we had to ensure that the database was created before the EF code would create the correct tables
  2. When we first ran the Windows 8 application we got an “Incompatible protocol version” error each time we tried to use SignalR. This was because I had installed the latest SignalR libraries in the server-side code but the client was using an older version. You need to make sure that both the client and server are using the same version of SignalR
  3. We also had an issue when deployed to Windows Azure where it looked like the SignalR hubs were not being created correctly. It looked like the hub creation was hanging and not returning. This is a known issue that has been fixed but not yet deployed to Azure. There is a work around which is to configure SignalR to use long polling (https://github.com/SignalR/SignalR/issues/510). We did that with the following code:
App.hubConnection = new HubConnection(App.SignalRUrl);
proxy = App.hubConnection.CreateHubProxy("statushub");
App.hubConnection.Start(new LongPollingTransport()).Wait();

Windows Azure Web Sites is not just for web sites: using it for services can also make a lot of sense, as the scaling model allows a lot of flexibility and can provide a cost-effective way to host your services, especially if they are not heavily loaded at the start. They are also easy and fast to deploy, which is always a bonus.

Windows Azure Training Kit–June 2012 Release

The Windows Azure Training Kit June 2012 release is out now with the following features:

  • 12 new hands-on labs for Windows Azure Virtual Machines
  • 11 new hands-on labs for Windows Azure Web Sites
  • 2 new hands-on labs demonstrating Windows Azure with Windows 8 Metro-style applications
  • Several new hands-on labs for Node.js and PHP using Mac OS X
  • Updated content for the latest Windows Azure SDKs, tools, and new Windows Azure Management Portal
  • New and updated presentations designed to support anything from individual sessions to a full 3-day training workshop

Publishing Windows Azure Websites with TFS

This is a follow-on post from my introduction to Windows Azure Websites and shows you how to synchronise your website in TFS with Windows Azure.

One of the biggest problems with the way you deploy applications to Windows Azure is that minor changes (e.g. markup, content and styling) require a redeploy to publish them. Windows Azure Websites solves this problem by allowing you to synchronise your website with Team Foundation Server or Git.

In this post I will show you how easy it is to manage your websites in a version-controlled environment using Team Foundation Service, the cloud-hosted version of Team Foundation Server.

This works by creating a continuous integration build for your source code that automatically deploys your website after each successful build when code is checked in.

This is configured as follows:

Click the “+” button at the bottom of your portal screen and select Website –> Quick Create

Enter the URL details and click Create Web Site.

An empty site has now been created.

This site now needs to be linked to your Team Foundation Service. Click on the website in the dashboard and then select “Setup TFS Publishing”. You will also note that you can use a Git repository as well as TFS.

Enter your TFS URL (or create a new one), then click Authorize Now.

This connects through to your TFS service and sets up the CI build that will deploy your application to the cloud.

The TFS site will now be displayed asking you to authorize the connection

You now need to pick the website you want to deploy. If you haven’t created a site yet, go to Visual Studio, create your site and check it in to TFS.

You have now linked your web site in TFS to the Azure Website. This will take a few moments to synchronise.

Your website has not been deployed yet. You need to make a change and then check the change in.

Upon check-in, the build is started.

When the build is complete, the new website is deployed.

You can also revert back to older versions of the web site by clicking the desired version and then clicking Redeploy. This will start the redeploy of the older version.

A new build is kicked off using the same changeset details as the original deployment, and once the build is complete the web site is reverted. This whole cycle only took a few minutes, so it is a lot faster than the redeploy mechanism you had previously.

TFS and Windows Azure provide a good mechanism for version controlling your website. Adding application life cycle management to any software development activity is a good thing.

Session State in Windows Azure

We recently moved a web application that uses session state into Windows Azure. As it was running on a single web server, the session state was set to InProc, but this does not work in a multi-server environment because the session is stored in memory on a specific machine and is therefore not accessible to the other machines. There were a number of options:

  1. Use the Windows AppFabric Caching service (http://msdn.microsoft.com/en-us/library/windowsazure/gg278339.aspx)
  2. Use SQL Azure (http://blogs.msdn.com/b/sqlazure/archive/2010/08/04/10046103.aspx)
  3. Use Windows Azure Storage

Windows Azure Storage seemed the most cost-effective option, as the site does not currently use SQL Azure and the Azure subscription already purchased covers both transaction and storage costs.

There is a sample ASP.NET session state provider that uses Windows Azure Table Storage as its backing store. The sample can be downloaded from MSDN at:

http://code.msdn.microsoft.com/windowsazure/Windows-Azure-ASPNET-03d5dc14

How to use the Azure Storage Session State Provider

Add the following session state provider configuration to the web.config file of the project:

<!-- SessionState Provider Configuration -->
<sessionState mode="Custom"
              customProvider="TableStorageSessionStateProvider">
  <providers>
    <clear/>
    <add name="TableStorageSessionStateProvider"
         type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageSessionStateProvider"/>
  </providers>
</sessionState>

Add your Windows Azure Storage connection string (DataConnectionString) to each web role that requires session state. (Not setting this results in an “Object reference not set to an instance of an object” exception.)
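
For local testing that looks something like the following in the ServiceConfiguration file (shown here with the development storage account; substitute your real storage account connection string when deploying):

<ConfigurationSettings>
  <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
</ConfigurationSettings>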

Add a reference to the ASPProviders.dll taken from the sample project and make sure that its Copy Local property is set to True. (Not setting this causes an “unable to load” exception.)

We also added a reference to System.Data.Services.Client and set Copy Local to True on this too (not sure if this is needed).

Once this is set up and running, add multiple instances to your role configuration and run in the debugger. Make sure you can navigate to the page that uses the session data. I put a breakpoint on the page’s action, added a watch for Microsoft.WindowsAzure.ServiceRuntime.RoleInstance.CurrentRoleInstance.Id and checked whether it changed between requests; when it did, I checked that the session data was still visible.

You may well get the following error when using session state, as all the objects that are put into the Table Storage-backed session need to be serializable:

Unable to serialize the session state. In 'StateServer' and 'SQLServer' mode, ASP.NET will serialize the session state objects, and as a result non-serializable objects or MarshalByRef objects are not permitted. The same restriction applies if similar serialization is done by the custom session state store in 'Custom' mode.
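
The fix is to make sure that everything you put into session is marked as serializable; a minimal sketch with a hypothetical type:

[Serializable]
public class BasketItem
{
    public string ProductCode { get; set; }
    public int Quantity { get; set; }
}

// Anything stored like this must serialize cleanly:
Session["Basket"] = new BasketItem { ProductCode = "ABC123", Quantity = 2 };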

You can check the session data in the Azure Storage Server Explorer.

We are going to run this for a while to see how well it works, and also to see what debris is left behind in table and blob storage by expired sessions. We might need a job that tidies up the expired sessions later.