Steve Spencer's Blog

Blogging on Azure Stuff

Azure Costing Considerations

Costing a system for running on Windows Azure can seem complex: sometimes it is easy to calculate and other times it is not. However, there have been a number of times in recent months when I have experienced higher than expected costs for Windows Azure. The first thing to check is the actual bill. The bills are pretty comprehensive and itemise usage well, so you should easily be able to identify overspend. The following areas are where we have seen unexpected expenses appear.

Compute

The compute prices are straightforward in Windows Azure: you pay for each instance hour you use. The minimum duration is 1 hour and usage is metered in blocks of hours, which means that if you have an instance running for 1 minute or 59 minutes you will be charged for 1 hour. Similarly, if you had 2 instances running at the start of the hour and then turned one off, you would still be charged for 2 hours. It gets more complicated if you then bring the second instance back up within the same hour. You will probably be charged for 3 hours, as the reappearing instance would be treated as a new deployment and would most likely be on a different machine (although I’ve never checked this), making it distinct from the instance that was active at the beginning of the hour. In reality this won’t cost you too much, as these examples are unlikely to happen. What is likely to happen, and could cost a significant amount, are the following:

· Staging & Production

· Instance count
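Before looking at those two, the hour-block rounding described above can be sketched in a few lines. This is a Python illustration only, not an official billing formula; it assumes each deployment of an instance is rounded up to whole hours independently:

```python
import math

def billed_instance_hours(deployment_minutes):
    # Each deployment of an instance is metered in whole hour blocks:
    # 1 minute or 59 minutes both bill as a full hour, and bringing an
    # instance back up counts as a new deployment with its own block.
    return sum(math.ceil(m / 60) for m in deployment_minutes)

# One instance runs for the full hour; a second runs for 10 minutes,
# is stopped, then restarted for another 30 minutes in the same hour.
print(billed_instance_hours([60, 10, 30]))  # 3 instance-hours billed
```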

When you upgrade your system using the VIP swap mechanism, you deploy your new software to the Staging area, do your testing and then VIP swap Production for Staging. During the period where you have something deployed in Staging, you will be charged for the number of instances running in Production plus the number running in Staging. For example, say I have 2 instances deployed to Production and want to upgrade my software using VIP swap. I deploy my software to Staging configured with 2 instances and carry out the tests. This process takes 30 minutes. I then VIP swap Staging to Production. I will be charged for 4 instances for that hour. If I do not then undeploy the software that is currently in Staging (i.e. the old software that was previously in Production), I will continue to be charged for 4 instances. You may want to leave the old software there so that you can swap back easily if there is an issue, but remember that you are still being charged twice.
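That double charge is simple arithmetic. Here is a quick sketch (Python, purely illustrative; I am assuming the whole-hour rounding applies to the upgrade window in the obvious way):

```python
import math

def upgrade_instance_hours(production, staging, window_minutes):
    # While anything is deployed to Staging you pay for both slots;
    # partial hours round up to a full hour block.
    billed_hours = math.ceil(window_minutes / 60)
    return (production + staging) * billed_hours

print(upgrade_instance_hours(2, 2, 30))       # 30-minute upgrade window: 4
print(upgrade_instance_hours(2, 2, 24 * 60))  # old build left in Staging for a day: 96
```

Leaving the old build sitting in Staging overnight quietly doubles the compute bill for that period, which is why it is worth deciding up front how long you want the swap-back safety net.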

Another area where you could be overspending is if you don’t bring the instance count back down after your peak load has subsided. For example, you increase the instance count on your website just before a big marketing campaign kicks in. Once the campaign is complete, the load on your website is likely to drop. At this point it makes sense to reduce the instance count, otherwise you are paying more than you need to for your site.

Data

With Windows Azure storage you are charged for the space needed to store the data, for transferring data out of the data centre, and for transactions on the storage API. Just to clarify: a storage transaction is effectively a call to the Storage API, whether that is to retrieve data, query queues, delete data, etc. Each is at least one transaction (some data retrievals may take multiple SDK calls). Data storage costs are straightforward, but bandwidth and transaction costs could take you by surprise if you are not careful.

Bandwidth charges cover the cost to extract data from the data centre; that could be the presentation of a web page to a user, extraction of a blob, extraction of data from a queue outside of Windows Azure, etc. Unless you are regularly moving data outside of Windows Azure, the bandwidth costs are relatively small when compared to the compute costs (e.g. $0.12 per GB). Another thing to bear in mind is that data transfer within the data centre is free, and data transferred into the data centre is also currently free. This means you can put a lot of data into a data centre, process it and then store it without incurring any bandwidth charges. You will only be charged for bandwidth when you pull the data out of the data centre. It is therefore important that, if you are transferring data around your system, all your compute and data storage (Storage and SQL Azure) are in the same data centre. If they are not, you will be charged for bandwidth to get the data from one data centre to another. For example, in the diagram below we put our web site in Dublin and SQL Azure in Amsterdam.

[Diagram: web site hosted in the Dublin data centre connecting to SQL Azure in the Amsterdam data centre]

In the simple scenario of putting some data into our database from a website and then doing a single retrieve, you can see that the data will incur 3 bandwidth charges when it should only incur 1. With a lot of data transfers, the costs will soon mount up.
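A rough illustration of how co-location affects the bill, using the $0.12 per GB figure quoted above (prices change, so treat the number as an example only):

```python
PRICE_PER_GB = 0.12  # outbound bandwidth price quoted above; an example figure

def bandwidth_cost(gigabytes, billable_hops):
    # Only data leaving a data centre is billed; each crossing between
    # data centres (or out to the user) is one billable hop.
    return gigabytes * billable_hops * PRICE_PER_GB

# Pushing 100 GB through the simple store-and-retrieve scenario:
print(bandwidth_cost(100, 1))  # everything co-located: one billable hop
print(bandwidth_cost(100, 3))  # Dublin web site + Amsterdam SQL Azure: three hops
```

In the split scenario the three hops are: website to SQL Azure (leaving Dublin), SQL Azure back to the website (leaving Amsterdam), and the page out to the user (leaving Dublin again); co-located, only the last one is billed.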

Transaction charges also appear to be relatively low when compared to compute costs, and most of the time they are, but as you are charged each time a method on the Storage API is called, you could be getting charged even when there is no data being processed. This can happen when you are processing a queue. Each time you call the SDK to see if there is something to process, you register a transaction, so if you have this code in a loop that regularly checks the queue, you can start to rack up hidden costs. For example, a queue polled once per second costs around $300 per year; reduce the polling interval to 100ms and the cost rises to around $3000 per year. Now multiply that by the number of instances you have running and the costs rise significantly. Do you really need to poll every 100ms? Can you redesign to an event-driven system, or poll less often? Polling the queue every minute, for example, reduces the cost from $3000 to around $5 per year.
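Those figures can be reproduced with a quick calculation. Note that the per-transaction price below (roughly $0.01 per 1,000 transactions) is inferred from the numbers above rather than taken from a price list, so check current pricing before relying on it:

```python
SECONDS_PER_YEAR = 365 * 24 * 60 * 60

# Assumed price, inferred from the figures in the text
# (~$0.01 per 1,000 transactions); an illustration only.
PRICE_PER_TRANSACTION = 0.01 / 1000

def yearly_polling_cost(poll_interval_seconds, instances=1):
    # Every poll of the queue is one storage transaction, data or no data.
    polls_per_year = SECONDS_PER_YEAR / poll_interval_seconds
    return polls_per_year * PRICE_PER_TRANSACTION * instances

print(round(yearly_polling_cost(0.1)))  # every 100ms: ~$3154 per year
print(round(yearly_polling_cost(1)))    # every second: ~$315 per year
print(round(yearly_polling_cost(60)))   # every minute: ~$5 per year
```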

The charging model needs to be understood by your developers so that you don’t end up with surprise bills. You also need to be careful not to engineer a solution that is overly complex just to save transaction costs when those costs could be negligible in the scheme of things. Sensible calculations up front can help you to balance the costs effectively, but watch out for the charges mentioned above, as these mistakes are easy to make.

Windows Azure Platform Training Kit Update

If you attended the Black Marble Architecture Event yesterday, you will have seen a number of talks around Azure, and the Windows Azure Platform Training Kit was mentioned several times.

The latest update to the training kit is here:

http://www.microsoft.com/download/en/details.aspx?id=8396

This update includes the changes for the Azure 1.6 SDK plus updates and new demos.

The training kit is a free resource that provides a good introduction to Azure and covers a large amount including Windows Azure, SQL Azure, AppFabric (Service Bus, Caching, Access Control) plus a load of other stuff.

Deploying Windows Azure Toolkit For Windows 8 Notification Service to the Cloud

If you have installed the Windows Azure Toolkit for Windows 8, you may want to deploy it to a real live environment so that you can try out the notification service and start writing useful Windows 8 applications. I assumed this was going to be a quick task, but it took me a little longer than expected. Firstly, when you try to deploy the web application and service to Windows Azure, it won’t just deploy out of the box. You need to sort out the certificates for SSL. Out of the box the certificates come as a .cer file, and Windows Azure only accepts .pfx files, so you will need to convert the file. I set up the certificate and deployed to Azure, but when I connected my Windows 8 application to register for notifications I kept getting errors connecting to my registration service. After a number of attempts to connect, I determined that it was an issue with the certificates. The Windows 8 application has an appxmanifest file which contains the certificate information. I set this up as I thought was correct, but I still could not get the application to talk to my Azure service. Running in the debugger didn’t seem to give me any error diagnostics. Eventually I found this article, which provided a bit more detail as to what was required (I was already doing most of what was suggested). A number of additional issues arose which slowed me down a bit further.

1. When creating a new certificate I needed to run the command prompt as administrator. On my computer my user account is not an administrator, so the new certificate was created under the administrator account, and in order to export it I needed to run certmgr as administrator too.

2. Selecting the certificate in Visual Studio to assign to the endpoint was also an issue: as the certificate had been created as administrator, it didn’t appear in the list. I found the certificate in certmgr, copied its thumbprint (converted to uppercase letters) and pasted it into the thumbprint field of the certificate in the role properties.
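As an aside, tidying a thumbprint copied from certmgr is easy to script. This sketch (Python, purely for illustration) keeps only the hex digits, which strips the spaces and any invisible character that sometimes comes along with the copy, and upper-cases the result:

```python
import string

def normalise_thumbprint(raw):
    # certmgr shows thumbprints in lower case separated by spaces, and
    # copying can drag along an invisible leading character; keep only
    # the hex digits and upper-case them for the Visual Studio field.
    return "".join(c for c in raw if c in string.hexdigits).upper()

# A made-up thumbprint value, purely for illustration:
print(normalise_thumbprint("d2 d3 8e ba 60 ca a1 c1 20 55 a2 e1"))
```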

The Azure application was then deployed to Azure and the new certificate added to the Windows 8 client as per the instructions in the article above.

You should now be able to login, upload images and send notifications.

Now that’s working I can start to build a proper notification service.

Session State in Windows Azure

We recently moved a web application into Windows Azure that was using session state. As it was running on a single web server, the session state mode was set to InProc, but this is no use in a multi-server environment, as the session is stored on a specific machine and is therefore not accessible to the other machines. There were a number of options:

  1. Use the Windows AppFabric Caching service (http://msdn.microsoft.com/en-us/library/windowsazure/gg278339.aspx)
  2. Use SQL Azure (http://blogs.msdn.com/b/sqlazure/archive/2010/08/04/10046103.aspx)
  3. Use Windows Azure Storage

Windows Azure Storage seemed to be the most cost-effective option, as the site does not currently use SQL Azure and a subscription has already been purchased for Azure which covers both transaction and storage costs.

There is a sample ASP.NET session state provider that uses Windows Azure Table Storage as its backing store. The sample can be downloaded from MSDN at

http://code.msdn.microsoft.com/windowsazure/Windows-Azure-ASPNET-03d5dc14

How to use the Azure Storage Session State Provider

Add the following session state provider configuration to the web.config file of the project:

   <!-- SessionState Provider Configuration -->
   <sessionState mode="Custom"
                 customProvider="TableStorageSessionStateProvider">
     <providers>
       <clear/>
       <add name="TableStorageSessionStateProvider" type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageSessionStateProvider"/>
     </providers>
   </sessionState>

Add your Windows Azure storage connection string (DataConnectionString) to each web role that requires session state. (Not setting this will result in an "Object reference not set to an instance of an object" exception.)

Add a reference to the ASPProviders.dll taken from the sample project and make sure that its Copy Local property is set to True. (Not setting this will cause an "unable to load" exception.)

[Screenshot: ASPProviders.dll reference properties with Copy Local set to True]

We also added a reference to System.Data.Services.Client and set Copy Local to true on this too (I am not sure if this is needed).

Once this is set up and running, add multiple instances to your role configuration and run in the debugger. Make sure you can navigate to the page that uses the session data. I put a breakpoint on the action of the page, added a watch for Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.CurrentRoleInstance.Id and checked to see if it changed; when it did, I checked that the session data was still visible.

You may well get the following error when you are using session state, as all the objects put into the Azure Table Storage session object need to be serializable.

Unable to serialize the session state. In 'StateServer' and 'SQLServer' mode, ASP.NET will serialize the session state objects, and as a result non-serializable objects or MarshalByRef objects are not permitted. The same restriction applies if similar serialization is done by the custom session state store in 'Custom' mode.

You can check to see the session data in the Azure Storage Server Explorer.

[Screenshot: session data viewed in the Azure Storage Server Explorer]

We are going to run this for a while to see how well it works, and also to see what debris is left behind in the table and blob storage by ended sessions. We might have to add a job later that tidies up the expired sessions.

How Windows Azure Service Bus helped pin point a configuration error

This week we had a very useful side effect of using the Windows Azure Service Bus. We have an Azure-hosted website that connects to a CRM back end using the service bus in relay mode to communicate between the two systems. We had a Test system that worked fine, but when we moved to the Live system there was a configuration error in one of the systems, and it was difficult to identify which.

The way the service bus works means that the server side can easily be moved (as long as the server has an outgoing internet connection). Our service bus endpoint on the server side is a Windows service, but we also have a console application to help us with debugging, as all the traces are logged to the console window. We turned off the service on the Live system and started it on the Test system. As the Azure-hosted website connects to the service bus rather than to a specific server, the website would now connect to our Test system. By running a successful connection test on the Azure-hosted site, we could prove that the Azure website configuration was correct.

The next change was to configure the Test system to point to the Live CRM system. This would prove whether or not our data was correct. Running the same test as before proved that our data migration to the Live CRM system was fine.

This left the service bus and the business logic web service running on the Test system, so we reconfigured the Live service bus service to point to the Test web service (which we had previously configured to connect to the Live CRM system), and this also worked, proving that the issue was with the business logic service.

What we were able to do then was move the service bus console application onto a developer's machine and run it in Visual Studio, so that we could debug and break on the calls to the business logic service, which helped us easily identify the problem. All this was done without needing to reconfigure or redeploy our Azure-hosted website.

I wish I could say that this ease of debugging was one of the reasons we chose to use the service bus, but I would be lying. The fact that it has made our debugging so much easier will now have an influence on its future use.

August Windows Azure Tools Release

A new release of the Windows Azure Tools for Visual Studio 2010 is available here

The release adds the following features:

  • Profile applications running in Windows Azure.
  • Create ASP.Net MVC3 Web Roles.
  • Manage multiple service configurations in one cloud project.
  • Improved validation of Windows Azure packages.

The Windows Azure Platform Training Kit has also been updated in line with this release. It can be downloaded from here. The training kit has the following changes:

  • [Updated] Labs and Demos to leverage the August 2011 release of the Windows Azure Tools for Microsoft Visual Studio 2010
  • [Updated] Windows Azure Deployment to use the latest version of the Azure Management Cmdlets
  • [Updated] Exploring Windows Azure Storage to support deleting snapshots
  • Applied several minor fixes in content

Using Azure Storage SDK outside of Azure

When trying to access the Azure Storage SDK in a non-Azure application I kept getting the following error:

 

“The type or namespace name 'WindowsAzure' does not exist in the namespace 'Microsoft' (are you missing an assembly reference?)”

 

References to Microsoft.WindowsAzure.ServiceRuntime and Microsoft.WindowsAzure.StorageClient were both already included in the assembly that was trying to store data in a table. After a bit of investigation, the reason for the error turned out to be that the assembly's target framework was set to “.NET Framework 4 Client Profile”. Changing it to “.NET Framework 4” solved the problem.

Windows Azure Roles Fail to run when deployed to Azure

Recently I was helping out at the Azure Bootcamp in London, and during the labs a common theme kept occurring when they were deployed to a real Azure account: the roles failed to run and the deployment appeared to take forever. This is something I experienced first hand when I was starting out with Azure. There is a way to diagnose these deployment errors, and that is by using IntelliTrace. During deployment you can enable IntelliTrace as part of the publish dialog.

[Screenshot: publish dialog with the IntelliTrace option enabled]

The IntelliTrace option is only available if you have Visual Studio 2010 Ultimate. Once deployed to Azure, the roles will attempt to start, and any errors during this phase will lead to the symptoms mentioned above. You can then connect to your Azure environment using the Server Explorer in Visual Studio to retrieve the IntelliTrace files, which can be opened in Visual Studio and show any exceptions that may have been thrown. Further information can be found here. Once you have diagnosed your issue, ensure that you then disable IntelliTrace by redeploying the fixed application, as it will have a negative impact on performance if left enabled.

 

Getting back to the problem we had at the Bootcamp: the issue was that the deployed application was trying to write information to Azure storage, but the connection string was still pointing to development storage. This was strange because none of the deployed applications had reached the Azure storage part of the lab, so you would have thought there was no need for a connection string. Luckily I had had the exact same problem with one of my earlier deployments, and it turns out that when a project is created the Diagnostics plug-in is automatically enabled. The Diagnostics plug-in requires its own connection string to Azure storage so that the diagnostic information can be stored. Looking at the role configuration in Visual Studio, you can see the Diagnostics plug-in configuration.

[Screenshot: role configuration in Visual Studio showing the Diagnostics plug-in]

To fix the deployment issue click the button next to the connection string text box and enter the details of your Azure Storage account.

[Screenshot: storage account connection string dialog]

You will need to redeploy the application or upload the new ServiceConfiguration.cscfg to fix this issue. If this still does not resolve it, try disabling the Diagnostics plug-in and redeploying.

Windows Azure Announcements

AppFabric

Microsoft have announced two new Azure AppFabric CTPs.

 

The May 2011 CTP will include Service Bus enhancements, including:

  • A more comprehensive pub/sub messaging model
  • Integration with the Access Control Service V2
  • Queues based on a new messaging infrastructure backed by a replicated, durable store

See here for more details.

The June 2011 CTP will include tooling to help with building, deploying and managing Windows Azure applications, including:

  • AppFabric Developer Tools
  • AppFabric Application Manager
  • Composition Model

See here for more details.

SQL Azure

The SQL Azure May 2011 update contains the following:

  • SQL Azure Management REST API – a web API for managing SQL Azure servers.
  • Multiple servers per subscription – create multiple SQL Azure servers per subscription.
  • JDBC Driver – updated database driver for Java applications to access SQL Server and SQL Azure.
  • DAC Framework 1.1 – making it easier to deploy databases and in-place upgrades on SQL Azure.

See here for more details.