Steve Spencer's Blog

Blogging on Azure Stuff

Publishing Windows Azure Websites with TFS

This is a follow-on post from my introduction to Windows Azure Websites and shows you how to synchronise your website in TFS with Windows Azure.

One of the biggest problems with the way you deploy applications to Windows Azure is that minor changes (e.g. markup, content and styling) require a redeploy to publish the changes. Windows Azure Websites solves this problem by allowing you to synchronise your website with Team Foundation Server or Git.

In this post I will show you how easy it is to manage your websites in a version-controlled environment using Team Foundation Service. Team Foundation Service is a cloud-hosted version of Team Foundation Server.

This works by creating a continuous integration build for your source code that will automatically deploy your website after a successful build each time code is checked in.

This is configured as follows:

Click the “+” button at the bottom of your portal screen and select Website –> Quick Create

image

Enter the URL details and click Create Web Site

image

An empty site has now been created.

This site now needs to be linked to your Team Foundation Service. Click on the website in the dashboard and then select “Setup TFS Publishing”. You will also note that you can use a Git repository as well as TFS.

image

Enter your TFS URL (or create a new one), then click Authorize Now.

image

This connects through to your TFS service and sets up the CI build that will deploy your application to the cloud.

The TFS site will now be displayed asking you to authorize the connection

image

You now need to pick the website you want to deploy. If you haven’t created a site yet then you need to go to Visual Studio, create your site and check it in to TFS.

image

You have now linked your web site in TFS to the Azure Website. This will take a few moments to synchronise.

image

Your website has not been deployed yet. You need to make a change and then check the changes in

image

Upon check-in the build is started

image

image

When the build is complete the new website is deployed

image

image

You can also revert back to older versions of the web site by clicking the desired version and then clicking redeploy:

image

This will start the redeploy of the older version:

image

A new build is kicked off using the same changeset details as the original deployment. Once the build is complete the web site is reverted. This whole cycle only took a few minutes, so it is a lot faster than the redeploy mechanism you had previously.

image

image

TFS and Windows Azure provide a good mechanism for version controlling your website. Adding application life cycle management to any software development activity is a good thing.

Introducing Windows Azure Websites

Following the announcement of the new Windows Azure services to deliver a hybrid cloud, and the coverage on Scott Guthrie’s blog, this post shows a little of what Windows Azure Websites will bring.

These are additional services that allow quick and easy creation and deployment of websites to the Windows Azure environment. Websites can be deployed using templates from some of the leading CMS/blog providers (e.g. WordPress, Umbraco, Orchard, DotNetNuke) as well as being able to deploy your own websites. Windows Azure Websites supports a number of technologies (e.g. ASP.NET, PHP, Node.js and Classic ASP). Upon deploying a website to Windows Azure you will have all the benefits of the Windows Azure cloud computing platform for scalability, resilience and performance. This post will guide you through setting up a new CMS-enabled website using one of the templates.

When logging in, the first thing to note is the new portal. This is a rewrite in HTML5.

image

Click on “Websites” in the left hand column:

image

Then click “Create a website”:

image

Click “From Gallery”, enter your URL and click “Create Web Site”

You can now choose from a number of templates

image

image

Pick the template you want and press Next

image

Enter the details on the form (don't forget to scroll down to enter your password)

image

Click Next to specify your database settings

image

Then click Next again to specify your database server settings

image

Clicking Next will now provision your new Umbraco website

image

Clicking on the bar graph in the bottom right corner gives you a status update.

image

The whole deployment took about 3 minutes and your website is now ready to use, although you will now need to add the content and configure Umbraco. Navigating to your new website will guide you through the install process.

image

image

In order to configure Umbraco you will need your database connection string. This can be found back in the portal:

Click on the SQL Databases tab

image

Click the database name, then View Connection String

image

You can now get the server and database names

image

Now you can finish installing Umbraco. When complete you will need to add your own content

image

This whole process is quick and easy and will allow you to deploy websites for your customers efficiently. Now that it is running, one final useful feature that has been added to the portal is a dashboard where you can see the health of your website, as well as being able to scale the website if required

image

image

Look out for a follow on post that shows you how to synchronise your website in the cloud with Team Foundation Service.

Windows Azure Diagnostics

Diagnostics in any application is a necessity and Windows Azure applications are no different. You could remote desktop onto the instance and check the event logs, and even run up DebugView so that you can see your system diagnostic messages. There is, however, a mechanism provided to retrieve a whole load of diagnostic information. By enabling Windows Azure Diagnostics you can retrieve the diagnostic trace logs, Windows event logs, performance counters and other useful data. Enabling Windows Azure Diagnostics can be done by ticking the Enable Diagnostics checkbox in the configuration of each Azure role. You also need to add a diagnostics connection string as follows:

image

This enables diagnostics, but this alone will not provide you with any information. Windows Azure Diagnostics works by capturing the data that you have requested and periodically transferring it to table or blob storage where you can view the information. The information can be requested by configuring the diagnostics you require either in code or in a config file. The config file resides in blob storage and can be changed at runtime, so it makes sense to use that configuration mechanism rather than code as it can be easily turned off when not in use.

Firstly you may want to retrieve your tracing information that is traced using System.Diagnostics.Trace. The easiest way to do this is to add a trace listener. The Windows Azure SDK contains one that can be used, called DiagnosticMonitorTraceListener. This can be added to the web.config file in the same way as other listeners or via code. When I added it to the web.config file I had issues on certain projects where the Windows Azure Diagnostics assembly could not be found. Adding the configuration in code always seemed to work. As you will need to redeploy anyway to update your web.config file in Windows Azure it makes little difference whether the configuration is in code or the web.config file (except that you need to rebuild). In order to add it in code I added the following to the global.asax.cs file:

    void Application_Start(object sender, EventArgs e)
    {
        // Code that runs on application startup
        System.Diagnostics.Trace.Listeners.Add(new Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener());
    }

You now need to modify the config file located in blob storage. The config file lives inside the wad-control-container; there will be a config file in there for each deployment you have made with diagnostics enabled and for each instance. You will need to update each instance you wish to receive diagnostic information from.
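
If you want to find those per-instance config blobs programmatically rather than through a storage tool, a minimal sketch using the storage client library (assuming you have the diagnostics storage account connection string to hand; the account name and key below are placeholders) might look like this:

    using System;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class ListDiagnosticsConfig
    {
        static void Main()
        {
            // Placeholder connection string - substitute your diagnostics storage account details
            var account = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey");
            var container = account.CreateCloudBlobClient().GetContainerReference("wad-control-container");

            // One config blob per deployment/role/instance that has diagnostics enabled
            foreach (var blob in container.ListBlobs(new BlobRequestOptions { UseFlatBlobListing = true }))
            {
                Console.WriteLine(blob.Uri);
            }
        }
    }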

Tracing can be enabled by modifying the following xml accordingly:

    <Logs>
      <BufferQuotaInMB>1024</BufferQuotaInMB>
      <ScheduledTransferPeriodInMinutes>5</ScheduledTransferPeriodInMinutes>
      <ScheduledTransferLogLevelFilter>Verbose</ScheduledTransferLogLevelFilter>
    </Logs>

This will transfer the logs at 5 minute intervals at the most verbose log level. You can change the level to Information or Error, which capture what you have traced using TraceInformation or TraceError respectively, whereas Verbose captures both of those plus anything traced using Trace.WriteLine. The logs are transferred to table storage in a table called “WADLogs”.
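
To make the filter levels concrete, here is a small, hypothetical example (not taken from a real project) showing which trace calls end up in WADLogs at each level:

    using System.Diagnostics;

    public class OrderLogger
    {
        public void SaveOrder(int orderId)
        {
            // Transferred when the filter is Error, Information or Verbose
            Trace.TraceError("Order {0} could not be saved", orderId);

            // Transferred when the filter is Information or Verbose
            Trace.TraceInformation("Order {0} saved", orderId);

            // Treated as Verbose, so only transferred when the filter is Verbose
            Trace.WriteLine("Leaving SaveOrder");
        }
    }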

Performance counters can be configured as follows:

    <PerformanceCounters>
      <BufferQuotaInMB>0</BufferQuotaInMB>
      <ScheduledTransferPeriodInMinutes>5</ScheduledTransferPeriodInMinutes>
      <Subscriptions>
        <PerformanceCounterConfiguration>
          <CounterSpecifier>\Processor(_Total)\% Processor Time</CounterSpecifier>
          <SampleRateInSeconds>30</SampleRateInSeconds>
        </PerformanceCounterConfiguration>
        <PerformanceCounterConfiguration>
          <CounterSpecifier>\Memory\Available Bytes</CounterSpecifier>
          <SampleRateInSeconds>30</SampleRateInSeconds>
        </PerformanceCounterConfiguration>
      </Subscriptions>
    </PerformanceCounters>

This will transfer the processor percentage and available bytes every 5 minutes with a 30 second sampling frequency. These are stored in table storage in a table called “WADPerformanceCountersTable”. Information about the available performance counters can be found on MSDN.
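
Although the blob-stored config file is the more flexible approach, the same counters can also be requested in code from a role's OnStart method. A rough sketch, assuming the standard diagnostics connection string setting name, would be:

    using System;
    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

            // Sample the CPU every 30 seconds and transfer every 5 minutes, matching the XML above
            config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration
            {
                CounterSpecifier = @"\Processor(_Total)\% Processor Time",
                SampleRate = TimeSpan.FromSeconds(30)
            });
            config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

            DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
            return base.OnStart();
        }
    }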

Event logs can be configured as follows:

    <WindowsEventLog bufferQuotaInMB="0"
         scheduledTransferLogLevelFilter="Verbose"
         scheduledTransferPeriod="1">
      <!-- The event log name is in the same format as the imperative
           diagnostics configuration API -->
      <DataSource name="Application!*" />
      <DataSource name="System!*" />
    </WindowsEventLog>

These are stored in a table called “WADWindowsEventLogsTable”.

Further information can be found at:

Example configuration file

Overview of Storing and Viewing Diagnostic Data in Windows Azure Storage

Imagine Cup NE

Just finished at the Hacking Imagination event in County Durham. My role was to help mentor and encourage the students from Universities and Colleges in the North East of England to help them to develop their ideas for their Imagine Cup entries.

The event was held in a secret location in county Durham and the students were bussed in.

DSC_0163

A number of us from Black Marble attended and ran workshops on Migrating to Azure (me), ALM/TFS (Richard), WPF, Silverlight, Phone 7 101 (Leon), Planning Poker (me, Leon) and Software War Stories/How to Run a Software Business (Robert, me). The students were all enthusiastic and asked a lot of sensible questions. The students were put into teams if they hadn’t already formed them, which was interesting to watch as an outsider as it was a bit like a recruitment fair, with students walking around from team to team and both sides deciding whether the person or the team was a good fit. This seemed to work well, with everyone ending up in a team. The age and skill levels ranged from college students doing their A Levels up to MSc students.

The aim of the two days was to get the teams into a position where they had an idea that was more than just a software app, with a good justification for it. This was a difficult task for a lot of teams as they either had a technology solution that they were trying to find a problem for, or a really wide problem that they were trying to solve. Most of the discussions were around getting the teams to think about the whole solution, including who it is aimed at and how they are going to deliver it, rather than just the software side of the solution. It was also about getting some of the teams to focus down on a specific area rather than being general. Some of the ideas changed each time I spoke to the teams!!

The (long) 2 days ended with each of the teams doing a presentation of their ideas to the whole group. The whole 2 days was enjoyable and the students seemed to gain a lot from the experiences, bringing some out of their shells.

The thing that impressed me most was the amount of young talent that is around and the abilities they have. One group of students were from a 6th form college doing their A Levels/BTEC, and the college had around 15 – 20 attendees, all working hard to deliver well-thought-through ideas and presenting them to a group of about 50 people. That is something I used to avoid as much as possible when I was at that age. It is this talent that is the future of IT. Credit must go to the teacher/lecturer who encouraged this many A Level/BTEC students to attend an event like this. It's not an easy task to get teenagers to do anything (he says from experience).

Events like the Imagine Cup help these guys (and girls) to gain real-world experience and advice from experts in the field. Getting a chance to influence and help this talent has been rewarding for me and I want to try to help some of these young people more. So thanks to all the people who organised and mentored the event, and thanks to the students that attended.

There was a lot of information being passed around and to finish off this post, I’ve put together a list of some of the areas that we were talking about in my sessions. I hope they are useful.

Video of wiring the Windows Azure Access Control service into an ASP.NET website. My bit is at the end and it's a no-code demo showing how to allow users to log in to your website using Live, Google or Yahoo accounts.

The Windows Azure Platform Training Kit

The Windows Identity Training Kit

Windows Azure Pricing & MSDN subscription allowances

Team Foundation Server in the Cloud

Books

Don't Make Me Think!: A Common Sense Approach to Web Usability : Steve Krug (ISBN: 0321344758)

The Art of Unit Testing – Roy Osherove

Windows Azure Boot Camp – SQL Azure links

I promised a number of links during the SQL Azure talk at the London Windows Azure boot camp today & here they are:

Migration wizard
http://sqlazuremw.codeplex.com/

SQL Azure Federations
http://msdn.microsoft.com/en-us/library/windowsazure/hh597452.aspx

Unsupported SQL Azure T-SQL

http://msdn.microsoft.com/en-us/library/windowsazure/ee336253.aspx

Partially Supported T-SQL
http://msdn.microsoft.com/en-us/library/windowsazure/ee336267.aspx

Azure Costing Considerations

Costing a system for running on Windows Azure can seem complex: sometimes it is easy to calculate and other times it is not. However, there have been a number of times in recent months where I have experienced higher than expected costs for Windows Azure. The first thing to check is the actual bill. The bills are pretty comprehensive and itemise the usage well, so you should easily be able to identify overspend. The following areas are where we have seen unexpected expenses appear.

Compute

The compute prices are straightforward in Windows Azure: you pay for each instance hour you use. The minimum duration is 1 hour and usage is metered in blocks of hours, which means that if you have an instance running for 1 minute or 59 minutes you will be charged for 1 hour; similarly, if you had 2 instances running at the start of the hour and then turned one off you would still be charged for 2 hours. It may also get complicated if you then put the second instance back up within the same hour. You will probably be charged for 3 hours, as the second instance reappearing would be treated as a new deployment and would most likely be on a different machine (although I’ve never checked this), thus being seen as different to the first time the instance was active at the beginning of the hour. In reality this won’t cost you too much as these examples are not likely to happen. What is likely to happen, and could cost a significant amount, are the following:

· Staging & Production

· Instance count

When you upgrade your system using the VIP swap mechanism, you will deploy your new software to the Staging area, do your testing and then VIP swap Production for Staging. During the period where you have something deployed in Staging you will be charged for the number of instances you have running in Production plus the number of instances you have running in Staging. For example, I have deployed 2 instances to Production and want to upgrade my software using VIP swap. I then deploy my software to Staging configured to have 2 instances running and carry out the tests. This process takes 30 minutes. I then VIP swap Staging to Production. I will be charged for 4 instances for that hour. If I then do not undeploy the software that is currently in Staging (i.e. the old software that was previously in Production) I will continue to be charged for 4 instances. You may want to leave the old software there so that you can swap back easily if there is an issue, but remember that you are still being charged for both sets of instances.

Another area where you could be getting billed is if you don’t bring the instance count back down after your peak load has subsided. An example of this is that you increase your instance count on your website just before a big marketing campaign kicks in. Once the campaign is complete it is likely that the load on your website will drop. At this point it would make sense to reduce the instance count of your website, otherwise you are paying more than you need to for your site.

Data

With Windows Azure storage you get charged for the space needed to store the data, the cost to transfer the data out of the data centre and the cost of transactions on the data storage API. Just to clarify, a storage transaction is effectively a call to the storage API, whether that is to retrieve data, query queues, delete data, etc. Each is at least one transaction (some data retrievals might require multiple SDK calls). Data storage costs are straightforward, but bandwidth and transaction costs could take you by surprise if you are not careful.

Bandwidth charges cover the cost to extract data from the data centre; that could be the presentation of a web page to a user, extraction of a blob, extracting data from a queue from outside of Windows Azure, and so on. Unless you are regularly moving data outside of Windows Azure the bandwidth costs are relatively small (when compared to the compute costs), e.g. $0.12 per GB. Another thing to bear in mind is that data transfer within the data centre is free, and data transferred into the data centre is also currently free. This means you can put a lot of data into a data centre, process it and then store it without incurring any bandwidth charges. You will only be charged for bandwidth when you pull the data out of the data centre. It is therefore important to make sure that if you are transferring data around your system, all your compute and data storage (Storage and SQL Azure) are in the same data centre. If you don’t then you will be charged for bandwidth to get the data from one data centre to another. For example, in the diagram below we put our web site in Dublin and SQL Azure in Amsterdam.

clip_image002

In the simple scenario of putting some data into our database from a website and then doing a single retrieve, you can see that the data will incur 3 bandwidth charges when it should only incur 1; at $0.12 per GB that is roughly $36 rather than $12 for every 100 GB moved this way. With a lot of data transfers the costs will soon mount up.

Transaction charges also appear to be relatively low when compared to compute costs, and most of the time they are, but as you are charged each time a method on the storage API is called you could be getting charged even when there is no data being processed. This typically happens when you are processing a queue. Each time you call the SDK to see if there is something to process you register a transaction, so if you have this code in a loop that regularly checks the queue you can start to rack up hidden costs. For example, if a queue is polled once per second this would cost around $300 per year; reduce the polling interval to check it every 100ms and the cost would be around $3000 per year. Now multiply that by the number of instances you have running and the costs rise significantly. Do you really need to poll every 100ms? Can you redesign to an event-driven system, or poll less often? Polling the queue every minute, for example, reduces the cost from $3000 to around $5 per year.
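
If an event-driven design is not an option, even a simple back-off when the queue is empty makes a big difference to the transaction count. A sketch of the idea is below (hypothetical queue name and timings, using the StorageClient queue API):

    using System;
    using System.Threading;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    public class QueueWorker
    {
        public void ProcessQueue(CloudStorageAccount account)
        {
            var queue = account.CreateCloudQueueClient().GetQueueReference("orders"); // hypothetical queue name
            var delay = TimeSpan.FromSeconds(1);

            while (true)
            {
                var message = queue.GetMessage(); // each call is one storage transaction
                if (message != null)
                {
                    // process the message here, then remove it
                    queue.DeleteMessage(message);
                    delay = TimeSpan.FromSeconds(1); // reset the back-off while the queue is busy
                }
                else
                {
                    // nothing to do, so wait and progressively back off up to one minute
                    Thread.Sleep(delay);
                    if (delay < TimeSpan.FromMinutes(1))
                    {
                        delay = TimeSpan.FromTicks(delay.Ticks * 2);
                    }
                }
            }
        }
    }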

The charging concept needs to be understood by your developers so that you don’t end up with surprise bills. You also need to be careful not to engineer a solution that is overly complex just to save transaction costs when those costs would be negligible in the scheme of things. Sensible calculations up front can help you to balance the costs effectively, but watch out for the charges mentioned above as these are often incurred by mistake.

Windows Azure Platform Training Kit Update

If you attended the Black Marble Architecture Event yesterday you would have seen a number of talks around Azure and the Windows Azure Platform Training Kit was mentioned a number of times.

The latest update to the training kit is here:

http://www.microsoft.com/download/en/details.aspx?id=8396

This update includes the changes for the Azure 1.6 SDK plus updates and new demos.

The training kit is a free resource that provides a good introduction to Azure and covers a large amount including Windows Azure, SQL Azure, AppFabric (Service Bus, Caching, Access Control) plus a load of other stuff.

Deploying Windows Azure Toolkit For Windows 8 Notification Service to the Cloud

If you have installed the Windows Azure Toolkit for Windows 8 you may want to deploy it to a real live environment so that you can try out the notification service and start wiring up useful Windows 8 applications. I assumed this was going to be a quick task, but it took me a little longer than expected. Firstly, when you try and deploy the web application and service to Windows Azure it won’t just deploy out of the box. You need to sort out the certificates for SSL. Out of the box the certificates come as a .cer file and Windows Azure only accepts .pfx files, so you will need to convert the file. I set up the certificate and deployed to Azure, but when I connected my Windows 8 application to register for notifications I kept getting errors connecting to my registration service. After a number of attempts to connect I determined that it was an issue with the certificates. The Windows 8 application has an appmanifest file which contains the certificate information. I set this up as I thought was correct but I still could not get the application to talk to my Azure service. Running in the debugger didn’t seem to give me any error diagnostics. Eventually I found this article which provided me with a bit more detail as to what was required (I was doing most of what was suggested). A number of additional issues arose which slowed me down a bit further.

1. When creating a new certificate I needed to run the command prompt as administrator. On my computer my user account is not an administrator, so the new certificate was created under the administrator account, and in order to export the certificate I needed to run certmgr as administrator too.

2. Selecting the certificate in Visual Studio to assign to the endpoint was also an issue, as it was installed as administrator so it didn’t appear in the list. I found the certificate, copied its thumbprint (converted to uppercase letters) and pasted it into the thumbprint field of the certificate in the role properties.

The Azure application was then deployed to Azure and the new certificate added to the Windows 8 client as per the instructions in the article above.

You should now be able to login, upload images and send notifications.

Now that’s working I can start to build a proper notification service.

Session State in Windows Azure

We recently moved a web application into Windows Azure that was using session state. As it was running on a single web server the session state was set to InProc, but this is no use in a multi-server environment as the session is stored on a specific machine and is therefore not accessible to other machines. There were a number of options:

  1. Use the Windows AppFabric Caching service (http://msdn.microsoft.com/en-us/library/windowsazure/gg278339.aspx)
  2. Use SQL Azure (http://blogs.msdn.com/b/sqlazure/archive/2010/08/04/10046103.aspx)
  3. Use Windows Azure Storage

Windows Azure Storage seemed to be the most cost-effective option, as the site does not currently use SQL Azure and they have purchased a subscription for Azure which includes both transaction and storage costs.

There is a sample ASP.NET session state provider that uses Windows Azure Table Storage as its backing store. The sample can be downloaded from MSDN at

http://code.msdn.microsoft.com/windowsazure/Windows-Azure-ASPNET-03d5dc14

How to use the Azure Storage Session State Provider

Add the following Session State provider config to the web.config file of the project

    <!-- SessionState Provider Configuration -->
    <sessionState mode="Custom"
                  customProvider="TableStorageSessionStateProvider">
      <providers>
        <clear/>
        <add name="TableStorageSessionStateProvider" type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageSessionStateProvider"/>
      </providers>
    </sessionState>

Add your Windows Azure storage connection string (DataConnectionString) to each web role that requires session state (not setting this will result in an “object reference not set to an instance of an object” exception)

Add a reference to the ASPProviders.dll taken from the sample project and make sure that the Copy Local property is set to true (not setting this will cause an “unable to load” exception)

image

We also added a reference to System.Data.Services.Client and set Copy Local to true on this too (not sure if this is needed).

Once this is set up and running, add multiple instances to your role configuration and run in the debugger. Make sure you can navigate to the page that has the session data in it. I put a breakpoint on the action of the page and added a watch for Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.CurrentRoleInstance.Id, checked to see if it changed and, if it did change, checked to see if the session data was still visible.
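
If you would rather see the behaviour on the page than in the debugger, a hypothetical code-behind like the one below writes the current role instance id next to a session counter; if the instance id changes between requests but the counter keeps incrementing, the session data is coming from table storage as expected:

    using System;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public partial class SessionCheck : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Increment a counter held in session state
            var visits = (int?)Session["visits"] ?? 0;
            Session["visits"] = visits + 1;

            // Show which instance served this request alongside the session value
            Response.Write(RoleEnvironment.CurrentRoleInstance.Id + " : " + Session["visits"]);
        }
    }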

You may well get the following error when you are using session state, as all the objects that are put into the Azure table storage session need to be serializable.

Unable to serialize the session state. In 'StateServer' and 'SQLServer' mode, ASP.NET will serialize the session state objects, and as a result non-serializable objects or MarshalByRef objects are not permitted. The same restriction applies if similar serialization is done by the custom session state store in 'Custom' mode.
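
In practice this just means marking the types you store in session as serializable (or removing the members that cannot be serialized). A trivial, hypothetical example:

    using System;

    [Serializable]
    public class BasketItem // hypothetical type that gets stored in session
    {
        public int ProductId { get; set; }
        public int Quantity { get; set; }
    }

Anything reachable from the object graph you put into session also has to be serializable, so a collection of BasketItem objects is fine, but a member holding, say, an open database connection is not.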

You can check to see the session data in the Azure Storage Server Explorer.

image

We are going to run this for a while to see how well it works and also see what debris is left behind in the table and blob storage due to ended sessions. We might have to have a job running that tidies up the expired sessions later.

How Windows Azure Service Bus helped pin point a configuration error

This week we had a very useful side effect of using the Windows Azure Service Bus. We have an Azure-hosted website that connects to a CRM backend using the service bus in relay mode to communicate between the two systems. We had a test system that worked fine, but when we moved to the Live system we had a configuration error in one of the systems that was difficult to identify.

The way the service bus works means that the server side can easily be moved (as long as the server has an outgoing internet connection). Our service bus at the server side is a Windows service, but we also have a console application to help us with debugging as all the traces are logged to the console window. We turned off the service on the Live system and started it on the test system. As the Azure-hosted website connects to the service bus rather than to a specific server, the website will now connect to our test system. By running a successful connection test on the Azure-hosted site we could prove that the Azure website configuration was correct.

The next change was to configure the test system to point to the Live CRM system. This would prove whether our data was correct or not. Running the same test as before proved that our data migration to the Live CRM system was fine.

This left us with the service bus and the business logic web service running on the test system, so we reconfigured the Live service bus service to point to the Test web service (which we had previously configured to connect to the Live CRM system) and this also worked, thus proving we had an issue with the business logic service.

What we were able to do then was move the service bus console application onto a developer's machine and run it in Visual Studio so that we could debug and break on the calls to the business logic service, which helped us to identify the problem easily. All this was done without needing to reconfigure or redeploy our Azure-hosted website.

I wish I could say that this ease of debugging was one of the reasons we chose to use the service bus, but I would be lying. The fact that it has made our debugging so much easier will now have an influence on its future use.