Steve Spencer's Blog

Blogging on Azure Stuff

Creating a Simple Workflow with Azure Webjobs and Service Bus

With the recently announced upgrade to the WebJobs service in Microsoft Azure, Service Bus triggers for Topics and Queues were added.

This got me thinking about how they could be used. There are a lot of scenarios where events need to trigger simple actions, as well as running things off a timer. Utilising Service Bus Topics along with a number of filtered subscriptions led me to create a simple workflow using WebJobs and Topics.

Setting up WebJobs is fairly easy and the two links at the top of this article will give you a good start, along with the three articles written by Mike Stall (about Blobs, Queues and Tables). The syntax has changed with the latest update, but it's easy enough to work out the changes.

The first thing to note, which I didn't pick up on straight away, was that you need to make sure that you set up connection strings for the diagnostics to work. I did this in the Configure section of the web site where I was deploying the webjob. I also added them to my app.config file so that I could debug locally in Visual Studio.

[Image: the connection string settings]

They all point to my Azure Storage account.
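For local debugging, the connection strings section of my app.config ended up looking roughly like the following. Treat the exact names as assumptions: the AzureWebJobs* names are what recent WebJobs SDK builds look for (earlier previews used different names, so check your SDK version), the ServiceBus entry is simply my own name that the code later reads via ConfigurationManager, and the account names and keys are placeholders.

<connectionStrings>
  <!-- Used by the WebJobs SDK for the dashboard and its storage logging; both point at my storage account -->
  <add name="AzureWebJobsDashboard"
       connectionString="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
  <add name="AzureWebJobsStorage"
       connectionString="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
  <!-- My own name for the Service Bus namespace connection, read via ConfigurationManager -->
  <add name="ServiceBus"
       connectionString="Endpoint=sb://yournamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=yourkey" />
</connectionStrings>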

The way I set this up was to create a message class that contained the data that I wanted to send between states. This message class would be wrapped in a BrokeredMessage and I would use the properties to determine the state of the message. The message is then added to a Service Bus Topic. By setting properties on the message, I could create a number of subscriptions that had an SqlFilter applied which would allow the subscription to only contain messages of a specific type.
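To make this concrete, here is a rough sketch of what I mean. The WorkflowMessage class and its property names are purely illustrative (they are not part of the SDK); the TopicName constant and the "ServiceBus" connection string name match the snippets later in this post.

// Simple payload passed between workflow states (illustrative class, not from the SDK)
public class WorkflowMessage
{
    public Guid WorkflowId { get; set; }
    public string Payload { get; set; }
}

// Wrap it in a BrokeredMessage and use a property to mark the current state
var message = new BrokeredMessage(new WorkflowMessage { WorkflowId = Guid.NewGuid(), Payload = "some data" });
message.Properties["State"] = "WorkflowStart";

// Send it to the topic; the filtered subscriptions below decide who sees it
TopicClient topicClient = TopicClient.CreateFromConnectionString(
    ConfigurationManager.ConnectionStrings["ServiceBus"].ConnectionString, TopicName);
topicClient.Send(message);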

When coding this I created a basic console application and added the following code to the Main method: 

// now create the subscriptions for the states
if (!_namespaceManager.SubscriptionExists(TopicName, TopicName + "start"))
{
    _namespaceManager.CreateSubscription(TopicName, TopicName + "start", 
                      new SqlFilter("State='WorkflowStart'"));
}
if (!_namespaceManager.SubscriptionExists(TopicName, TopicName + "state1"))
{
    _namespaceManager.CreateSubscription(TopicName, TopicName + "state1", 
                      new SqlFilter("State='WorkflowState1'"));
}
if (!_namespaceManager.SubscriptionExists(TopicName, TopicName + "state2"))
{
    _namespaceManager.CreateSubscription(TopicName, TopicName + "state2", 
                      new SqlFilter("State='WorkflowState2'"));
}

This allows me to create Topic triggers for the webjob for specific states as follows:

public static void SimpleWorkflowStart(
   [ServiceBusTrigger(TopicName, TopicName + "start")] BrokeredMessage message, TextWriter log)
{
      log.WriteLine("Workflow Started");
      .
      .
      .
 }

Two things to note with the method above. Firstly, the ServiceBusTrigger requires both the Topic name and the subscription name in order to receive the filtered list of messages. Secondly, by adding the TextWriter you can then send logging information to the dashboard, which is useful when trying to diagnose the webjobs when deployed to Azure.

I then set up a number of other methods, one per subscription, for each state. This allows me to perform whatever action I want in each state. After carrying out the action I modify the message by changing the state property and then put it back onto the Topic. You can't reuse the same brokered message, so you either have to copy all the data out of the existing message and create a new one, or you can use the Clone method on BrokeredMessage (the easy option).

// clone the old message
BrokeredMessage newMessage = message.Clone();
// change the state
newMessage.Properties["State"] = "WorkflowState2";
TopicClient tc = TopicClient.CreateFromConnectionString(
               ConfigurationManager.ConnectionStrings["ServiceBus"].ConnectionString, TopicName);
// delay sending the message a little (ScheduledEnqueueTimeUtc expects a UTC time)
newMessage.ScheduledEnqueueTimeUtc = DateTime.UtcNow.AddSeconds(10);
// send it
tc.Send(newMessage);

In my example I also added a delay to the enqueue of the message so that I could see things progressing in order. The scheduled enqueue time is the delay before the message appears in the subscription. This could also be used as a delayed trigger if you needed something done after a specific time interval. I also achieve this by adding the data to an Azure SQL db and then using a scheduled webjob to check the db for expired jobs. I created a separate console application for this which looked for records that had expired because a specific start date was older than a certain time, sent an email using SendGrid and then marked the record as complete in the db.

WebJobs made the actions easy to do, and the addition of the Service Bus triggers allowed me to have a fairly simple code structure to carry out a sequence of simple actions (which is often what we need to do) without the heavy overhead of a workflow engine. It also utilised a hosting environment that I was already using.

Windows Store App Notifications, the Notification Hub and Background tasks

This article aims to talk about Windows Store Notifications and the Windows Azure Notifications Hub and it will attempt to collate the various articles in a single place to help you build notifications into your app.

In order for you to get an understanding of Windows notifications look at the following article

Introduction to Push Notifications - http://msdn.microsoft.com/en-us/library/windows/apps/hh913756.aspx. This provides a good overview of how push notifications work. To summarise the important bits:

1. Your store app needs to register with the Windows Notification Service to retrieve a unique URI for your instance of the app. Ideally you do this each time the app starts.

2. If the URI has changed then you need to notify your service of the new URI. Note: this URI expires every 30 days, so your app needs to renew it and tell your service when it changes (see the sketch after this list).

3. Your service sends notifications to this unique URI
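As a rough sketch of steps 1 and 2 in app code, this is the kind of thing I mean. NotifyServiceOfNewUriAsync is a placeholder standing in for whatever call your own backend exposes, and the LocalSettings key name is just an example.

// Ask WNS for the channel for this installation of the app (do this on every start-up)
PushNotificationChannel channel =
    await PushNotificationChannelManager.CreatePushNotificationChannelForApplicationAsync();

// Compare with the URI we stored last time; only tell our service if it has changed
string storedUri = (string)ApplicationData.Current.LocalSettings.Values["ChannelUri"];
if (channel.Uri != storedUri)
{
    await NotifyServiceOfNewUriAsync(channel.Uri); // hypothetical call to your own backend
    ApplicationData.Current.LocalSettings.Values["ChannelUri"] = channel.Uri;
}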

You may have noticed above that I mentioned “Your service”. This is a critical piece of the notification mechanism and there are a number of ways to build this service. If you are not comfortable building backend services or you want something up and running quickly then mobile services might be the way to go for you. Here’s a tutorial that gets you started with mobile services http://www.windowsazure.com/en-us/develop/mobile/tutorials/get-started/

If, like me, you already have a source of data and a service then you will probably want to wire notifications into your existing service. Depending upon how many devices are using your app, the method you use to get notifications onto the user's device may vary. There are a number of options:

  1. Local updates
  2. Push Notifications
  3. Periodic Notifications

Local updates require the creation of a background task that Windows runs periodically; it calls into your data service, retrieves the data to put on the tiles and sends out tile notifications using the Windows Store app SDK.

Updating live tiles from a background task - http://msdn.microsoft.com/en-us/library/windows/apps/jj991805.aspx. This provides a tutorial on building a background task for your Windows Store App. The tutorial is for timer tasks but it can easily be adapted for push notification tasks. The bits that are likely to change are the details of the Run method, the task registration and the package manifest.
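For reference, registering a timer-based task looks roughly like this. The task name and entry point are placeholders, and a TimeTrigger also needs the matching background task declaration in the package manifest (and lock screen access) before Windows will run it.

// Register a background task that Windows will run roughly every 15 minutes
var builder = new BackgroundTaskBuilder
{
    Name = "TileUpdateTask",                        // placeholder name
    TaskEntryPoint = "MyApp.Background.TileUpdater" // must match the manifest declaration
};
builder.SetTrigger(new TimeTrigger(15, false));
BackgroundTaskRegistration task = builder.Register();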

Two more important links that you will require when you are dealing with notifications:

Tile template catalogue http://msdn.microsoft.com/en-us/library/windows/apps/hh761491.aspx

Toast template catalogue http://msdn.microsoft.com/en-us/library/windows/apps/hh761494.aspx

These two catalogues are important as they provide you with details of the XML you need for each type of notification.
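Once you have picked a template from the catalogue, sending a local tile update from your app (or background task) is only a few lines. TileSquareText04 is just one of the templates listed in the catalogue and the text is obviously an example.

// Get the XML for a template from the catalogue and fill in the text field
XmlDocument tileXml = TileUpdateManager.GetTemplateContent(TileTemplateType.TileSquareText04);
tileXml.GetElementsByTagName("text")[0].InnerText = "Hello from a local update";

// Send it to the app's tile
TileUpdateManager.CreateTileUpdaterForApplication().Update(new TileNotification(tileXml));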

Push notifications are sent through the Windows Notification Service to your device.

You can send notifications to your device from your service by creating a notification and sending it to each of the devices registered to your service via the Windows Notification Service.

If you have a large number of devices running your app then you will probably want to use the Windows Azure Notification Hub. This is the simplest way to manage notifications to your application as the notification hub handles scaling, manages the device registrations and iterates around each device to send the notifications out. The Notification Hub will also allow you to send notifications to Windows Phone, Apple and Android devices. To get started with notification hubs follow this tutorial: http://www.windowsazure.com/en-us/manage/services/notification-hubs/getting-started-windows-dotnet/

The nice feature of the notification hub is that it makes the code needed to send notifications simple.

 

NotificationHubClient hub = NotificationHubClient.CreateClientFromConnectionString("<your notification hub connection string>", "<your hub name>");

var toast = @"<toast><visual><binding template=""ToastText01""><text id=""1"">Hello from a .NET App!</text></binding></visual></toast>";

await hub.SendWindowsNativeNotificationAsync(toast);

Compare this to the code to send the notification without the hub:

 

byte[] contentInBytes = Encoding.UTF8.GetBytes(xml);

HttpWebRequest request = HttpWebRequest.Create(uri) as HttpWebRequest;
request.Method = "POST";
request.Headers.Add("X-WNS-Type", notificationType);
request.ContentType = contentType;
request.Headers.Add("Authorization", String.Format("Bearer {0}", accessToken.AccessToken));

using (Stream requestStream = request.GetRequestStream())
    requestStream.Write(contentInBytes, 0, contentInBytes.Length);

 

 

In addition you will need to retrieve the list of devices that are registered for push notifications and iterate around the list to send the notification to each device. You will also require a service that receives the registrations and stores them in a data store, and you need to manage the scalability of these services yourself. On the down side, the notification hub is charged per message, which means the more often you send notifications the greater the cost, whereas hosting your own service is load based: the notifications will go out more slowly as the number of devices increases, but this would generally be a lower cost. You also need to take into account that you will need to send out a notification for each tile size, which increases the activity count on the notification hub for each tile size (currently 3).

[Update: You can send out a single notification for all tile sizes rather than 3 separate notifications by adding a binding for each tile size in your xml see http://msdn.microsoft.com/en-us/library/windows/apps/hh465439.aspx for more details]
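In other words, one tile notification can carry a binding per tile size, along these lines. This reuses the hub client from the earlier snippet; the template names come from the tile catalogue and the text is just an example.

// One tile notification with a binding for each tile size
var tileXml = @"<tile>
  <visual>
    <binding template=""TileSquareText04""><text id=""1"">Latest update</text></binding>
    <binding template=""TileWideText04""><text id=""1"">Latest update</text></binding>
  </visual>
</tile>";

await hub.SendWindowsNativeNotificationAsync(tileXml);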

It is possible to send custom notifications to your app which can be received directly in the app or by using a background task. These are called Raw Notifications. In order to receive raw notifications in a background task your app needs to be configured to display on the start screen. However, Raw Notifications can be received in your app whilst it is running even when it is not configured to display on the start screen. A Raw Notification is a block of data up to 5KB in size and can be anything you want.

The following code will send a raw notification using the notification hub:

 

string rawNotification = prepareRAWPayload();

Notification notification = new Microsoft.ServiceBus.Notifications.WindowsNotification(rawNotification);
notification.Headers.Add("X-WNS-Cache-Policy", "cache");
notification.Headers.Add("X-WNS-Type", "wns/raw");
notification.ContentType = "application/octet-stream";

var outcome = await hub.SendNotificationAsync(notification);

In order to receive Raw Notifications in your app you need to add an event to the channel you retrieve from the Windows Notification Service:

 

var channel = await PushNotificationChannelManager.CreatePushNotificationChannelForApplicationAsync();

channel.PushNotificationReceived += channel_PushNotificationReceived;

 

And then handle the notification received:

 

private void channel_PushNotificationReceived(PushNotificationChannel sender, PushNotificationReceivedEventArgs args)
{
    switch (args.NotificationType)
    {
        case PushNotificationType.Raw:
            ReceiveNotification(args.RawNotification.Content);
            break;
    }
}

 

Note: the content of the notification is the block of data that you sent out.

Sample background task for Raw Notifications is here: http://msdn.microsoft.com/en-us/library/windows/apps/jj709906.aspx
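The Run method of such a task is small; roughly the following, where the class and method names are placeholders and the real details are in the sample linked above.

// Background task entry point triggered by a raw push notification
public sealed class RawNotificationTask : IBackgroundTask
{
    public void Run(IBackgroundTaskInstance taskInstance)
    {
        // For a PushNotificationTrigger the trigger details are the raw notification itself
        var notification = taskInstance.TriggerDetails as RawNotification;
        if (notification != null)
        {
            string payload = notification.Content; // the block of data your service sent
            // process the payload here
        }
    }
}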

Guidelines for Raw Notifications can be found here: http://msdn.microsoft.com/en-us/library/windows/apps/hh761463.aspx

Periodic notifications also require a service, but the application periodically calls into the service to retrieve ready-made tile notifications, without needing to process the source data and create the notifications locally. Details about how to use periodic notifications can be found here: http://msdn.microsoft.com/en-US/library/windows/apps/jj150587
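Wiring up a periodic update from the app side is essentially a one-liner once your service returns the tile XML; the URL below is obviously a placeholder.

// Poll the given URL for tile XML every half hour
TileUpdateManager.CreateTileUpdaterForApplication().StartPeriodicUpdate(
    new Uri("http://yourservice.example.com/tilefeed"),
    PeriodicUpdateRecurrence.HalfHour);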

In summary, Windows Store application notifications can be sent to the app in a variety of ways and the mechanism you choose will depend upon how quickly and how many notifications are required. Push notifications allow notifications to be sent whenever they are ready. Periodic and Local updates are pull notifications and require a service to be available to pull the data from. All of these will require some sort of service and all have an associated cost. The notification hub is a useful tool to assist with notifications: it can manage the device connections as well as sending out notifications to multiple device types. It does however come at a cost, and you need to work out whether it is a cost effective mechanism for your solution.

Gadgeteer, Signal R, WebAPI & Windows Azure

After a good night in Hereford at the Smart Devs User Group and my presentation at DDDNorth, here are the links from my presentation and some from questions asked:

Gadgeteer: http://www.netmf.com/gadgeteer/

Signal-R: http://www.asp.net/signalr/

Web API: http://www.asp.net/web-api

The Signal-R chat example can be found at: http://www.asp.net/signalr/overview/getting-started/tutorial-getting-started-with-signalr

Windows Azure Pricing Calculator : http://www.windowsazure.com/en-us/pricing/calculator/?scenario=full

Signal-R Scaleout using Service bus, SQL Server or Redis: http://www.asp.net/signalr/overview/performance-and-scaling/scaleout-in-signalr

The Windows Azure Training Kit: http://www.windowsazure.com/en-us/develop/net/other-resources/training-kit/

Gadgeteer Modules: http://proto-pic.co.uk/categories/development-boards/net.html

Fez Spider Starter Kit: http://proto-pic.co.uk/fez-spider-starter-kit/

 

In addition to these links, here are some more from my presentation at the DareDevs user group in Warrington.

It is possible to drive a larger display from Gadgeteer using a VGA adapter. You use it in the same way as the Display-T35 works, for example through the SimpleGraphics interface.

VB eBook - Learn to Program with Visual Basic and Gadgeteer

Fez Cerberus Tinker Kit: https://www.ghielectronics.com/catalog/product/455 

Enabling Modern Apps

I’ve just finished presenting my talk on “Successfully Adopting the Cloud: TfGM Case Study” and there were a couple of questions that I said I would clarify.

1. What is the limit on the number of subscriptions per Service Bus topic? The answer is 2000. Further details can be found at: http://msdn.microsoft.com/en-us/library/windowsazure/ee732538.aspx

2. What are the differences between Windows Azure SQL Database and SQL Server 2012? The following pages provide the details:

Supported T-SQL: http://msdn.microsoft.com/en-us/library/ee336270.aspx

Partially supported T-SQL: http://msdn.microsoft.com/en-us/library/ee336267.aspx

Unsupported T-SQL: http://msdn.microsoft.com/en-us/library/ee336253.aspx

Guidelines and Limitations: http://msdn.microsoft.com/en-us/library/ff394102.aspx

3. Accessing the TfGM open data site requires you to register as a developer at: http://developer.tfgm.com

Thanks to everyone who attended I hope you found it useful.

Handling A Topic Dead Letter Queue in Windows Azure Service Bus

Whilst working on a project in which we were using Topics on the Windows Azure Service Bus, we noticed that our subscription queues (when viewed from the Windows Azure Management portal) didn’t seem to be empty even though our subscription queue processing code was working correctly. On closer inspection we found that our subscription queue was empty and the numbers in the management portal against the subscription were messages that had automatically faulted and had been moved into the Dead Letter queue.

The deadletter queue is a separate queue that allows messages that fail to be processed to be stored and analysed. The address of the deadletter queue is slightly different from your subscription queue and is of the form:

YourTopic/Subscriptions/YourSubscription/$DeadLetterQueue

for a subscription, and

YourQueue/$DeadLetterQueue

for a queue.

Luckily you don’t have to remember this as there are helpful methods to retrieve the address for you:

SubscriptionClient.FormatDeadLetterPath(subscriptionClient.TopicPath, messagesSubscription.Name);

To create a subscription client for the deadletter queue you append /$DeadLetterQueue to the subscription name when you create the subscription client.

Once you have this address you can connect to the dead letter queue in the same way you would connect to the subscription queue. When a deadlettered brokered message is received, the properties of the message should contain error information highlighting why it failed. The message should also contain the message body from the original message. By default the subscription will move a faulty message to the dead letter queue after 10 delivery attempts. You can also move the message yourself, putting sensible data in the properties, if it fails to be processed, by calling the DeadLetter method on the BrokeredMessage. The DeadLetter method allows you to pass in your own data to explain why the message has failed.

The dead letter message can be deleted in the same way as a normal message by calling the Complete() method on the received dead letter message.

Here is an example of retrieving a dead lettered message from a subscription queue:

var baseAddress = Properties.Settings.Default.ServiceBusNamespace;
var issuerName = Properties.Settings.Default.ServiceBusUser;
var issuerKey = Properties.Settings.Default.ServiceBusKey;

Uri namespaceAddress = ServiceBusEnvironment.CreateServiceUri("sb", baseAddress, string.Empty);

this.namespaceManager = new NamespaceManager(namespaceAddress, 
                           TokenProvider.CreateSharedSecretTokenProvider(issuerName, issuerKey));
this.messagingFactory = MessagingFactory.Create(namespaceAddress, 
                           TokenProvider.CreateSharedSecretTokenProvider(issuerName, issuerKey));
var topic = this.namespaceManager.GetTopic(Properties.Settings.Default.TopicName);
if (topic != null)
{

    if (!namespaceManager.SubscriptionExists(topic.Path, 
                                  Properties.Settings.Default.SubscriptionName))
    {
        messagesSubscription = this.namespaceManager.CreateSubscription(topic.Path, 
                                            Properties.Settings.Default.SubscriptionName);
    }
    else
    {
        messagesSubscription = namespaceManager.GetSubscription(topic.Path, 
                                            Properties.Settings.Default.SubscriptionName);
    }
}
if (messagesSubscription != null)
{
    SubscriptionClient subscriptionClient = this.messagingFactory.CreateSubscriptionClient(
                                            messagesSubscription.TopicPath,
                                            messagesSubscription.Name, ReceiveMode.PeekLock);

   // Get the Dead Letter queue path for this subscription
    var dlQueueName = SubscriptionClient.FormatDeadLetterPath(subscriptionClient.TopicPath,
                                             messagesSubscription.Name);

   // Create a subscription client to the deadletter queue
    SubscriptionClient deadletterSubscriptionClient = messagingFactory.CreateSubscriptionClient(
                                           subscriptionClient.TopicPath, 
                                            messagesSubscription.Name + "/$DeadLetterQueue");

    // Get the dead letter message (wait up to 300 seconds)
    BrokeredMessage dl = deadletterSubscriptionClient.Receive(new TimeSpan(0, 0, 300));

    // Receive returns null if nothing arrived before the timeout
    if (dl != null)
    {
        // get the properties
        StringBuilder sb = new StringBuilder();
        sb.AppendLine(string.Format("Enqueue Time {0}", dl.EnqueuedTimeUtc));
        foreach (var props in dl.Properties)
        {
            sb.AppendLine(string.Format("{0}:{1}", props.Key, props.Value));
        }
        dl.Complete();
    }
}

Windows Azure Queues vs Service Bus Queues

If you have been wondering whether to use the Windows Azure Queues that are part of the Storage Service or the queues that are part of the Service Bus, then the following MSDN article will give you full details.

http://msdn.microsoft.com/en-us/library/hh767287(VS.103).aspx

Recent changes in pricing make the choice even harder. There are two specific features I like that make the Service Bus queue a better offering than the storage queues:

  1. Long connection timeout
  2. Topics

The long connection timeout means that I don't have to keep polling the Service Bus queue for messages. I can make a connection for, say, 20 minutes, and when a message is added to the queue my application immediately receives it, processes it, and then reconnects to the queue to get the next message. After 20 minutes without a message the connection completes in the same way it does when a message is received, except that the message is null; you then just reconnect for another 20 minutes. This makes your application event driven rather than a polling application and it should be more responsive. You can make multiple connections to the queue this way and load balance in the same way as you would when polling queues.

The following code shows how you can connect with a long poll.

NamespaceManager namespaceManager;
MessagingFactory messagingFactory;
Uri namespaceAddress = ServiceBusEnvironment.CreateServiceUri("sb", "yournamespace", string.Empty);

namespaceManager = new NamespaceManager(namespaceAddress, TokenProvider.CreateSharedSecretTokenProvider("yourIssuerName", "yourIssuerKey"));
messagingFactory = MessagingFactory.Create(namespaceAddress, TokenProvider.CreateSharedSecretTokenProvider("yourIssuerName", "yourIssuerKey"));

WaitTimeInMinutes = 20;

// check to see if the queue exists. If not then create it
if (!namespaceManager.QueueExists(queueName))
{
    namespaceManager.CreateQueue(queueName);
}

QueueClient queueClient = messagingFactory.CreateQueueClient(queueName, ReceiveMode.PeekLock);

queueClient.BeginReceive(new TimeSpan(0, WaitTimeInMinutes, 0), this.ReceiveCompleted, messageCount);

When a message is received or the 20 minute timeout expires, the ReceiveCompleted delegate is called and a check is made to see if the message is not null before processing it. Once processed, another long poll connection is made and the process repeats. The beauty of this method is that you don’t have to manage timers or separate threads to manage the queue.
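The callback itself looks roughly like this, assuming queueClient and WaitTimeInMinutes are available as fields; ProcessMessage is a placeholder for whatever you do with the message.

private void ReceiveCompleted(IAsyncResult result)
{
    // EndReceive returns null if the 20 minute wait expired without a message arriving
    BrokeredMessage message = queueClient.EndReceive(result);
    if (message != null)
    {
        ProcessMessage(message); // placeholder for your own processing
        message.Complete();
    }

    // reconnect for another long poll
    queueClient.BeginReceive(new TimeSpan(0, WaitTimeInMinutes, 0), this.ReceiveCompleted, null);
}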

Topics provide private queues that are subscribed to by consumers; each private queue receives a copy of the messages put onto the topic and is managed individually. Subscriptions can also apply filters to the messages so that they only receive messages that they are interested in.

Further details of Service bus topics and queues

http://www.windowsazure.com/en-us/develop/net/how-to-guides/service-bus-topics/

Windows Azure Training Kit–June 2012 Release

The Windows Azure Training Kit June 2012 release is out now with the following features:

  • 12 new hands-on labs for Windows Azure Virtual Machines
  • 11 new hands-on labs for Windows Azure Web Sites
  • 2 new hands-on labs demonstrating Windows Azure with Windows 8 Metro-style applications
  • Several new hands-on labs for Node.js and PHP using Mac OS X
  • Updated content for the latest Windows Azure SDKs, tools, and new Windows Azure Management Portal
  • New and updated presentations designed to support anything from individual sessions to full 3-day training workshops

How Windows Azure Service Bus helped pinpoint a configuration error

This week we had a very useful side effect of using the Windows Azure Service Bus. We have an Azure hosted website that connects to a CRM backend using the service bus in relay mode to communicate between the two systems. We had a test system that worked fine, but when we moved to a live system we had a configuration error in one of the systems and it was difficult to identify.

The way the service bus works means that the server can easily be moved (as long as the server has an outgoing internet connection). Our service bus at the server side is a Windows Service, but we also have a console application to help us with debugging as all the traces are logged to the console window. We turned off the service on the live system and started it on the test system. As the Azure hosted website connects to the service bus rather than a specific server, the website would now connect to our test system. By running a successful connection test on the Azure hosted site we could prove that the Azure website configuration was correct.

The next change was to configure the test system to point to the Live CRM system. This would prove whether our data was correct or not. Running the same test as before proved that our data migration to the Live CRM system was fine.

This left us with the service bus and the business logic web service running on the test system, so we reconfigured the live service bus service to point to the test web service (which we had previously configured to connect to the live CRM system) and this also worked, thus proving we had an issue with the business logic service.

What we were able to do then was to move the service bus console application onto a developer's machine and run it in Visual Studio so that we could debug and break on the calls to the business logic service, which helped us to easily identify the problem. All this was done without needing to reconfigure or redeploy our Azure hosted website.

I wish I could say that this ease of debugging was one of the reasons we chose to use the service bus, but I would be lying. The fact that it has made our debugging so much easier will now have an influence on its future use.