Note: see this article for some new insights.

In part 1 I provided an overview of the solution. Now it is time to configure Azure Service Bus, Event Hubs and Stream Analytics so we can send and receive messages. I will still be using the ‘old’ Azure Management Portal, as I always get lost in the new one.

The steps that need to be done:

Configure Azure Service Bus

Create a new namespace

To use Azure Service Bus and its Event Hubs in combination with Stream Analytics, we first need to create a new namespace by clicking Create a new namespace on the Service Bus page in the Azure Portal. One important thing is that it needs to be the Standard messaging tier. This is because we need to define a separate Consumer Group, as Stream Analytics cannot connect to the $Default group.

![Event Hub namespace](/content/images/2016/6/EventHub%20Namespace.png)

Create a sender event hub

After we have created the namespace, the Event Hubs can be created. This can easily be done using the standard way of creating objects in the Azure Management Portal.

Create Event Hub

By default an event hub has 16 partitions and a retention of 1 day. There is no need to change this in our case, but in a real-life scenario these values can be altered to satisfy the business need.
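To give a feel for what partitions do: events that carry the same partition key always land in the same partition, which preserves ordering per key. The sketch below only illustrates that idea with a stand-in hash; Event Hubs uses its own internal hashing, not MD5.

```python
import hashlib

PARTITION_COUNT = 16  # the default partition count mentioned above

def route(partition_key: str, partitions: int = PARTITION_COUNT) -> int:
    """Map a partition key to a partition index.

    Illustrative only: Event Hubs uses its own internal hash,
    this MD5-based mapping is just an analogy.
    """
    digest = hashlib.md5(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % partitions

# The same key is always routed to the same partition:
assert route("device-42") == route("device-42")
print(route("device-42"))
```

With more partitions you get more parallel readers; with longer retention, consumers can replay older events.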

Define access policy

Event Hub Access Policy Config
To access the event hub externally, access policies need to be created. The direction of a policy depends on the program/app that uses the event hub. By clicking the arrow left of the event hub name we can configure the event hub. In the Configure tab we need to add a policy and set its permission to Send.
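The sending app later uses this policy via a connection string. The format below is the standard Event Hubs connection string shape; the namespace and policy names are assumed examples (only the event hub name comes from this walkthrough), and the key is a placeholder you copy from the portal.

```python
# Build the connection string a sender app would use for the Send policy.
# Namespace and policy names are examples; the key is a placeholder.
namespace = "powerbi-ns"          # assumed namespace name
event_hub = "PowerBIEventHub"     # sender event hub from this walkthrough
policy = "SendPolicy"             # assumed name of the policy with Send permission
key = "<shared-access-key>"       # copied from the portal; never hard-code in real apps

conn_str = (
    f"Endpoint=sb://{namespace}.servicebus.windows.net/;"
    f"SharedAccessKeyName={policy};"
    f"SharedAccessKey={key};"
    f"EntityPath={event_hub}"
)
print(conn_str)
```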

Create a custom consumer group

Event Hub Consumer Group
To use the sender event hub (PowerBIEventHub) with Stream Analytics we need to create a new Consumer Group. This can be done via the Consumer Groups tab next to the Configure tab.

Create a receiver event hub

The receiver event hub (PowerBIReceiver) can be created the same way as the sender event hub. This one doesn’t need a custom consumer group, and the permission of its access policy needs to be set to Listen.

Event Hub Policy

Configure Azure Stream Analytics

Now that the event hubs are set up, Stream Analytics can be created and configured. This is pretty simple. Stream Analytics needs a storage account, but this can be created while creating the Stream Analytics job.

Create ASA

Stream Analytics is basically a pipeline with an input and an output; in between, a query defines which records of the input flow to the output. In our case we link the event hubs as the input and output of this stream via the Input and Output tabs.

Stream Analytics Overview

Connect the sender event hub as input

In the Input tab we can easily add an input: a wizard pops up. In the first step, choose Data Stream.

Add Input - step 1

In step 2, choose Event Hub, as we are connecting our event hub to Stream Analytics.

Add input - step 2

In step 3 we need to provide all the input event hub information. In this step we need the consumer group we created before: the wizard will not accept the $Default group, as special characters are not accepted. The Input Alias is needed when defining the query; it is the ‘table’ the stream query selects information from.

Add input - step 3

And in the last step (4), choose JSON and UTF-8, as this is the format we will send to the sender event hub.
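This means the sending app must serialize each event as UTF-8 encoded JSON. A minimal sketch of what one event body could look like — the field names here are made up for illustration, not prescribed by anything above:

```python
import json

# Example event; field names are illustrative only.
event = {"deviceId": "sensor-01", "temperature": 21.5, "timestamp": "2016-06-01T12:00:00Z"}

# Stream Analytics will deserialize the body using the settings chosen
# in step 4 of the input wizard: JSON, encoded as UTF-8.
body = json.dumps(event).encode("utf-8")

print(body.decode("utf-8"))
```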

Add input - step 4

Connect the receiver event hub as output

Adding an output is almost the same as adding an input: after clicking the *add output* option a wizard pops up. In the first step we select Event Hub, as this is the only option we can easily use to process the data directly. (Note: in the UserVoice suggestions there is already a request to add an API output, with status ‘Under Review’…)

Add output - step 1

And in the final step, number 2, we provide the receiver event hub information. There is no need to configure the advanced settings in this case.

Add output - step 2

Define the event processing query

Now that we have an input and an output, we need to define a query that retrieves the input information and adds it to the output. This is done in the Query tab. In my case I use a very simple query that just puts all the input, unfiltered, to the output. The query has a SQL-like syntax and uses two ‘tables’, namely the input and output names provided in the wizards.
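Such a pass-through query is a plain `SELECT * INTO <output> FROM <input>`. The sketch below holds the query as a string (the aliases `PowerBIInput` and `PowerBIOutput` are assumed examples; use whatever Input/Output Alias you entered in the wizards) and simulates its semantics on a couple of sample records:

```python
# Stream Analytics pass-through query (SQL-like syntax). The aliases are
# assumed examples and must match the Input/Output Alias from the wizards.
QUERY = """
SELECT *
INTO PowerBIOutput
FROM PowerBIInput
"""

def run_passthrough(events):
    """Simulate the query: SELECT * copies every input record,
    unfiltered and unchanged, to the output."""
    return [dict(e) for e in events]

sample = [{"deviceId": "sensor-01", "value": 1},
          {"deviceId": "sensor-02", "value": 2}]
assert run_passthrough(sample) == sample
```

A real query could of course filter (`WHERE`) or aggregate (`GROUP BY` over a time window) instead of passing everything through.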

After we have created the query, we can test it by uploading a JSON file and looking at the output. If everything is set correctly, we can start the stream by clicking the play button; the Azure services are then set up, working correctly and ready to be used by the sending and processing apps.
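For that test you need a small sample file; as far as I know the test dialog accepts a JSON array of events. A sketch that writes such a file (field names again illustrative):

```python
import json
import os
import tempfile

# A couple of sample events to upload in the Stream Analytics test dialog.
# Field names are illustrative; match whatever your sender actually emits.
events = [
    {"deviceId": "sensor-01", "temperature": 21.5},
    {"deviceId": "sensor-02", "temperature": 19.8},
]

path = os.path.join(tempfile.gettempdir(), "asa-test-events.json")
with open(path, "w", encoding="utf-8") as f:
    json.dump(events, f)

print(path)
```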

Next Steps

  1. Introduction: part 1
  2. Configuring Azure services: part 2
  3. Sending, reading and processing the events: part 3

-JP

© 2022 Azure BI (Jan Pieter Posthuma)
Lovingly made in 'de Achterhoek'