API Management Policies - Part 1


For the past year I have spent time working on a hybrid/bimodal EDI solution to support ordering and logistics processes for a customer in China. The technical building blocks are based on integration components in the Microsoft cloud, and we're using API Management (APIM) for three main purposes:

  • Provide access for customers and suppliers, i.e. make sure that they can receive the information they are entitled to
  • Provide information about how, where, when and by whom the EDI services are being used
  • Provide tools and documentation for customers and suppliers to make it easier to set up a working B2B relationship

For the first two bullet points, one of the key features in APIM is policies, which let you, as an administrator, set rules based on different criteria. Common policies include the ability to define how users interact with the service, for instance how many calls they can make per day or where the calls can be made from. Policies can be defined on different levels, and there is an order of execution, i.e. a hierarchy, that defines which policies apply to your services.
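As a sketch of what such rules can look like, here is an inbound section combining a daily call quota with an IP restriction. The quota and ip-filter policies are standard APIM policies, but all the values below are made up for illustration:

```xml
<policies>
  <inbound>
    <base />
    <!-- At most 10 000 calls per subscription per day (86 400 seconds) -->
    <quota calls="10000" renewal-period="86400" />
    <!-- Only accept calls from a known partner address range (example range) -->
    <ip-filter action="allow">
      <address-range from="10.1.0.1" to="10.1.0.255" />
    </ip-filter>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
  </outbound>
</policies>
```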
However, instead of blogging about how policies are executed, I will, in a series of articles, share some of the policies that I use for tasks that aren't necessarily related to security.

Log message details with <log-to-eventhub>

For most scenarios you eventually find yourself in a situation where the built-in analytics features in APIM just aren't enough. For instance, you might need to help a customer troubleshoot why her sales orders aren't accepted, or you want to build a dashboard for your sales representatives so that they can see the status of all ongoing orders.

The <log-to-eventhub> policy will solve most, if not all, logging scenarios I can come up with. The policy lets you log activity on all or some of the services/operations that you expose. It then sends data from APIM to an Azure Event Hub for further processing somewhere else. Azure Event Hubs is a separate service in Azure that can handle millions of events per second. You can think of it as a message channel where APIM dumps its log messages for consumers to pick up.

The case for this article is about being able to follow an order flow from the moment a purchase order is created in the ERP system to the actual delivery of the goods. The process itself includes numerous activities, some of which depend on information to/from suppliers and customers. Where it's suitable I use a database to collect information and the status of external activities. For instance, when a supplier receives a purchase order, this generates a new row in a table with date, time, vendor name, PO number, status and so on. This information can later be paired with other activities in the business process to give you an idea of the overall status of that particular order.
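As a sketch, such a tracking table could be defined like this. The table and column names are hypothetical (the actual schema is never shown), chosen here to match the fields logged later in the article:

```sql
CREATE TABLE OrderTracking (
    Id            UNIQUEIDENTIFIER NOT NULL,  -- correlation id from the log message
    [EA-DateTime] DATETIME2        NOT NULL,  -- when the activity happened
    Stage         NVARCHAR(100)    NOT NULL,  -- e.g. 'Delivery Message Received'
    Username      NVARCHAR(256)    NULL,      -- the APIM user behind the call
    [DN-number]   NVARCHAR(50)     NULL,      -- Delivery Note number, if any
    [PO-Number]   NVARCHAR(50)     NOT NULL   -- one row per referenced Purchase Order
);
```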

From an APIM standpoint we can use <log-to-eventhub> to create rows in the table whenever there's an interaction between our company and its partners.
The following example shows the policy of a service that accepts Delivery Messages from different suppliers:

<set-variable name="message-id" value="@(Guid.NewGuid())" />
<log-to-eventhub logger-id="monitor-dn-test">@{
    var body = context.Request.Body?.As<string>(true);
    var xElem = XElement.Load(new StringReader(body));
    string DNNumber = xElem.Descendants("DeliveryMessageNumber").First().Value;
    List<string> pONumbers = new List<string>();
    IEnumerable<XElement> DeliveryMessageLineItems = xElem.Descendants("DeliveryMessageLineItem");
    foreach (XElement DeliveryMessageLineItem in DeliveryMessageLineItems)
        pONumbers.Add((string)DeliveryMessageLineItem.Descendants("DeliveryMessageReference").FirstOrDefault(el => el.Attribute("DeliveryMessageReferenceType") != null && el.Attribute("DeliveryMessageReferenceType").Value == "PurchaseOrderNumber"));
    return new JObject(
            new JProperty("id", context.Variables["message-id"]),
            new JProperty("dateTime", DateTime.UtcNow),
            new JProperty("operation", context.Request.Method),
            new JProperty("Path", context.Request.Url.Path + context.Request.Url.QueryString),
            new JProperty("stage", "Delivery Message Received"),
            new JProperty("username", context.User.Email),
            new JProperty("DNNumber", DNNumber),
            new JProperty("pONumbers", new JArray(pONumbers))
    ).ToString();
}</log-to-eventhub>

The policy has a logger-id, which is the name/alias, as known within APIM, of the Event Hub to which your logs will be sent. For more information about creating an Event Hub and a corresponding logger in APIM, check out this guide by Steve Danielson.

As a first step in the policy we assign the body of the message, i.e. the Delivery Message sent from a supplier, to a variable called body. As this document is standardized and happens to be an XML document, we load and parse it into a variable called xElem.
After extracting the Delivery Note number (which can only occur once in each Delivery Message) we assign it to the string DNNumber. We also want to keep track of the PO numbers, but since there can be many references to different Purchase Orders in each Delivery Message, we create a list that we iterate through.
Once we have collected all PO numbers, we convert the list into an array. The reason for using a list in the first place is that I find it easier to work with when you haven't got a clue how many PO numbers you will have to iterate through.
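The same extraction logic can be sketched outside APIM as well. Here is a small Python version run against a minimal, entirely hypothetical Delivery Message (real documents are far larger and use namespaces, which this sketch ignores):

```python
import xml.etree.ElementTree as ET

# Minimal, made-up Delivery Message for illustration only.
SAMPLE = """
<DeliveryMessage>
  <DeliveryMessageHeader>
    <DeliveryMessageNumber>DN-1001</DeliveryMessageNumber>
  </DeliveryMessageHeader>
  <DeliveryMessageLineItem>
    <DeliveryMessageReference DeliveryMessageReferenceType="PurchaseOrderNumber">PO-1</DeliveryMessageReference>
  </DeliveryMessageLineItem>
  <DeliveryMessageLineItem>
    <DeliveryMessageReference DeliveryMessageReferenceType="PurchaseOrderNumber">PO-2</DeliveryMessageReference>
  </DeliveryMessageLineItem>
</DeliveryMessage>
"""

def extract(body: str):
    root = ET.fromstring(body)
    # The Delivery Note number occurs exactly once per message...
    dn_number = root.find(".//DeliveryMessageNumber").text
    # ...but there can be many PO references, so collect them in a list.
    po_numbers = [
        ref.text
        for ref in root.iter("DeliveryMessageReference")
        if ref.get("DeliveryMessageReferenceType") == "PurchaseOrderNumber"
    ]
    return dn_number, po_numbers

dn, pos = extract(SAMPLE)
print(dn, pos)  # DN-1001 ['PO-1', 'PO-2']
```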

After collecting data from the Delivery Message we are ready to ship it off to the Event Hub. But before doing so we need to decide on the format to send it in. The <log-to-eventhub> policy can only return strings, so we're quite limited in our options there. However, in our case we want to send the data to a database, and for that we will use another Azure component called Stream Analytics. The thing about Stream Analytics is that it only accepts input serialized as JSON, CSV or Avro. For a number of reasons I don't even want to try serializing my data into CSV. As for Avro, I had to Google it just to find out what it was, and even though it's probably a fantastic choice for some people, I'm just not there...
So the way forward for me is to create a JSON object with the collected data, along with some other information that we want to pass on to the database. Before sending it off we convert the JSON object into a string. Please also note that we're creating a JArray of the PO numbers, which makes a typical message look something like this:
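Based on the policy above, a typical event would look roughly like the following (all values here are made up for illustration):

```json
{
  "id": "6f9619ff-8b86-d011-b42d-00cf4fc964ff",
  "dateTime": "2017-05-01T12:00:00.000Z",
  "operation": "POST",
  "Path": "/deliverymessages",
  "stage": "Delivery Message Received",
  "username": "jane.doe@supplier.example",
  "DNNumber": "DN-1001",
  "pONumbers": ["PO-1", "PO-2"]
}
```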

With this policy in place you will start receiving messages in your Event Hub each time someone uses your service. As mentioned above, this also requires you to have an Event Hub and its logger in place.

Processing log message details

So how about doing something useful with those logs? Another of my favourite tools in Azure is Stream Analytics. Basically, it is a high-performance engine for processing real-time data from various sources, such as Event Hubs. You typically write a query over the input data and send the results to one or many destinations. For this sample I have created a simple job in Stream Analytics with one input (the Event Hub that APIM sends log messages to) and one output (the SQL database).

The query I use is quite straightforward, with one exception: the CROSS APPLY statement. As described above, we put all the different PO numbers in a JSON array. So we need a way to flatten this data and let each item in the array (each PO number) produce a separate output row from Stream Analytics. This is where CROSS APPLY comes into play and helps us out. For more information about working with complex data types, check this article out.

SELECT
    dnEvent.Stage AS Stage, po.ArrayValue AS 'PO-Number',
    dnEvent.datetime AS 'EA-DateTime', dnEvent.DNNumber AS 'DN-number'
FROM
    [EventHub-mp-edi-monitor-dn-test] AS dnEvent
CROSS APPLY GetArrayElements(dnEvent.pONumbers) AS po
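To see what the CROSS APPLY step effectively does, here is a small Python simulation of the flattening, run on a made-up log event of the shape the policy produces (one output row per element of the pONumbers array):

```python
import json

# Hypothetical log event, shaped like the output of the APIM policy above.
event = json.loads("""
{
  "id": "6f9619ff-8b86-d011-b42d-00cf4fc964ff",
  "dateTime": "2017-05-01T12:00:00Z",
  "stage": "Delivery Message Received",
  "DNNumber": "DN-1001",
  "pONumbers": ["PO-1", "PO-2"]
}
""")

def flatten(evt):
    # Mirror CROSS APPLY GetArrayElements: emit one row per array element,
    # repeating the scalar fields (stage, date, DN number) on every row.
    return [
        {"Stage": evt["stage"], "PO-Number": po,
         "EA-DateTime": evt["dateTime"], "DN-number": evt["DNNumber"]}
        for po in evt["pONumbers"]
    ]

for row in flatten(event):
    print(row)
```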

Now all we have to do is sit back and watch our database get populated with new rows each time someone sends us a Delivery Message. What you will probably end up with eventually is a database where you collect similar, but not identical, information from all of the services involved in your business process. So you might want to give some thought to how you will pair these various messages with their different statuses.
In my experience it will save you time (and money) to focus on the business process, the user experience and the database design before you start with the policies. I would also recommend creating separate Stream Analytics jobs for each log message type, even though they may seem almost identical at first, because if there's anything we can be certain of, it is that they will change in the future.
