If you've ever wondered how to use Microsoft Azure as a universal data store, this article details how the process works. Read on to find out more...
This architectural pattern is deliberately generic: it ingests data into the Azure cloud, then processes, stores and analyses it in a cost-effective, highly available and flexible way.
- The entry Logic App can extract data that exists on-premises, or can receive data directly through its own API
- The payload or body of the data is immediately encapsulated in a file with a unique filename generated from a GUID (globally unique identifier), and stored in the highly efficient, highly available Azure Blob store
- The unique name is also submitted to an Azure Service Bus queue; the queue can be configured to retain messages for weeks if necessary
- A second Logic App subscribes to the queue and receives each unique name as it appears
- This Logic App then retrieves the file held in the Blob store, using the unique name as an identifier
- The Logic App processes the body data through a workflow, saving the results back to the Blob store
- The unique name is then stored in either Azure Table Storage or Azure SQL Database
- A Web App can provide a dashboard to display the results, loading from the Blob store or Azure Table Storage/Azure SQL Database
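The ingest-and-subscribe flow above can be sketched in a few lines of Python. This is a local simulation only: a dictionary stands in for the Blob store and a `queue.Queue` stands in for the Service Bus queue, where a real deployment would use Logic Apps with the Blob Storage and Service Bus connectors. The function names and the placeholder "workflow" are illustrative, not part of any Azure API.

```python
import queue
import uuid

# In-memory stand-ins for Azure Blob Storage and a Service Bus queue.
blob_store: dict[str, bytes] = {}
name_queue: "queue.Queue[str]" = queue.Queue()

def ingest(payload: bytes) -> str:
    """Entry Logic App step: wrap the payload in a file with a
    GUID-based name, store it, and publish the name to the queue."""
    blob_name = f"{uuid.uuid4()}.json"
    blob_store[blob_name] = payload
    name_queue.put(blob_name)
    return blob_name

def process_next() -> str:
    """Subscriber step: receive a unique name, fetch the matching
    blob, run the workflow, and save the result back to the store."""
    blob_name = name_queue.get()
    body = blob_store[blob_name]
    result = body.upper()  # placeholder for the real processing workflow
    blob_store[blob_name] = result
    return blob_name
```

The key design point is the decoupling: the producer only ever writes a blob and a short queue message, so the subscriber can fall behind (or be redeployed) without any data loss, since both the payload and its name are durably stored.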
As an example of this pattern, let us imagine that a UK airport needs a way of monitoring which cars are entering and leaving:
- Cameras can be fitted with software that can detect the number plate and/or any other detail
- The system receives an image and data about each arriving vehicle, storing them in the Blob store and on the Service Bus respectively
- The system detects a vehicle departure and finds the matching arrival record
- The system can then apply rules that determine if any action should be taken
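The arrival/departure matching in the example could look something like the following sketch. The record layout and function names are hypothetical; in the real pattern the arrival records would be looked up in Azure Table Storage or Azure SQL Database rather than an in-memory dictionary.

```python
from datetime import datetime, timedelta

# Hypothetical store of open arrival records, keyed by number plate;
# the real system would query Table Storage or SQL for these rows.
arrivals: dict[str, datetime] = {}

def record_arrival(plate: str, when: datetime) -> None:
    """Camera detected a vehicle entering: store its arrival time."""
    arrivals[plate] = when

def record_departure(plate: str, when: datetime) -> "timedelta | None":
    """Camera detected a vehicle leaving: find the matching arrival
    and return the dwell time, or None if no arrival was recorded."""
    arrived = arrivals.pop(plate, None)
    if arrived is None:
        return None  # a rule might flag this as an unmatched departure
    return when - arrived
```

A rules step would then inspect the returned dwell time, for instance treating an unmatched departure or an unusually long stay as grounds for further action.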
If you have any queries about Azure, or would like to discuss this in further detail, please don't hesitate to contact us.