Building a Data Pipeline for Azure Health Data Services


To test the GET API and return the test EHR from the FHIR service, simply enter https://azure-healt-euuzo-func.azurewebsites.net/Patient/a2788980-20a1-49cf-9a91-17a8aef0f1bb in your browser.
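If you would rather script that check, a minimal Python sketch using the requests library looks something like the following (the function URL and patient ID are the ones from my deployment; yours will differ):

```python
import requests

# GET the test Patient resource from the FHIR service via the Azure Function
url = "https://azure-healt-euuzo-func.azurewebsites.net/Patient/a2788980-20a1-49cf-9a91-17a8aef0f1bb"
response = requests.get(url, headers={"Accept": "application/fhir+json"})
response.raise_for_status()

patient = response.json()
print(patient["resourceType"], patient["id"])
```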

New EHR records can be created on the FHIR service using a tool such as Postman to send a POST request to https://azure-healt-euuzo-func.azurewebsites.net/Patient/ with a body similar to the sample here - https://github.com/Azure-Samples/azure-health-data-services-toolkit-fhir-function-quickstart/blob/main/tests/patient.json
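As an alternative to Postman, here is a rough Python sketch of the same POST. The inline Patient resource is a trimmed-down stand-in for the sample patient.json linked above, not the full file:

```python
import requests

# Minimal FHIR Patient resource, loosely based on the sample patient.json
new_patient = {
    "resourceType": "Patient",
    "name": [{"use": "official", "family": "Doe", "given": ["Jane"]}],
    "gender": "female",
    "birthDate": "1985-03-21",
}

# POST the new EHR record to the FHIR service via the Azure Function
url = "https://azure-healt-euuzo-func.azurewebsites.net/Patient/"
response = requests.post(
    url,
    json=new_patient,
    headers={"Content-Type": "application/fhir+json"},
)
response.raise_for_status()

created = response.json()
print("Created Patient with id:", created.get("id"))
```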


Architectural diagram of solution.

Azure Health Data Services is a suite of cloud-based tools and services offered by Microsoft Azure that enables healthcare organizations to manage and analyze their data securely and efficiently. These services provide healthcare organizations with a platform to store, manage, and analyze their data, including electronic health records (EHRs), medical imaging, and patient-generated data.

Azure Health Data Services includes several components, such as the Azure FHIR (Fast Healthcare Interoperability Resources) Service, which allows healthcare organizations to securely exchange healthcare data using the FHIR standard; Azure Healthcare Analytics, which provides advanced analytics capabilities to analyze data from multiple sources; and Azure Security and Compliance, which offers robust security and compliance features to meet regulatory requirements. All of these interconnected services are housed in what is known as an Azure Health Data Services Workspace.

For the demo in this blog post I will be focusing on the Azure FHIR Service. We will create a simple API built around the FHIR standard, which will allow us to send GET and POST calls (with test EHR records) to the FHIR service. From there the data is further processed by a ready-made container application that transforms and transports the FHIR data into an Azure Storage Account set up as a data lake. Once it is there, the data can be analysed with the tool of your choice, such as Azure Synapse or Power BI.

This is just a basic proof of concept of how one can create FHIR data and then build a data pipeline for it, and it was quick and easy to set up thanks to two existing templates. Using these resources, anyone can quickly deploy a functioning FHIR service that takes API requests and feeds the data into a data warehouse/analysis solution.

The first template is this Azure Developer CLI template, which does the heavy lifting by creating the Azure Function, the Azure Health Data Services Workspace and the FHIR Service - https://github.com/Azure-Samples/azure-health-data-services-toolkit-fhir-function-quickstart

The second resource I used was this guide, along with its ARM template and script, to implement the FHIR to data lake service (which is hosted on an Azure Container App) - https://github.com/microsoft/FHIR-Analytics-Pipelines/blob/main/FhirToDataLake/docs/Deploy-FhirToDatalake.md

With the help of those resources and templates I was able to create the infrastructure for my solution and test it by sending a POST request with a body containing all the data needed to create a new EHR record in the FHIR service. I could also test the GET request by returning that EHR record. The Azure Function provides all of the API functionality and, although it is very simple, it has been built to conform to the FHIR messaging standard. The Container App has read permissions on the FHIR service and automatically listens for any changes to it. When changes come in, it processes them before depositing them in the Azure Data Lake.
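To confirm the end of the pipeline is working you can peek into the storage account. This is a sketch only: the account URL and container name below are assumptions (the FHIR to data lake pipeline writes Parquet files, but check your own deployment's configuration for the exact container and folder layout), and it relies on the azure-identity and azure-storage-blob packages:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Placeholders for your own deployment - swap in your storage account and container
ACCOUNT_URL = "https://<your-storage-account>.blob.core.windows.net"
CONTAINER = "fhir"  # assumed name of the container the pipeline writes to

# Authenticate with whatever credential is available (az login, managed identity, etc.)
client = BlobServiceClient(account_url=ACCOUNT_URL, credential=DefaultAzureCredential())
container = client.get_container_client(CONTAINER)

# List the Parquet files the FHIR to data lake pipeline has deposited so far
for blob in container.list_blobs():
    if blob.name.endswith(".parquet"):
        print(blob.name, blob.size)
```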

All of the Azure resources in this solution.
