As our Power BI practice has grown over the past few years, our team frequently gets requests to pull data from atypical sources – music or television streaming data, insurance or patient data, marketing data – to create comprehensive reports tying ERP data to supplemental data. In the past, this would have been an expensive and difficult ask: Power BI’s web query functionality has many limitations, and it can be difficult to incorporate complex data structures that may require hitting multiple API endpoints. Oftentimes, clients come to us with a “pie in the sky” request, and historically we’d have to rely on manual data exports, or on expensive web applications to perform the requests and store the data in a data warehouse.

In the past year or so, however, as we’ve expanded our Power Automate practice, we’ve been able to leverage Flows to query data from external sources via REST APIs and move the data into the Microsoft Dataverse for easy access in Power BI. This has allowed us to build dynamic Power BI reports that combine many different data sources to provide a 360-degree view of a client’s business. In one example, we built out reports based on comprehensive online marketing, sales, and subscriber information combined with the client’s ERP data to provide deep insights into customer habits and associated profitability.

From a technical perspective, our typical approach is to first identify the various components needed to successfully authenticate to the web service. In most use cases, this is possible solely through Flow; however, as I wrote about in my last blog post (Leveraging Azure Functions & Microsoft Flow to move data seamlessly between NetSuite and PowerApps), sometimes more complex hashing is required than Power Automate supports natively, and we may need to leverage a custom Azure Function to handle it. Next, we identify the structure of the data we will be receiving and create the necessary fields in the corresponding Microsoft Dataverse tables. Finally, we write the HTTP requests needed to query the data from the web resource.
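To make that concrete, here is a minimal sketch of the kind of helper we might host in an Azure Function when a service requires request signing that Power Automate can’t do on its own. The endpoint, credential names, and HMAC-SHA256 string-to-sign below are all hypothetical – every provider defines its own signing scheme – but the overall shape (hash the request details with a shared secret, attach the result as headers) is representative.

```python
import base64
import hashlib
import hmac
import time

import requests

# Hypothetical values for illustration only -- replace with the
# web service's actual endpoint and credentials.
API_BASE_URL = "https://api.example.com/v1"
CLIENT_ID = "my-client-id"
CLIENT_SECRET = "my-client-secret"


def build_signed_headers(method: str, path: str) -> dict:
    """Build HMAC-SHA256 signature headers for a request.

    The exact string-to-sign varies by provider; this example signs
    the method, path, and a timestamp with a shared secret.
    """
    timestamp = str(int(time.time()))
    string_to_sign = f"{method}\n{path}\n{timestamp}"
    digest = hmac.new(
        CLIENT_SECRET.encode("utf-8"),
        string_to_sign.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    signature = base64.b64encode(digest).decode("utf-8")
    return {
        "X-Client-Id": CLIENT_ID,
        "X-Timestamp": timestamp,
        "X-Signature": signature,
    }


def get_subscribers(modified_since: str) -> list:
    """Query the source system, filtering to recently changed records."""
    path = "/subscribers"
    response = requests.get(
        API_BASE_URL + path,
        headers=build_signed_headers("GET", path),
        params={"modified_since": modified_since},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["items"]
```

In a setup like this, the Flow (or the function itself) performs the signed request and passes the resulting records along to the next step of the routine.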

After performing the authenticated HTTP request, we build duplicate detection into our logic to verify that the data does not already exist in the client’s Dataverse environment, and finally perform a Create or Update operation to get the data into Dataverse. Wherever possible, we leverage dynamic filters in our requests to reduce the amount of data that needs to be processed and speed up a given routine, and when needed, we use the concurrency features within Flow to minimize execution time. On the Power BI side, we schedule the refresh of the Dataverse queries to run once the Power Automate process is complete.
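The duplicate-detection-then-upsert step translates fairly directly into calls against the Dataverse Web API. The sketch below assumes hypothetical table and column names (cr123_subscribers, cr123_externalid, and so on) and that a bearer token has already been acquired; it filters on the source system’s ID, then updates the matching row if one exists or creates a new one if it doesn’t.

```python
import requests

# Hypothetical environment URL, table, and column names -- the real
# names depend on your Dataverse solution.
DATAVERSE_URL = "https://yourorg.crm.dynamics.com/api/data/v9.2"
TABLE = "cr123_subscribers"        # entity set name of the custom table
ID_COLUMN = "cr123_subscriberid"   # Dataverse primary key column
KEY_COLUMN = "cr123_externalid"    # column holding the source system's ID


def upsert_subscriber(token: str, record: dict) -> None:
    """Create the row in Dataverse if it doesn't exist, otherwise update it."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
    }

    # Duplicate detection: filter on the source system's ID so we only
    # retrieve the single matching row, if one exists.
    existing = requests.get(
        f"{DATAVERSE_URL}/{TABLE}",
        headers=headers,
        params={
            "$select": ID_COLUMN,
            "$filter": f"{KEY_COLUMN} eq '{record[KEY_COLUMN]}'",
        },
        timeout=30,
    )
    existing.raise_for_status()
    matches = existing.json()["value"]

    if matches:
        # Update the existing row.
        row_id = matches[0][ID_COLUMN]
        response = requests.patch(
            f"{DATAVERSE_URL}/{TABLE}({row_id})",
            headers=headers,
            json=record,
            timeout=30,
        )
    else:
        # Create a new row.
        response = requests.post(
            f"{DATAVERSE_URL}/{TABLE}",
            headers=headers,
            json=record,
            timeout=30,
        )

    response.raise_for_status()
```

The same $filter idea is what the dynamic filters mentioned above accomplish on the source side: by only requesting records modified since the last run, far less data has to flow through the routine.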

Overall, this process has allowed us to build robust, incredibly in-depth Power BI reports that give our clients deep data insights that previously would have been too difficult or costly to implement. As always, please reach out directly to discuss any Power BI or integration needs!