Create Power BI dataflow from F&O Data in Azure Data Lake
Once Dynamics 365 Finance and Operations (F&O) Entity Store data is refreshed to Azure Data Lake, there are a few steps that need to be performed on the storage account before the data can be consumed from Power BI. Prior to following the steps in this blog post, Entity Store refresh to Azure Data Lake needs to be set up; it is well documented here.
Let's take the AssetTransactionMeasure aggregate measurement as an example.
When it is refreshed to Azure Data Lake, you will see a hierarchical folder structure with the root 'dynamics365-financeandoperations' folder under the storage account's Blob containers.
In order to consume this data from Power BI, both the user account used to sign in to Power BI and the Power BI service need read and execute permissions on all files and folders in the CDM folder. Granting permissions to each and every folder and file would be a little tedious. To make it easier, create a new folder, apply the permissions, and then upload the aggregate measurement; that way, all child folders and files will inherit the permissions from the root folder.
New folder setup in storage account
- Create a new folder under the storage account's blob containers, say 'd365fo'.
- Right-click the folder and select 'Manage Access'.
- On the Manage Access form, add the user account used to sign in to Power BI and enable read and execute permissions.
- In the Azure portal, select Azure Active Directory > Enterprise Applications. Search for Power BI Service and copy its Object Id. Add the Power BI Service Object Id on the Manage Access form and grant read and execute permissions.
Now that the permissions are set at the root folder level, all child folders and files created under the 'd365fo' folder will inherit the user account and Power BI Service permissions. (A scripted alternative is sketched below.)
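If you prefer to script this permission setup, here is a minimal sketch using the azure-storage-file-datalake Python SDK. The account name 'd365fodemo11' is reused from this walkthrough, the object IDs are placeholders you copy from Azure AD, and inheritance for future child items comes from the 'default:' ACL entries:

```python
# Hedged sketch: grant read + execute on the 'd365fo' container root so that
# new child folders/files inherit via the matching 'default:' ACL entries.
# The object IDs below are placeholders; the account name comes from this post.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

USER_OBJECT_ID = "<object-id-of-your-user-account>"
PBI_SERVICE_OBJECT_ID = "<object-id-of-the-Power-BI-Service-app>"

service = DataLakeServiceClient(
    account_url="https://d365fodemo11.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("d365fo")

# The Azure docs samples obtain the container's root directory this way.
root = fs._get_root_directory_client()

# 'user:<oid>:r-x' grants read + execute; the 'default:' twin makes children
# created later inherit the same entry.
acl = ",".join(
    f"{prefix}user:{oid}:r-x"
    for oid in (USER_OBJECT_ID, PBI_SERVICE_OBJECT_ID)
    for prefix in ("", "default:")
)
root.update_access_control_recursive(acl=acl)
```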
- Download the AssetTransactionMeasure folder from the 'dynamics365-financeandoperations' folder to your local desktop.
- Open the model.json file in Visual Studio Code or Visual Studio to update the entity file paths. Replace the highlighted path with 'd365fo'; essentially, we are pointing the URL at the new folder structure.
The path now looks like this:
Repeat the same for every instance of the entity file path, or simply hit Replace All. Save and close.
- Upload the 'AssetTransactionMeasure' folder to the 'd365fo' folder in the storage account.
If you right-click the AssetTransactionMeasure folder, or any folder or file underneath it, you will see that it has inherited the Power BI Service and user account permissions added on the root 'd365fo' folder. (These download, edit, and upload steps can also be scripted; see the sketch below.)
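As a scripted variation of the manual steps above, the following sketch rewrites model.json in place after the AssetTransactionMeasure folder has been uploaded to 'd365fo'. OLD_SEGMENT is a placeholder for the path highlighted in your own file:

```python
# Hedged sketch: rewrite the entity file paths in model.json in place, after
# the AssetTransactionMeasure folder has already been uploaded to 'd365fo'.
# OLD_SEGMENT is a placeholder - use the path highlighted in your model.json.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

OLD_SEGMENT = "<original-path-segment-from-model.json>"
NEW_SEGMENT = "d365fo"

service = DataLakeServiceClient(
    account_url="https://d365fodemo11.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
model_file = service.get_file_client("d365fo", "AssetTransactionMeasure/model.json")

# Download, do the equivalent of 'Replace All', and upload the result back.
text = model_file.download_file().readall().decode("utf-8")
model_file.upload_data(text.replace(OLD_SEGMENT, NEW_SEGMENT).encode("utf-8"),
                       overwrite=True)
```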
Now that the permissions are set up, the next step is to create a Power BI dataflow.
Copy the model.json file path
- Right-click the model.json file and select Properties.
Copy the URL of the model.json file:
https://d365fodemo11.blob.core.windows.net/d365fo/AssetTransactionMeasure%2Fmodel.json
**Note: Today, you need to make a couple of changes to the above URL in order to successfully read the data from Power BI. These will be fixed in the near future so that the correct path is shown.
- The storage account URL should be https://d365fodemo11.dfs.core.windows.net. Note that it is 'dfs', not 'blob'.
- Replace any '%2F' in the URL with '/'.
After making these changes, the model.json URL becomes https://d365fodemo11.dfs.core.windows.net/d365fo/AssetTransactionMeasure/model.json
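Since both fixes are plain string substitutions, they can be captured in a couple of lines (the example URL is the one copied above):

```python
def fix_model_json_url(url: str) -> str:
    """Apply the two fixes above: 'blob' -> 'dfs' and '%2F' -> '/'."""
    return (url.replace(".blob.core.windows.net", ".dfs.core.windows.net")
               .replace("%2F", "/"))

print(fix_model_json_url(
    "https://d365fodemo11.blob.core.windows.net/d365fo/AssetTransactionMeasure%2Fmodel.json"
))
# https://d365fodemo11.dfs.core.windows.net/d365fo/AssetTransactionMeasure/model.json
```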
Create Power BI Dataflow
- Log in to PowerBI.com with the user account you just granted permissions to.
- Create a new workspace and click Create Dataflow.
**Note: You need at least a Power BI Pro license to be able to create a dataflow.
- Select the 'Create and Attach' CDM folder option.
- Enter the name of the dataflow and the model.json file path URL, then click 'Create and Attach'.
- Once created, you should see the dataflow listed as External, as in the screenshot below.
This F&O data in the dataflow can now be used to create Power BI reports and dashboards.
Comments
We are very keen to use this solution at our company, but the CDM Connector in the Service is still in Preview, so we are being blocked. Any idea when this will come out of Preview?
One question: after we have all the data inside the Data Lake, can we connect directly from Power BI Desktop instead of creating a PowerBI.com dataflow? Power BI Desktop has a 'Get Data' connector for ADLS Gen2 as well, so the plan is to build in Power BI Desktop first and then publish. If that's possible, what are the pros and cons of using the direct connector versus a PowerBI.com dataflow? Thank you for the blog, by the way, it is very helpful. Thanks