
Bring data from Dataverse to Microsoft Fabric Lakehouse, using Pipeline

By Subhad365, User Group Leader

Amigos, this blog gives you an insight into how to fetch data from Microsoft Dataverse into Microsoft Fabric. To begin with, let's recap what Microsoft Fabric is:


Microsoft Fabric is an end-to-end analytics and data platform designed for enterprises that need a unified solution. It encompasses data movement, processing, ingestion, transformation, real-time event routing, and report building, and it lets users explore, analyze, and act on large volumes of streaming data in near-real time.
Not to mention that it leaves plenty of room for a number of data science and data engineering services, like data lakes and ingestion.
Just before we jump to our next topic, let me introduce you to the Lakehouse (just in case you've not encountered one already): a very powerful data architecture for storing, managing, and analyzing structured and unstructured data in a single location.
That said, we all know the capabilities of Microsoft Dataverse. If you are from a Dynamics background, you have certainly encountered its enormous capabilities in some way or the other. This common data hub can maintain data efficiently, with deep integrations across Microsoft's cloud services such as Azure, Dynamics 365, and Microsoft 365, for an easy setup with no development overhead.
This blog is targeted at drawing data from Dataverse into Microsoft Fabric. Here are the steps to do that:
Step 1: Create a Data Lake Storage account in the Azure portal, with the following settings:

Don't forget to mark the following setting on the Advanced tab (the Hierarchical namespace option, which makes the account Data Lake Storage Gen2; Azure Synapse Link requires it):

Click Review + create >> Create to complete the wizard and create the Storage account.
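If you prefer scripting the storage account instead of clicking through the portal, here is a minimal Python sketch using the azure-identity and azure-mgmt-storage SDKs. The subscription ID, resource group, region, and account name are placeholders you must replace:

```python
# Minimal sketch: create an ADLS Gen2 account with the Python management SDK.
# pip install azure-identity azure-mgmt-storage
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Sku, StorageAccountCreateParameters

subscription_id = "<your-subscription-id>"   # placeholder
resource_group = "rg-fabric-demo"            # placeholder
account_name = "dataversefabricdemo"         # placeholder: globally unique, lowercase

client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

poller = client.storage_accounts.begin_create(
    resource_group,
    account_name,
    StorageAccountCreateParameters(
        location="eastus",
        kind="StorageV2",
        sku=Sku(name="Standard_LRS"),
        # The "Advanced tab" setting mentioned above: hierarchical namespace
        # turns the account into Data Lake Storage Gen2, which Synapse Link needs.
        is_hns_enabled=True,
    ),
)
print("Created storage account:", poller.result().name)
```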

Step 2: Go to the Power Apps maker portal (https://make.powerapps.com/environments) and check whether you can see Azure Synapse Link:
If not, click More >> Discover all; you should see it under Data management:

Ensure that you are on the correct environment:


Click on the following to initialize the Synapse Link:


The following wizard will appear (allow it some time to get populated):

Click Next to select the tables you want to include:

Click Save to apply the changes. You'll need to wait for the process to finish, as it might take a while to complete.
Whew -- you are all set to use Dataverse now.
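As an optional sanity check (not part of the official steps), you can confirm the export is flowing by listing what Synapse Link wrote to the storage account. A sketch with the azure-storage-file-datalake package; the account URL and container name below are assumptions, since Synapse Link names the container after your organization:

```python
# Sketch: list the folders Synapse Link created in the Data Lake container.
# pip install azure-identity azure-storage-file-datalake
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://dataversefabricdemo.dfs.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

# Synapse Link typically creates a container named "dataverse-<org>-<env>".
for fs in service.list_file_systems():
    print("Container:", fs.name)

fs_client = service.get_file_system_client("dataverse-yourorg-unq123")  # assumed name
for path in fs_client.get_paths(recursive=False):
    print("  ", path.name)  # one folder per exported table, e.g. "account"
```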
Step 3: Connect to your Microsoft Fabric account; in case you don't already have one, you can create a trial version, as outlined in the following blog:
https://subsd365.blogspot.com/2024/10/introduction-to-microsoft-fabric-using.html
That blog also outlines how to create a Lakehouse, along with a workspace.
Before proceeding to the next step, you need to ensure you have an Azure app registration ready, and that you have administrative rights to connect Azure, Dataverse, and Fabric.
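To verify the app registration works before wiring it into the pipeline, you can acquire a token with the client-credentials flow. A minimal sketch using the msal package; the tenant/client IDs, secret, and org URL are placeholders. Note that the service principal also needs to be added as an application user with a security role in the Dataverse environment:

```python
# Sketch: confirm the app registration can authenticate as a service principal.
# pip install msal
import msal

tenant_id = "<tenant-id>"                       # placeholder
client_id = "<app-registration-client-id>"      # placeholder
client_secret = "<client-secret>"               # placeholder
org_url = "https://yourorg.crm.dynamics.com"    # your Dataverse environment URL

app = msal.ConfidentialClientApplication(
    client_id,
    authority=f"https://login.microsoftonline.com/{tenant_id}",
    client_credential=client_secret,
)

# Client-credentials flow: the same identity the pipeline connection will use.
result = app.acquire_token_for_client(scopes=[f"{org_url}/.default"])
if "access_token" in result:
    print("Token acquired; the service principal can authenticate.")
else:
    print("Failed:", result.get("error_description"))
```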
In your Lakehouse, click Get data >> New data pipeline:

Fill out the following popup as required and click Create:

This brings up the following popup, allowing you to fetch data from a number of sources. Choose Dataverse, as shown:
This lets you select the data source, i.e. the CRM/Dataverse environment endpoint.

Look at the Authentication kind dropdown. Select Service principal and fill in the Tenant ID, Client ID, and secret key that you created on the Azure side.
If you select Organizational account, the connection will be tied to your personal user login only, which is certainly not intended.
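Continuing the sketch above, a quick call to the Dataverse Web API confirms the service principal can actually read the Account table (this reuses result and org_url from the previous snippet):

```python
# Sketch: sanity-check read access to the Account table via the Dataverse Web API.
# pip install requests
import requests

headers = {
    "Authorization": f"Bearer {result['access_token']}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}

resp = requests.get(f"{org_url}/api/data/v9.2/accounts?$top=3", headers=headers)
resp.raise_for_status()
for row in resp.json()["value"]:
    print(row.get("name"))
```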
Click Next to select which Dataverse table(s) you want to fetch data from:
For this demo, I am selecting the Account table and clicking Next. Here you can set/change the mappings, add transformations, and click Next to review and start the copy activity:
This will trigger the pipeline copy activity to bring the data from Dataverse to the Lakehouse, and it will show Succeeded once completed:
And now you can double-check the data by coming back to your Lakehouse >> Schemas >> dbo >> Tables >> account: it has been populated with the data you've just imported:
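If you'd rather verify programmatically, a Fabric notebook attached to this Lakehouse exposes a ready-made spark session. A short sketch, assuming the table landed as dbo.account as shown above:

```python
# Sketch: peek at the copied rows from a Fabric notebook (built-in `spark` session).
df = spark.sql("SELECT accountid, name FROM dbo.account LIMIT 10")
df.show(truncate=False)
print("Row count:", spark.table("dbo.account").count())
```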

What's next? You can feed this data into a Power BI report, or process it further using various transforms and other utilities.
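For instance, here is one illustrative PySpark transform (table and column names are assumptions based on the standard Account schema and the steps above): aggregate accounts by city and save the result as a new Lakehouse table that a Power BI report could pick up.

```python
# Sketch: a simple aggregation written back to the Lakehouse as a new table.
from pyspark.sql import functions as F

accounts = spark.table("dbo.account")
by_city = (
    accounts.groupBy("address1_city")  # standard Dataverse account column
            .agg(F.count("*").alias("account_count"))
            .orderBy(F.desc("account_count"))
)
by_city.write.mode("overwrite").saveAsTable("dbo.accounts_by_city")  # hypothetical name
```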
Ok, cool 👐👐
Next, I will come back with another blog, describing how to bring data from D365 F&O to Microsoft Fabric, using Pipeline. Till then, take care, much love, as always 💓💓💓


  
