Dynamics 365 Community / Blogs / Learn with Subs

Ingest D365F&O into Azure Synapse Analytics, using no-code Pipelines and Linked Services

Subhad365, User Group Leader



Azure Synapse Analytics is amazing and has many hidden gems. This blog covers one such awesome feature, which can help you bring data from any source and take it to any data sink/destination. We will fetch data from D365F&O and land it in Azure Blob storage using pipelines, without writing a single line of code.


To do this, follow these steps:

a. Keep your Azure app registration details ready. Don't forget to register your app in your Entra ID tenant and give appropriate API permissions >> Grant admin consent for Dynamics ERP.
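To see what the app registration details are used for, here is a minimal sketch of the OAuth2 client-credentials request that the linked service issues behind the scenes. All values below are illustrative placeholders, not real credentials, and the environment URL is hypothetical.

```python
# Values captured from the app registration (placeholders, for illustration only)
TENANT_ID = "00000000-0000-0000-0000-000000000000"   # Entra ID (tenant) ID
CLIENT_ID = "11111111-1111-1111-1111-111111111111"   # Application (client) ID
CLIENT_SECRET = "<your-client-secret>"               # from Certificates & secrets
FNO_BASE_URL = "https://yourenv.operations.dynamics.com"  # hypothetical environment

# The token endpoint and form body for the client-credentials grant
token_endpoint = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"
token_request_body = {
    "grant_type": "client_credentials",
    "client_id": CLIENT_ID,
    "client_secret": CLIENT_SECRET,
    "scope": f"{FNO_BASE_URL}/.default",  # request access to the F&O environment
}
```

Synapse performs this exchange for you; the sketch only shows why the tenant, client ID, and secret must all be on hand before you start.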

b. Create a new Azure Synapse Analytics Workspace, by filling out the following form in the Azure portal:



Click on Review + Create >> Create to conclude the workspace creation. Be sure to select the appropriate Region. The deployment will take some time to complete. Once done, click on View Studio, which will take you to Synapse Analytics Studio.


c. Click on Data >> Linked Tab >> + sign:



Select Integration dataset to continue:




Select Ax from the filter as shown in the following screen:




Give it a proper name, and click New from the dropdown:




This will open up the following form:




Fill out the form:
>> URL: the environment's base URL followed by /data.
>> Tenant: populated automatically once you enter the URL.
>> Service principal Id: the Application (client) ID.
>> Service principal key: the client secret.

Click on Create. This will create the linked service, and it will take a while to load the Path (the table/entity you want to export the data from):


Example: I selected Vendor --

Select Ok. Click Publish to continue.
d. Click on the Home button to continue:



Click on Ingest:


From the screen below you can choose how the copy task should run:


>> Run once now
>> Run periodically on a schedule
>> Additional settings such as the Tumbling window trigger -- a series of fixed-size, non-overlapping, contiguous time intervals.
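To make the tumbling-window idea concrete, here is a minimal sketch (not Synapse code, just an illustration) that slices a time range into fixed-size, non-overlapping, contiguous intervals, exactly the shape of windows a tumbling window trigger produces:

```python
from datetime import datetime, timedelta

def tumbling_windows(start, end, size):
    """Fixed-size, non-overlapping, contiguous intervals covering [start, end)."""
    windows = []
    cursor = start
    while cursor < end:
        # Each window begins exactly where the previous one ended
        windows.append((cursor, min(cursor + size, end)))
        cursor += size
    return windows

# One day sliced into 6-hour tumbling windows -> four back-to-back intervals
wins = tumbling_windows(datetime(2024, 1, 1), datetime(2024, 1, 2), timedelta(hours=6))
```

Because the windows never overlap and never leave gaps, each pipeline run processes exactly one slice of data, which is what makes this trigger type attractive for incremental loads.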




For now, select the first option >> Next.
In the following screen, select Source Type as Ax, and then select the source as Vendor, which you created in the previous step:


Click Next again; this step will take a while to complete, as it checks/tests the connection. From here, we will select VendorsV3:

Select Next to continue.
Select Destination type as Azure Blob storage >>  New connection:
The following window will open:


Give a proper name to your connection and select the necessary blob storage, as shown. Click Create to continue.
This will bring you back to the previous screen. You can furnish additional details; select the Browse button to choose the blob container:

Click Next >> The following screen will show the gist of the operation (e.g., the file delimiter details):

Click on Next to continue. Give a proper name to your Task:

Finally, it will summarize everything you have selected, as follows:


And on clicking Next, it will show you the status of the process/initiation of the task:
Click on Finish to get started.
Click on Monitor >> wait for the pipeline to complete the process:


You can now get back to your blob folder and see the outcome of the process. Click on Edit on the blob to view the content:
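If you want to work with the exported file outside the portal, it is just a delimited text file. Here is a small sketch that parses such content; the sample rows and column names are made up for illustration (the real columns depend on the VendorsV3 entity):

```python
import csv
import io

# Hypothetical sample of the delimited output the pipeline wrote to blob storage
exported = (
    "VENDORACCOUNTNUMBER,VENDORNAME\n"
    "V-001,Contoso Supplies\n"
    "V-002,Fabrikam Parts\n"
)

# DictReader maps each data row to the header columns
rows = list(csv.DictReader(io.StringIO(exported)))
for r in rows:
    print(r["VENDORACCOUNTNUMBER"], r["VENDORNAME"])
```

In practice you would download the blob first (for example with the Azure Storage SDK or Storage Explorer) and feed its text into the same parsing step.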

Whew!!!
Cool....that was really an awesome feat from Azure Synapse Analytics. We can further extend the whole process with a Notebook to transform data during execution.
See you soon, guys -- I will come back with more cool hacks on Data Engineering components (Microsoft Fabric, ADF, and Azure Synapse Analytics). Much love and Namaste 💓💓💓
