Copy Dataverse data to Azure SQL from Azure Synapse Pipelines


Hello,

I am attempting to set up a pipeline from Dataverse to Azure SQL. The goal is to re-point all our Power BI reports from the soon-to-be-deprecated Data Export Service to an Azure SQL database populated from our Gen2 storage account.

I am following the Microsoft documentation: https://docs.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-pipelines and there seems to be an issue with step 10 under the "Configure the solution template" heading, where it gives us these parameters (a quick sketch of what they resolve to follows the list):

  • Container: @split(triggerBody().folderPath,'/')[0]
  • Folder: @split(triggerBody().folderPath,'/')[1]
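
For context, here is a rough Python equivalent of what those two expressions are meant to do with the folder path the storage event trigger passes in (the folderPath value below is made up for illustration):

    # Hypothetical folderPath delivered by the storage event trigger;
    # the real value depends on your storage account layout.
    folder_path = "dataverse-xxx-123/Microsoft.Athena.TrickleFeedService"

    parts = folder_path.split("/")
    container = parts[0]  # first segment -> Container parameter
    folder = parts[1]     # second segment -> Folder parameter

    print(container, folder)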

Here is the error message:

Operation on target LookupModelJson failed: ErrorCode=UserErrorFileNotFound,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: Operation returned an invalid status code 'NotFound'. Account: 'mydatalake'. FileSystem: 'dataverse-xxx-123'. Path: 'Microsoft.Athena.TrickleFeedService/model.json'. ErrorCode: 'PathNotFound'. Message: 'The specified path does not exist.'. RequestId: '2d120e4c-1045-104f-6722-c9becd000000'. TimeStamp: 'Thu, 15 Sep 2022 12:57:24 GMT'.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.Azure.Storage.Data.Models.ErrorSchemaException,Message=Operation returned an invalid status code 'NotFound',Source=Microsoft.DataTransfer.ClientLibrary,'

I've completed all the prerequisites and followed the instructions at every step. Ideally, it should copy all the model.json files in every folder of the container (file system). There is a step where it shows all the matching files (step 8), and afterwards it asks for the trigger parameters.

I saw another post with the same issue from two days ago, but it has no answers yet. Any ideas on how to resolve this? Thank you all for your time!

  • MKennerley replied:
    RE: Copy Dataverse data to Azure SQL from Azure Synapse Pipelines

    Hi,

    The documentation is wrong for Azure Synapse. I think it will work if you are using Azure Data Factory.

    You need to use "trigger().outputs.body.folderPath" and NOT "triggerBody().folderPath" for it to work in Azure Synapse.

    Example:

    Container: @split(trigger().outputs.body.folderPath,'/')[0]

    Folder: @split(trigger().outputs.body.folderPath,'/')[1]
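
    If the error persists after switching the expressions, it can also be worth checking whether the path the lookup ends up building actually exists in the storage account. A minimal sketch, assuming the azure-storage-file-datalake and azure-identity packages, with the account, file system, and path copied from the error message above (adjust to your environment):

        from azure.identity import DefaultAzureCredential
        from azure.storage.filedatalake import DataLakeServiceClient

        # Values taken from the error message; replace with your own.
        account_url = "https://mydatalake.dfs.core.windows.net"
        file_system = "dataverse-xxx-123"
        path = "Microsoft.Athena.TrickleFeedService/model.json"

        service = DataLakeServiceClient(account_url=account_url,
                                        credential=DefaultAzureCredential())
        file_client = service.get_file_system_client(file_system).get_file_client(path)

        # True means the file is there and the trigger parameters resolve correctly;
        # False means the expressions are still producing a path that does not exist.
        print(file_client.exists())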

