Microsoft Dynamics 365 | Integration, Dataverse...

Copy Dataverse data into Azure SQL


Hi,

We are migrating from Data Export Service (DES) to Azure Synapse Link. We were able to export data from Dataverse to the Data Lake, but we are facing issues with the Data Factory pipeline configuration that exports data from the Data Lake to Azure SQL.

We are following this doc:

https://docs.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-pipelines#use-the-solution-template-in-azure-data-factory

In step 10 it has parameters for Container and Folder when configuring the Trigger.

[screenshot of the Trigger configuration parameters]

When we run the pipeline we get this error:

Operation on target LookupModelJson failed: ErrorCode=AdlsGen2OperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: 'filesystem' does not match expected pattern '^[$a-z0-9](?!.*--)[-a-z0-9]{1,61}[a-z0-9]$'.. Account: 'dverep00test02'. FileSystem: '@split(triggerBody().folderPath,''. Path: '')[0]/@split(triggerBody().folderPath,'/')[1]/model.json'..,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.Rest.ValidationException,Message='filesystem' does not match expected pattern '^[$a-z0-9](?!.*--)[-a-z0-9]{1,61}[a-z0-9]$'.,Source=Microsoft.DataTransfer.ClientLibrary,'

I don't know how to inspect what triggerBody().folderPath contains or returns. I don't see any console where I can run this and get the value, so I can't understand why the return of @split(triggerBody().folderPath,'/')[0] does not match the pattern.
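One detail worth noticing in the error above: the FileSystem value is printed as the literal expression text ('@split(triggerBody().folderPath,'…'), not as an evaluated container name, which suggests the expression was never evaluated at all. A minimal Python sketch (the regex is copied from the error message; the container name is hypothetical) shows why the raw expression text can never pass the ADLS Gen2 filesystem-name validation:

```python
import re

# ADLS Gen2 container (filesystem) name pattern, copied from the error message
PATTERN = re.compile(r'^[$a-z0-9](?!.*--)[-a-z0-9]{1,61}[a-z0-9]$')

# The error prints the literal expression text as the FileSystem name,
# which suggests the trigger expression was never evaluated.
unevaluated = "@split(triggerBody().folderPath,'"
evaluated = "dataverse-myorg-unq123"  # hypothetical real container name

print(bool(PATTERN.match(unevaluated)))  # False: raw expression text fails validation
print(bool(PATTERN.match(evaluated)))    # True: a valid container name passes
```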

The blob container has the data (CSV files) and model.json (with data) in the root of the container.

We followed the instructions exactly and recreated the pipeline several times, but it didn't help.

I did not find any info about this error on the net.

Has anyone had the same problem?

I would appreciate any help.

Thanks.

  • seimor

    Hi artcrm ,

    I am also receiving the same error message and followed the same steps from the documentation. Any update on yours? Thank you.

  • artcrm

    Unfortunately no updates. :-(

  • seimor

    I am also trying to get this working so I can start mapping my Power BI reports to Azure SQL, since Data Export Service will be deprecated in November.

  • MKennerley

    Hi,

    The documentation is wrong for Azure Synapse. I think it will work if you are using Azure Data Factory.

    You need to use "trigger().outputs.body.folderPath" and NOT "triggerBody().folderPath" for it to work in Azure Synapse

    Example:

    Container: @split(trigger().outputs.body.folderPath,'/')[0]

    Folder: @split(trigger().outputs.body.folderPath,'/')[1]
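For reference, here is a small Python sketch of what those two expressions evaluate to, assuming the storage event trigger delivers a folderPath of the form "<container>/<folder>" (the sample value below is hypothetical; your real value depends on your Synapse Link setup):

```python
# Hypothetical folderPath delivered by a storage event trigger:
folder_path = "dataverse-myorg-unq123/model"

container = folder_path.split('/')[0]  # what @split(...,'/')[0] returns
folder = folder_path.split('/')[1]     # what @split(...,'/')[1] returns

print(container)  # dataverse-myorg-unq123
print(folder)     # model
```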

  • wajih

    I provided the value while creating the Trigger, but the Trigger never fired. What I did was use the container name in Container and "/" in Folder Name. The Trigger is working now; however, I get some other errors when the pipeline executes.

  • wajih

    hi artcrm,

    any progress on this issue?

    Thanks  

  • artcrm

    We are doing some different tests but no luck so far. Just getting different errors in the pipeline. Thanks.

  • Collisb (suggested answer)

    Same problem for us. Very frustrating.

    Check here for the solution that worked for us: github.com/.../3634

  • seimor (suggested answer)

    Hi,

    Mine ended up working after I deleted the folder parameters in the dataflow. There are several folder parameters in the dataflow that you have to delete. For the container parameter, enter your container name.

    One thing to note: if you have a lot of data constantly changing, do not use the "event" trigger they mention, because it gets extremely expensive. When your data is constantly moving, the trigger can create multiple pipeline runs every second, and each pipeline run costs money depending on how many entities and rows of data you have in your container.

    I would suggest using a schedule trigger instead. Check how long the first pipeline run takes and, for testing, set the schedule interval based on that duration.
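The cost difference between the two trigger types can be sketched with some rough arithmetic (all numbers below are hypothetical and for illustration only; real pipeline-run pricing depends on your region and activity mix):

```python
# Hypothetical comparison: event trigger firing on every file change
# vs. an hourly schedule trigger.
runs_per_day_event = 24 * 60 * 2      # e.g. files landing roughly every 30 s
runs_per_day_schedule = 24            # one run per hour
cost_per_run = 1.50                   # hypothetical $ per pipeline run

print(runs_per_day_event * cost_per_run)     # 4320.0 per day
print(runs_per_day_schedule * cost_per_run)  # 36.0 per day
```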

    The only thing I am uncertain about is whether the documented pipeline copies all the entities and rows every time, or whether it only adds updated rows when changes occur instead of copying the entire structure.

    Many people seem to have the same issue as us due to the very poor documentation, and the cost.

    I hope this helps!

  • raquifo

    I was successful with this solution and the template "Copy Dataverse data into Azure SQL using Synapse Link".
