
Microsoft Dynamics 365 | Integration, Dataverse...
Suggested answer

Copy Dataverse data into Azure SQL


Hi,

We are migrating from the Data Export Service (DES) to Azure Synapse Link. We were able to export data from Dataverse to the Data Lake, but we are facing issues with the Data Factory pipeline configuration that exports data from the Data Lake to Azure SQL.

We are following this doc:

https://docs.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-pipelines#use-the-solution-template-in-azure-data-factory

In step 10 it has parameters for Container and Folder when configuring the trigger:

[Screenshot: Container and Folder trigger parameters]

When we run the pipeline, we get this error:

Operation on target LookupModelJson failed: ErrorCode=AdlsGen2OperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: 'filesystem' does not match expected pattern '^[$a-z0-9](?!.*--)[-a-z0-9]{1,61}[a-z0-9]$'.. Account: 'dverep00test02'. FileSystem: '@split(triggerBody().folderPath,''. Path: '')[0]/@split(triggerBody().folderPath,'/')[1]/model.json'..,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.Rest.ValidationException,Message='filesystem' does not match expected pattern '^[$a-z0-9](?!.*--)[-a-z0-9]{1,61}[a-z0-9]$'.,Source=Microsoft.DataTransfer.ClientLibrary,'

I don't know how to inspect what triggerBody().folderPath contains or returns; I don't see any console where I could evaluate the expression to understand why the value of @split(triggerBody().folderPath,'/')[0] does not match the pattern.

The blob container has data (CSV files) and model.json (with data) in the root of the container.

We followed the instructions 100% and recreated the pipeline several times; it didn't help.

I did not find any info about this error on the net.

Has anyone had the same problem?

I would appreciate any help.

Thanks.

  • Brandon Duncan
    RE: Copy Dataverse data into Azure SQL

    Could you look at your pipeline run for DataverseToSQL_Orchestrator and view the parameters passed in? You should see the values if everything is set up correctly.

    The error you're describing makes me think that the event trigger is not configured correctly and is passing the literal string '@split(triggerBody().folderPath,'/')[0]' through to the pipeline.
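
    For comparison, here is a minimal sketch of how the trigger should hand those values to the pipeline, assuming the template's DataverseToSQL_Orchestrator pipeline and its Container/Folder parameters (the trigger name and the blobPathEndsWith value are illustrative):

        {
          "name": "DataverseToSQL_EventTrigger",
          "properties": {
            "type": "BlobEventsTrigger",
            "typeProperties": {
              "blobPathEndsWith": "/model.json",
              "ignoreEmptyBlobs": true,
              "events": [ "Microsoft.Storage.BlobCreated" ]
            },
            "pipelines": [
              {
                "pipelineReference": {
                  "referenceName": "DataverseToSQL_Orchestrator",
                  "type": "PipelineReference"
                },
                "parameters": {
                  "Container": "@split(triggerBody().folderPath,'/')[0]",
                  "Folder": "@split(triggerBody().folderPath,'/')[1]"
                }
              }
            ]
          }
        }

    The point is that Container and Folder must be entered as dynamic content (expressions starting with @) so they are evaluated at run time; if they reach the pipeline as literal text, you get exactly the filesystem-pattern error above.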

  • asdfgh
    RE: Copy Dataverse data into Azure SQL

    Any updates? I have also followed the docs 100% and I'm receiving this error.

  • A Wennerwik
    RE: Copy Dataverse data into Azure SQL

    Hi, I tried this twice now and get the same error.

  • A Wennerwik
    RE: Copy Dataverse data into Azure SQL

    Hi,

    Stupid question, but where do I find the file system name of my Azure Data Lake Storage Gen2 account?

  • Suggested answer
    DAnny3211
    RE: Copy Dataverse data into Azure SQL

    Hi,

    It looks like the error message is related to the format of the file system name in Azure Data Lake Storage Gen2. According to the error message, the file system name should match the pattern '^[$a-z0-9](?!.*--)[-a-z0-9]{1,61}[a-z0-9]$', which means it should start with a dollar sign, a lowercase letter, or a digit; contain only lowercase letters, digits, and hyphens (never two consecutive hyphens); and end with a lowercase letter or a digit.

    Based on your description, it seems that the error occurs when the pipeline tries to access the file system name specified in the folderPath parameter. To troubleshoot this issue, you can try the following steps:

    1. Check the file system name of your Azure Data Lake Storage Gen2 account, and make sure it meets the format requirement mentioned above.

    2. Check the value of the folderPath parameter in your pipeline trigger. You can do this by adding a new activity to your pipeline that writes the value to the pipeline output, and then checking the output in the monitoring tab (see the sketch after these steps).

    3. If the folderPath parameter is correct, try modifying it to include only the file system name and the folder path, without the file name. For example, if your file system name is "myfilesystem" and the file you want to access is located in the "folder1/folder2" folder, the folderPath parameter should be "myfilesystem/folder1/folder2" (no leading slash; otherwise the first element of the split is an empty string, which fails the pattern).

    4. If none of the above steps help, you can try creating a new Azure Data Lake Storage Gen2 account and using it for your pipeline. Make sure to follow the format requirement for the file system name when creating the new account.
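
    For step 2, a minimal sketch of such an activity, assuming you first declare a String variable on the pipeline (debugValue is an illustrative name); the resolved value then shows up in the activity output on the monitoring tab. Note that triggerBody() is only available in the trigger definition itself, so inside the pipeline you inspect the Container and Folder parameters the trigger passed in:

        {
          "name": "DebugTriggerValues",
          "type": "SetVariable",
          "typeProperties": {
            "variableName": "debugValue",
            "value": "@concat(pipeline().parameters.Container, ' | ', pipeline().parameters.Folder)"
          }
        }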

    DAniele

  • A Wennerwik
    RE: Copy Dataverse data into Azure SQL

    Hi,

    I have 100% followed the documentation and the validation comes through, but I still get the following error when I run the trigger:

    Operation on target LookupModelJson failed: ErrorCode=AdlsGen2InvalidFolderPath,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The folder path is not specified. Cannot locate the file 'model.json' under the ADLS Gen2 account directly. Please specify the folder path instead.,Source=Microsoft.DataTransfer.ClientLibrary,'

    It's rather frustrating that MS can't supply documentation that actually works! 

  • AnisurR
    RE: Copy Dataverse data into Azure SQL

    Hi seimor,

    What I understood from the pipeline and dataflow (DataverseCDMToSQL) is that only the changed records are being written to the SQL database, not the entire table data.

  • MKennerley
    RE: Copy Dataverse data into Azure SQL

    I have noticed that the Template has been updated.

    It is quite different from the original one I had issues with. That said, the updated template also had problems that I had to fix when I last looked at it in November.

    It may have changed again by now!

    Thanks

    Mark

  • raquifo
    RE: Copy Dataverse data into Azure SQL

    I was successful with this solution and the template "Copy Dataverse data into Azure SQL using Synapse Link".

  • Suggested answer
    seimor
    RE: Copy Dataverse data into Azure SQL

    Hi,

    Mine ended up working after I deleted the folder parameters in the dataflow; there are several folder parameters in the dataflow that you have to delete. For the container parameter, enter your container name.

    One thing to note: if you have a lot of constantly changing data, do not use the "event" trigger they mention, because it gets extremely expensive. When your data is constantly moving, the trigger can fire multiple pipeline runs every second, and each run costs money depending on how many entities and rows of data you have in your container.

    I would suggest using a schedule trigger instead. Check how long the first pipeline run takes and, for testing, set the schedule interval based on that duration.
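
    For example, here is a minimal sketch of an hourly schedule trigger, assuming the template's pipeline and parameter names; since a schedule trigger has no triggerBody(), the container name is hardcoded (the trigger name, start time, and container value are illustrative):

        {
          "name": "DataverseToSQL_ScheduleTrigger",
          "properties": {
            "type": "ScheduleTrigger",
            "typeProperties": {
              "recurrence": {
                "frequency": "Hour",
                "interval": 1,
                "startTime": "2023-01-01T00:00:00Z",
                "timeZone": "UTC"
              }
            },
            "pipelines": [
              {
                "pipelineReference": {
                  "referenceName": "DataverseToSQL_Orchestrator",
                  "type": "PipelineReference"
                },
                "parameters": {
                  "Container": "your-container-name"
                }
              }
            ]
          }
        }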

    The only thing I am uncertain about is whether the documentation + pipeline copies all the entities and rows every time, or whether it only picks up changes and writes the updated rows instead of copying the entire structure.

    Many people seem to have the same issue as us, due to the very poor documentation and the cost.

    I hope this helps!
