Hello,
I am attempting to set up a pipeline from Dataverse to Azure SQL. The goal is to re-point all our Power BI reports from the soon-to-be-deprecated Data Export Service to an Azure SQL database fed from our ADLS Gen2 storage account.
I am following the Microsoft documentation at https://docs.microsoft.com/en-us/power-apps/maker/data-platform/azure-synapse-link-pipelines, and there seems to be an issue with step 10 under the "Configure the solution template" heading, where it gives these trigger parameters (see the sketch after the list for how they evaluate):
- Container:
@split(triggerBody().folderPath,'/')[0]
- Folder:
@split(triggerBody().folderPath,'/')[1]
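For context, here is a minimal Python sketch of what those two expressions evaluate to, assuming (inferred from the error below, not captured from an actual trigger run) that the storage event trigger delivers a folderPath like the hypothetical value shown here:

```python
# Hypothetical folderPath from the storage event trigger; inferred from the
# container and path names in the error message below.
folder_path = "dataverse-xxx-123/Microsoft.Athena.TrickleFeedService"

parts = folder_path.split("/")
container = parts[0]  # mirrors @split(triggerBody().folderPath,'/')[0] -> "dataverse-xxx-123"
folder = parts[1]     # mirrors @split(triggerBody().folderPath,'/')[1] -> "Microsoft.Athena.TrickleFeedService"

print(container, folder)
```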
Here is the error message I get:
Operation on target LookupModelJson failed: ErrorCode=UserErrorFileNotFound,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: Operation returned an invalid status code 'NotFound'. Account: 'mydatalake'. FileSystem: 'dataverse-xxx-123'. Path: 'Microsoft.Athena.TrickleFeedService/model.json'. ErrorCode: 'PathNotFound'. Message: 'The specified path does not exist.'. RequestId: '2d120e4c-1045-104f-6722-c9becd000000'. TimeStamp: 'Thu, 15 Sep 2022 12:57:24 GMT'.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.Azure.Storage.Data.Models.ErrorSchemaException,Message=Operation returned an invalid status code 'NotFound',Source=Microsoft.DataTransfer.ClientLibrary,'
I've completed all the prerequisites and followed every step. Ideally, the pipeline should copy the model.json file from each folder in the container (filesystem). There is a step that shows all the matching files (step 8), and the next step asks for the trigger parameters.
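In case it helps with diagnosis, here is a minimal sketch (assuming Python with the azure-storage-file-datalake and azure-identity packages and suitable RBAC on the account) that I can use to list where the model.json files actually sit in the filesystem; the account and filesystem names are copied from the error message above:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Account and filesystem names taken from the error message above.
service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("dataverse-xxx-123")

# Print every path in the container that ends with model.json, to see
# which folders the pipeline should actually be looking in.
for path in fs.get_paths(recursive=True):
    if path.name.endswith("model.json"):
        print(path.name)
```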
I saw another post with the same issue from two days ago, but it has no answers yet. Any ideas on how to resolve this? Thank you all for your time!