Hello,
We are currently integrating with Dataverse in a production environment, using an App Service to push the data, following this logic:
We tried reducing the batch size to 10 records, then to a single record per batch, and also running the process without the parallel "for each" loop. In all cases, the calls still fail.
When we trigger the process, the code fails with the attached error, which suggests that a maximum field length is being exceeded. We have not found a way to determine which field is failing so that we can increase its length.
Is there any investigation tool that could help identify the issue? We created a pre-validation plugin to check all field lengths, but it does not appear to be triggered, which suggests the process is not even reaching the insert step in Dataverse.
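As a workaround while waiting for a better tool, we are considering a local pre-check on our side before the batch is sent, comparing each string value against the column's configured maximum. A minimal sketch of that idea is below; the field names and limits are placeholders, and in practice the limits would come from the table's metadata (e.g. each string column's MaxLength):

```python
# Hypothetical local pre-check: flag string values that exceed a
# column's maximum length before the record is sent to Dataverse.
# max_lengths would normally be built from the table metadata;
# the values used here are placeholders for illustration only.

def find_oversized_fields(record, max_lengths):
    """Return (field, actual_length, allowed_length) for every string
    value in `record` that exceeds its configured maximum."""
    violations = []
    for field, value in record.items():
        limit = max_lengths.get(field)
        if limit is not None and isinstance(value, str) and len(value) > limit:
            violations.append((field, len(value), limit))
    return violations

# Placeholder limits and a sample record with one oversized value.
max_lengths = {"name": 100, "description": 2000}
record = {"name": "Contoso", "description": "x" * 2500}

for field, actual, allowed in find_oversized_fields(record, max_lengths):
    print(f"{field}: {actual} chars exceeds max {allowed}")
```

Running a check like this over each batch before submission would at least tell us which field and record to look at, even without access to the production environment.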
Unfortunately, the customer cannot provide us with access to the production environment due to the sensitivity of their data.
Any assistance would be greatly appreciated.
Best regards,
Julien