Hi,
I am experimenting with exporting Dynamics 365 F&O data to Azure Synapse Analytics using Spark Pools.
I have run into some difficulties and wanted to check whether anyone has faced them and/or found a solution:
- In one of my environments, the synchronization suddenly stopped. I can see files arriving in the data lake (one folder per export), but the Spark Pool is never triggered, so the data is not converted to Parquet. For me it's a black box, so I'd like to know what the cause could be, or whether there are any logs I can check. I have compared the security configuration with another environment that works, and it matches. There are two files in the EntitySyncFailure folder, but they date from days before the synchronization stopped.
- If I change the schema of a table, for example adding a field to a Dynamics 365 F&O table (using the Create Field function), the field shows up correctly (although it is created as varchar(40) even though in F&O it was varchar(10)). However, if I then increase the field size, say from 10 to 100, the column in the External Table is not widened automatically; as a result, any value over the 40-character limit causes a truncation error when querying. Is there a way to solve this other than dropping and recreating the whole table?
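  For reference, the only workaround I have found so far is to drop and recreate the external table by hand with the widened column. A rough sketch of what I mean is below; the table, column, data source, and file format names are just placeholders for my setup, not something generated by the sync:

  ```sql
  -- Hypothetical names: replace with your own table, external data source,
  -- and file format. This recreates the external table definition only;
  -- the underlying Parquet files in the lake are untouched.
  DROP EXTERNAL TABLE dbo.MyTable;

  CREATE EXTERNAL TABLE dbo.MyTable
  (
      RECID      BIGINT,
      MYFIELD    VARCHAR(100)  -- widened from the auto-generated VARCHAR(40)
  )
  WITH
  (
      LOCATION    = '/MyTable/',        -- folder for this table in the data lake
      DATA_SOURCE = MyDataLakeSource,   -- assumed external data source name
      FILE_FORMAT = MyParquetFormat     -- assumed Parquet file format name
  );
  ```

  Since external tables are just metadata over the files, this doesn't touch the data itself, but it is manual work per table, which is what I'd like to avoid.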
Looking forward to your replies.
Norbert