We're trying to build a data sync from BC into a data lake (in our case, Azure Blob Storage) where we export all attributes of specific entities and store them in our data lake. The process is meant to run periodically and apply incremental changes, so it only pulls records that have been updated since the last sync. We are doing this by querying against the SystemModifiedAt field. Our entities are pretty static, but they do change sometimes, so it would be nice if the data sync process could handle schema drift automatically.
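For anyone landing on this thread later, here is a minimal sketch of the incremental pull pattern described above, assuming the standard BC OData v4 API and the Python SDK for Azure Blob Storage. The entity name, container, watermark file, and auth token handling are all illustrative assumptions, not our actual setup.

```python
# Sketch: pull BC records changed since the last sync and land them as raw JSON.
# Assumes an OAuth token is obtained elsewhere and passed in.
import json
from datetime import datetime, timezone

import requests
from azure.storage.blob import BlobServiceClient

BC_BASE_URL = "https://api.businesscentral.dynamics.com/v2.0/<tenant>/<environment>/api/v2.0"
COMPANY_ID = "<company-guid>"            # assumption: company GUID resolved beforehand
ENTITY = "items"                         # assumption: any API entity works the same way
BLOB_CONN_STR = "<storage-connection-string>"
CONTAINER = "bc-raw"                     # assumption: landing container in the lake


def read_watermark(container_client) -> str:
    """Return the last successful sync timestamp, or a floor value on the first run."""
    try:
        blob = container_client.get_blob_client(f"{ENTITY}/_watermark.txt")
        return blob.download_blob().readall().decode()
    except Exception:
        return "1900-01-01T00:00:00Z"


def write_watermark(container_client, value: str) -> None:
    blob = container_client.get_blob_client(f"{ENTITY}/_watermark.txt")
    blob.upload_blob(value, overwrite=True)


def pull_changed_records(token: str, since: str) -> list[dict]:
    """Fetch only records modified after the watermark, following OData paging."""
    url = (f"{BC_BASE_URL}/companies({COMPANY_ID})/{ENTITY}"
           f"?$filter=systemModifiedAt gt {since}")
    headers = {"Authorization": f"Bearer {token}"}
    rows: list[dict] = []
    while url:
        resp = requests.get(url, headers=headers, timeout=60)
        resp.raise_for_status()
        payload = resp.json()
        rows.extend(payload.get("value", []))
        url = payload.get("@odata.nextLink")  # present when more pages remain
    return rows


def run_sync(token: str) -> None:
    service = BlobServiceClient.from_connection_string(BLOB_CONN_STR)
    container = service.get_container_client(CONTAINER)
    since = read_watermark(container)
    rows = pull_changed_records(token, since)
    if rows:
        # Writing the raw JSON keeps every attribute, so fields added in BC later
        # simply appear in newer files (schema drift handled at read time).
        now = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
        container.upload_blob(f"{ENTITY}/delta_{now}.json", json.dumps(rows), overwrite=True)
    write_watermark(container, datetime.now(timezone.utc).isoformat())
```

Landing the payload as raw JSON rather than a fixed-column format is one way to absorb schema drift: the lake side reads whatever attributes are present instead of depending on a table definition that has to be kept in sync with BC.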
We've currently tried the following approaches:
What we are thinking is that we now need to build a custom query/API in BC and create our own custom data connector for ADF. However, before we go down that path, does anybody have any other suggestions or methods they have used to sync data from BC into a data lake?
This looks like a promising option. Thank you very much!
We are on a project to do basically the same thing.
You should check out this GitHub repo:
Hi
Maybe look into the logic of the following app to see if you can find another method.