I keep getting this error (DataSource Error: Cdm failed to generate manifest. Folder path [...] file name: model.json) when trying to edit or create new entity data imports from the data lake (Azure Data Lake Storage Gen2 - CDM folder view). The error appears in the Power Query editor after selecting the model (database icon). Unfortunately, the information under "show details" is not helpful at all:
section Section1;

shared Query = let
        Source = Cdm.Contents(
            AzureStorage.DataLake(
                "https://[...].dfs.core.windows.net/",
                [HierarchicalNavigation = true]
            )
        )
    in
        Source;
The CDM folders have been created by Power BI dataflows and are used for other purposes (reports etc.), so I assume the content of model.json is correct. I can read most CDM folders from the data lake with no problem. Only some folders throw this error (not all the time, but 95% of the time). There is no data refresh going on at the source when the import fails.
Any thoughts on this? This is currently a blocker in our project so somewhat urgent.
PS: I know I could directly connect to the Common Data Model folders (not using Import/Power Query). However, this is not the preferred option because we cannot apply any transformations there (plus there are other limitations, like unsupported data types).
If I'm understanding correctly, you have the model.json file already created and are sure it works, and it looks like you're referencing it from Power Query, is that right? When you create the data source, there's an option to use "Connect to a common data model folder" - are you using that, Connect to Common Data Service, or Import data?
We have recently added the ability to work with the model file when using the "Connect to CDM folder" option when creating a data source. You should now be able to do some transforming of the data, changing types, etc. I will be publishing a CI in Under 5 video on this...this week :) https://aka.ms/ciunder5
Hi Bill, it's the Import option using Power Query.
Thanks John, I am aware of this option, but would prefer Power Query as it provides more transformation flexibility. But I will further explore this new option (and I'm looking forward to your video!)
However, why does the Power Query import work on some CDM folders and not on others? If I knew the reason, I might be able to work around it.
I see John answered, and he and Aditya are the ones who always save me, so I'll defer to him - but on your question about why it works on some folders and not others: I've done several implementations and not run into this, though I always create my own model.json files. I'd take a look at the files in ADL explorer if you can; I generally walk through each one before a deployment to make sure the URLs are accessible, the types match, etc. I've had a few problems where a key I defined needed to be a string, and date formats and control characters like CRLFs have caused a lot of problems. If you are comfortable posting a screenshot or the files, I'd be happy to take a look at them for you.
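The manual walkthrough described above (checking each entity's attribute types and partition URLs in model.json) can be scripted. A minimal Python sketch, assuming the standard model.json layout that Power BI dataflows emit (the `model` payload below is made up for illustration):

```python
import json


def summarize_model(model: dict) -> list[str]:
    """Walk a CDM model.json payload and list each entity's attributes and
    partition URLs, so types and partition locations can be eyeballed."""
    lines = []
    for entity in model.get("entities", []):
        lines.append(f"entity: {entity.get('name')}")
        for attr in entity.get("attributes", []):
            lines.append(f"  attribute: {attr.get('name')} ({attr.get('dataType')})")
        for part in entity.get("partitions", []):
            lines.append(f"  partition: {part.get('location')}")
    return lines


# Made-up minimal model.json content for illustration only; a real file
# would be loaded with json.load(open("model.json")).
model = {
    "name": "ExampleModel",
    "entities": [{
        "$type": "LocalEntity",
        "name": "Account",
        "attributes": [
            {"name": "accountid", "dataType": "string"},
            {"name": "createdon", "dataType": "dateTime"},
        ],
        "partitions": [
            {"name": "Account-1",
             "location": "https://example.dfs.core.windows.net/acct/1.csv"},
        ],
    }],
}

for line in summarize_model(model):
    print(line)
```

From the summary it is quick to spot a partition URL that is not accessible or an attribute whose declared type does not match the underlying data.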
I responded to the wrong reply, Johannes - but short answer: check the data first. Make sure there's nothing like CRLFs or control characters, check the date format and make sure dates comply with ISO 8601, and check field lengths and type conversions. You can also try editing the model.json to point to a file with two records that you know are perfect; use that to help eliminate whether it's access or something in the data.
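The two data checks suggested here (control characters inside fields, dates that are not ISO 8601) can be automated before pointing the import at a folder. A minimal Python sketch, assuming a CSV partition and a known set of date columns; the column names are assumptions for illustration:

```python
import csv
import io
import re

# ISO 8601 date or date-time, e.g. 2020-10-30 or 2020-10-30T12:00:00Z
ISO_8601 = re.compile(
    r"^\d{4}-\d{2}-\d{2}(T\d{2}:\d{2}:\d{2}(\.\d+)?(Z|[+-]\d{2}:\d{2})?)?$"
)
# Control characters, including stray CR/LF embedded in field values
# (tab is deliberately excluded here).
CONTROL = re.compile(r"[\x00-\x08\x0a-\x1f]")


def find_problems(text: str, date_columns: set[str]) -> list[str]:
    """Scan CSV content for embedded control characters and for date
    values that are not ISO 8601, reporting row and column for each hit."""
    problems = []
    reader = csv.DictReader(io.StringIO(text))
    for row_no, row in enumerate(reader, start=2):  # row 1 is the header
        for col, value in row.items():
            if value and CONTROL.search(value):
                problems.append(f"row {row_no}, column {col}: control character")
            if col in date_columns and value and not ISO_8601.match(value):
                problems.append(
                    f"row {row_no}, column {col}: non-ISO-8601 date {value!r}"
                )
    return problems


# Hypothetical usage against a partition file; the file name and date
# columns are illustrative, not from the thread:
# problems = find_problems(open("Account-1.csv").read(), {"createdon"})
```

Running this over the partition files of a failing folder (and comparing with a folder that imports cleanly) should narrow down whether the data or the access is at fault.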
Thanks Bill, the CDM folder and model.json are generated from a Power BI dataflow and work perfectly as input for a Power BI dataset, so the file itself cannot be corrupt as such.
However, I have already learned that CI tends to be picky with certain data types ("date", anyone?!) or field/table names. Are there more limitations we need to be aware of when loading data? A comprehensive list would be very useful.
Johannes, just to be clear, the reason I mentioned the model is just to check the file access (because container access can be problematic in a way I'm not sure Power BI runs into). But ISO 8601 is one of the most common date issues we run into. Control characters are another, anything like CRLFs, and schema drift if there are CSV files.