Quick one. I'm setting up Power BI reports that pull data from Dynamics 365, querying the tables with the Power Query builder.
I've brought in several date fields in my table and I'm looking to set up incremental refresh in Power BI. I've read somewhere that it's recommended to set up incremental refresh only for systems that support query folding, and I'm not sure whether Dynamics, and this method of getting the tables out of Dynamics, supports query folding. Is it a good idea to use incremental refresh for large datasets that I need to get out of Dynamics?
Support for query folding in Dynamics 365 varies depending on the connection method you are using. If you use FetchXML from the Power Query builder, then your query is already folded, as only the data included in the FetchXML will be retrieved. If you use the Common Data Service connector, then there doesn't seem to be much or any query folding, though I say that anecdotally. Using the SQL connector and the new TDS endpoint does allow query folding, although that connection is subject to a 2-minute timeout, so you need to be careful there.
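To illustrate the FetchXML case: the date filter below is evaluated by Dataverse itself, so only matching rows ever leave the server, which is effectively the same outcome as query folding. This is a minimal sketch using standard Dynamics 365 schema names (opportunity, createdon, modifiedon); add whatever attributes your report needs.

```xml
<!-- Retrieves only opportunities created on or after 1 Jan 2020;
     the filter runs server-side in Dataverse, not in Power Query -->
<fetch>
  <entity name="opportunity">
    <attribute name="name" />
    <attribute name="createdon" />
    <attribute name="modifiedon" />
    <filter>
      <condition attribute="createdon" operator="on-or-after" value="2020-01-01" />
    </filter>
  </entity>
</fetch>
```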
On another note, incremental refresh is generally intended for transactional data, where records don't change and you can make assumptions like "records older than 3 or 6 months never change". Dynamics 365 data is more volatile by nature, so the benefits you get from incremental refresh can be limited, if there are any at all. You can always try it out and see if the refresh time improves. If there are no or only limited benefits, just go back to the normal process.
By the way, I'm a huge fan and follow your blog extensively.
So the reason I'm looking at incremental refresh: say, for example, I've got two tables/entities that I want to bring into Power BI.
I want to bring in opportunities from say the 1st of January 2020 onwards indefinitely. I also need all the emails related to the opportunity because I want to aggregate the number of sent emails per opportunity but also to drill down to the individual emails.
The problem is that if I don't restrict the FetchXML query to a specific date range, or use one of the relative date filters like "created on equals this year", it will continue to pull more and more data from the CRM via the API using the Power Query builder. Especially with email activities, the CRM generates a lot of them. I'm just thinking of the overall refresh time as more and more data needs to be imported into Power BI.
This was why I was looking at incremental refresh. Basically, I want to import only data that's changed in CRM: for example, if the last modified date on an opportunity has changed, bring it in and update it; if a new email was created and sent, bring it in and create/update the record in Power BI.
This of course assumes the query folds back to CRM, so that it only brings in the records that were last changed, rather than retrieving everything and then applying the filter.
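For reference, Power BI's incremental refresh works through two reserved datetime parameters, RangeStart and RangeEnd, that you filter a date column against in Power Query; when that filter folds, only the affected partition is requested from the source. Below is a hedged M sketch against the TDS (SQL) endpoint, which is the connection method mentioned above as supporting folding. The server and database names are placeholders for your own environment URL.

```
let
    // TDS endpoint: your environment URL plus port 5558 (placeholder org name)
    Source = Sql.Database("yourorg.crm.dynamics.com,5558", "yourorg"),
    Opportunities = Source{[Schema = "dbo", Item = "opportunity"]}[Data],
    // RangeStart/RangeEnd are the reserved incremental-refresh parameters;
    // this filter should fold into a WHERE clause on the endpoint
    Filtered = Table.SelectRows(
        Opportunities,
        each [modifiedon] >= RangeStart and [modifiedon] < RangeEnd
    )
in
    Filtered
```

One caveat: incremental refresh partitions data by the filtered column, so partitioning on modifiedon means a record that changes again later is loaded into a newer partition while the stale copy stays in the old one. That is exactly the volatility issue raised above, so validate the results carefully.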
Also, is there another way of doing this more efficiently in Power BI?
Not sure if this helps, but you can "trickle" your CRM data into a data lake using Export to Data Lake (in make.powerapps.com); this pushes just the new/changed records, based on change tracking being enabled on the entities.