Hi all,
We have an issue after upgrading from Platform update 12 to Platform update 21: exporting a large volume of data (400K records) through a data entity. The target entity is a single custom one, so nothing complex there, but we have a huge number of records. Is there any consideration we should take into account to export such a big entity? The strange thing is that the same system worked fine before we updated to Platform update 21; we upgraded from Application version 7.1 with Platform update 12 to Application version 7.1 with Platform update 21.
Thanks
Hi Ievgen,
Thanks, we actually tried that, but it didn't help; the "Supports set-based SQL operations" property still has the value "No". For what it is worth, I solved my original problem and shared the solution in this post for future reference: community.dynamics.com/.../884406
Public entities do not support set-based SQL operations, so check that the "Is Public" property is set to "No".
Hi Ievgen,
Thanks for your response; you are right, I should have done more research before asking the question. This time I did my homework, but unfortunately I couldn't get the value of "Supports set-based SQL operations" to Yes. The link you provided says "In review properties to check in below screen shot, initially to create an Entity and after edit and reopen this entity then you found 'Support Set Based Sql Operation = YES'", but I am not sure what that means; in any case, the "Supports set-based SQL operations" property of the data entity is grayed out no matter whether I close and reopen it after creation. I also tried to make sure that none of the following conditions (from the blog post) apply to my target table:
However, I am not sure what some of them mean, like "Mixed SaveDataPerCompany"; in my case the "Data per company" property is set to No. Also, I have one single table (not a view; no joins or unions) as the data source, it is not Time State enabled, and there is no aggregated field either. So how can I make my data entity set-based?
Thanks a lot for your help
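By the way, for the per-company part, one quick sanity check is to look for a DataAreaId column on the target table directly in SQL, since per-company tables carry one. This is just a sketch; MYCUSTOMTABLE is a placeholder for the real table name:

    -- Per-company tables in D365FO have a DATAAREAID column.
    -- MYCUSTOMTABLE is a placeholder; substitute the actual target table.
    SELECT name
    FROM sys.columns
    WHERE object_id = OBJECT_ID('dbo.MYCUSTOMTABLE')
      AND name = 'DATAAREAID';

If the query returns no row, the table is stored across companies, which is consistent with "Data per company" = No.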
1) It's a property on the entity; here is a blog post talking about it: https://ajayvishwakarmaax.wordpress.com/2017/05/18/515/amp/
2) Here is an article about incremental export: https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/data-entities/entity-change-track (a rough sketch of the underlying mechanism follows below)
And don't be lazy: you could have found this on your own, since you had all the keywords. Do some investigation before asking your next question.
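For background: entity change tracking is built on the SQL Server change tracking feature. Roughly, what it enables under the hood looks like the following (MyDb and MYCUSTOMTABLE are placeholders, and in D365FO you turn this on from the Change tracking options on the Data entities page in the Data management workspace, not by running SQL yourself):

    -- Sketch of the SQL Server feature that entity change tracking builds on.
    -- The tracked table needs a primary key.
    ALTER DATABASE MyDb
        SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

    ALTER TABLE dbo.MYCUSTOMTABLE
        ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = ON);

    -- Rows changed since a given sync version can then be read with CHANGETABLE:
    SELECT ct.* FROM CHANGETABLE(CHANGES dbo.MYCUSTOMTABLE, 0) AS ct;

Once change tracking is enabled for the entity, an export project with "Incremental push only" as the default refresh type will pick up only the changed records.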
Hi Ievgen,
Thanks for the response.
I will have to validate this, but I am sure we export this at least a few times per week. Can you please help me try your suggestions:
1. How can we enable set-based operations? Is it at the SQL database level?
2. How would we export only the new records? Is that something to do with "Default refresh type"?
Thanks Again
400K is a big number, and I would generally be OK with it, especially for a one-off export. But if you are doing it daily, maybe you need to export only new records, or rethink the whole process.
Also, you may want to enable set-based operations, which should significantly improve performance.
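To give a rough idea of the difference (a simplified sketch, not the exact SQL that DMF generates; table, field, and variable names are placeholders): without set-based support, the framework inserts staging records one at a time, while a set-based entity can be copied in a single statement:

    -- Without set-based support: one INSERT per record, so ~400K round trips.
    INSERT INTO dbo.MYENTITYSTAGING (FIELD1, FIELD2) VALUES (@f1, @f2);

    -- With set-based support: one statement handles the whole set.
    INSERT INTO dbo.MYENTITYSTAGING (FIELD1, FIELD2)
    SELECT FIELD1, FIELD2
    FROM dbo.MYENTITYVIEW;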
We actually started our upgrade in the dev and test environments and already have a no-go for production because of this issue. So our dev and test environments experience the problem, but UAT and production are fine, as they are still on the older version, PU12.
I did clear the staging table, and after the run the number of records in it reaches 400K. If I observe the progress in staging while the job runs, I see ~7K records being added to the staging table every minute, so the 50-minute run time makes sense at that rate. I also used Activity Monitor (though I am a newbie with it), and among all the metrics I could see that the query "Insert into StagingTable" is executed 7K times per minute, which again matches what I observed myself. Is there any other metric I should look at? I see that the logging wait time is a bit long: 500 ms/sec. I am not sure if this helps, but the wait time for logging is the highest in the table.
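Besides Activity Monitor, a standard DMV query can list the statements consuming the most time (nothing D365-specific; it needs the VIEW SERVER STATE permission). For what it's worth, the "Logging" wait category in Activity Monitor usually corresponds to WRITELOG waits, which fits a workload of many small single-row inserts:

    -- Top 10 cached statements by total elapsed time.
    SELECT TOP 10
        qs.total_elapsed_time / 1000 AS total_elapsed_ms,
        qs.execution_count,
        SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
            ((CASE qs.statement_end_offset
                  WHEN -1 THEN DATALENGTH(st.text)
                  ELSE qs.statement_end_offset
              END - qs.statement_start_offset) / 2) + 1) AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.total_elapsed_time DESC;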
If it is 50 minutes now compared to 4 minutes, then definitely something is going on. Did you check from the SQL side whether any query is taking a long time? In which environment are you executing this? Is this happening in all environments? How many records are there in staging? Did you try to clear staging if it holds a huge volume?
Thanks guys for the prompt replies.
To Sukrut's questions: at the beginning I used to get errors; right now it is just slow, and I don't have any errors in "View execution log". The initial error was about "Duplicate records". I resolved it by making the index defined on the staging table non-unique. So right now there is no error, but we almost hit 50 minutes for the export, rather than 4 minutes when we were on PU12.
To your question, André: it is a manual export and then a manual re-import into another system, so not a recurring data job.
Thanks Again
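As an aside, before relaxing a unique index like that, it can be worth seeing which key values are actually duplicated in staging; a simple grouped count does it (the table and field names below are placeholders for the real staging table and its key):

    -- MYCUSTOMENTITYSTAGING and KEYFIELD are placeholders.
    SELECT KEYFIELD, COUNT(*) AS duplicate_count
    FROM dbo.MYCUSTOMENTITYSTAGING
    GROUP BY KEYFIELD
    HAVING COUNT(*) > 1;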
Hi Pop,
In addition to Sukrut's questions, I would like to know how you are using the export. Is it a manual export or a recurring data job? Are you running in the cloud or on-premises?
So, please answer all questions; this may be done in a single reply.