This post contains multiple questions.
We are using the Data Import Export Framework (DIXF) in Dynamics AX 2012 R2.
1. While importing data for an entity, we have ticked 'Run business logic in insert or update' and 'Run business validations'. However, we find that only the validateWrite method of the corresponding table is fired. We are also observing that when validation fails, it is not reported as an error; it simply appears as an entry in the log section with the message raised by checkFailed in validateWrite. I was expecting a validation failure to mark the record as errored so that it could be reprocessed, and I was expecting the errors to be displayed clearly so that they could be identified and fixed. However, I do not see either happening. Am I missing anything?
2. The results of an import run appear in a small information window in the execution history under Error log (log text view). If we process millions of records and, say, one thousand of them fail validation, it would be very difficult to analyze the results in this small window. Doesn't the framework have a better way to view and analyze the run results?
3. In the above run I do not see the 'View error file' option enabled in the Error log window (from execution history), even though some records were not inserted due to validation failures in validateWrite. Is this normal behaviour?
I would appreciate any guidance on the above and, if possible, any tips on DIXF validation and error-handling best practices.
You have many questions in one post. I will try to explain each one briefly. If that is not sufficient, please let me know.
1) You can select 'Run business validations' and 'Run business logic in insert or update method' in two separate places:
a) Target entities, Entity structure
b) Select entities for processing group.
They should be checked in both places.
About the validation check: please provide an example so we can understand which entity is involved.
2) From the execution history you can view the staging records. The staging records have a status per record. You can also run a validation to check whether the values are correct (staging-to-target validation). It is also possible to drill into the log details (from the Error log window) if errors occur during import into the staging table.
I agree it is not perfect :-(
3) You can enable the file log within the Data Import Export Framework parameters.
Firstly, apologies for the late response to your answer; I was away for a few days.
Thank you very much for taking the time to answer each of my questions; it is really helpful.
Regarding the validations, I found, as you said, that we can select 'Run business validations' and 'Run business logic in insert or update method' at both the processing group level and the data source level.
I also found (courtesy of a Microsoft expert) that we have the option to enable "callValidateField" and "callValidateMethod" under Target entities -> Modify target mapping -> Mapping details if we wish to.
So while 'Run business validations' fires validateWrite() on the data sources, we can get finer-grained control using validateField.
In our test runs, taking into account these two new pieces of information (from you and from Microsoft), I also found that validateField is designed to throw an exception by default when a validation fails; we need not explicitly throw the exception ourselves, and the record gets identified as an error record by DMF. validateWrite, on the other hand, by default skips the rest of the processing and does not throw any error. We have to explicitly throw an exception in validateWrite if we want the record reported as an error. That is why the record was not being reported as an error when I used validateWrite. The moment I started raising an exception from validateWrite, DMF started recognizing the record as an errored-out record. So that solved the error-reporting problem.
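To illustrate the difference, here is a minimal sketch of a validateWrite override on a hypothetical custom table (MyTable and its Amount field are assumptions for the example). checkFailed only writes the message to the infolog, so DIXF does not mark the staging record as errored; throwing an exception is what makes the framework flag the record:

```xpp
// Sketch only - MyTable / Amount are hypothetical names for illustration.
public boolean validateWrite()
{
    boolean ret = super();

    if (this.Amount <= 0)
    {
        // checkFailed() alone: the message appears in the log,
        // but DIXF does not mark the record as an error record.
        ret = checkFailed("Amount must be greater than zero.");

        // Explicitly throwing is what makes DIXF recognize the
        // staging record as errored out, so it can be reprocessed.
        throw error("Amount must be greater than zero.");
    }

    return ret;
}
```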
Regarding the error reporting: since the standard error file is generated only for errors logged during the source-to-staging import, I guess that if we want to enhance the default error logging that happens in the infolog during the staging-to-target transfer, we have to write custom code to pull the data out of the back-end tables that store the actual errors. Can you please confirm?
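As a starting point for such custom reporting, the DIXF staging tables carry a per-record transfer status that can be queried directly. The sketch below assumes DMFProductEntity as an example staging table and uses the standard TransferStatus field; the specific execution id and field names should be verified against your environment:

```xpp
// Hedged sketch: list staging records that failed during a given run.
// DMFProductEntity is just an example staging table; "MyExecutionId"
// is a placeholder for an actual execution id from the run history.
static void listFailedStagingRecords(Args _args)
{
    DMFProductEntity staging;

    while select staging
        where staging.TransferStatus == DMFTransferStatus::Error
           && staging.ExecutionId    == "MyExecutionId"
    {
        info(strFmt("Failed staging record: %1", staging.ItemId));
    }
}
```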
Thanks again for taking time to answer my queries. It is really very enlightening.
Your story on the validate topic is also enlightening.
You can indeed enhance the default error logging, but take into consideration that this might impact performance. The next time I need this tool, I will certainly look for opportunities to improve the error reporting for better user-friendliness and performance. We can also make suggestions to Microsoft to improve the tool.
I went through the post but was not able to understand the purpose of the two check boxes. I tried the execution with and without the boxes marked, and it worked the same both ways.
Thanks in advance.
There are check boxes on the target entity settings as well as on the processing group. You have to enable them in both places for it to work.
Thanks for the message. I agree with your post, but can you give me an example of a scenario in which it would be useful? I have used the conversion, default, and auto-generate options for columns in entities while importing data from staging to target, but I wonder in which scenario I would use this.