Hi everyone.
I have a requirement to import over 15,000 vendor invoice journal lines, all into a single journal batch. The import would not move past 2,000 lines and was essentially stuck after 4 hours, so I decided to turn off the two business validations in the entity structure, since those settings are a common cause of slow imports. Please note that I am importing these in a batch process.
The import then ran fast and consistently and completed in under an hour. However, during posting we encountered errors on these lines: no exchange rate was found for the currency I had specified on the lines. That was odd, since we do have an exchange rate set up for it. So I did a test import of a single line with "Run business logic in insert or update method" turned on, and that line posted successfully.
That test made it clear that the business logic does something in the background while the record is being inserted. However, that business logic makes the import very slow. To mitigate this, I decided to divide the lines across several journal batches of 1,000 lines each.
But even with 1,000 lines the import is still slow: about 1 record inserted per second as I watch the import status. And not only that, the slowness is not consistent; it degrades steadily as time passes, and it is now down to about 1 record every 5 seconds. So far I have imported 3 sets, meaning 3,000 lines, in a whopping 12 hours.
Can someone please help me with this issue? I have also tried importing via X++ code, but to no avail. The business logic seems to run automatically when inserting through X++ as well, so the slowness is exactly the same as with the Data management import.
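To illustrate why a plain X++ import is just as slow: a standard table `insert()` runs the same business logic (initialization, defaulting, validation) that the entity runs, whereas `doInsert()` writes the record straight to the database and skips it. The sketch below is only a minimal illustration of that trade-off; the method name `importLines` and the container layout of each line are assumptions for the example, while `LedgerJournalTrans`, `initValue()`, `insert()`, and `doInsert()` are standard.

```xpp
// Hypothetical import loop, assuming each element of _lines is itself a
// container of (account type, ledger dimension, currency, debit amount).
public void importLines(LedgerJournalTable _journalHeader, container _lines)
{
    LedgerJournalTrans journalTrans;
    int i;

    ttsbegin;
    for (i = 1; i <= conLen(_lines); i++)
    {
        journalTrans.clear();

        [journalTrans.AccountType,
         journalTrans.LedgerDimension,
         journalTrans.CurrencyCode,
         journalTrans.AmountCurDebit] = conPeek(_lines, i);

        journalTrans.JournalNum = _journalHeader.JournalNum;
        journalTrans.initValue();

        // Slow path: runs the table's business logic, which (among other
        // things) defaults the exchange rate from the currency setup.
        journalTrans.insert();

        // Fast path: skips all business logic, so defaulted fields such as
        // the exchange rate stay empty and posting later fails.
        // journalTrans.doInsert();
    }
    ttscommit;
}
```

This matches the behavior described above: skipping the business logic makes the insert fast but leaves the exchange rate unpopulated, which surfaces as the "no exchange rate" error at posting time.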
Why would text/tab be any quicker? Can you advise/explain?
Hi Harvey.
Just import in Text format (TAB delimited).
Hi André, thanks for your help. Your advice is really helpful when dealing with huge numbers of records. I will definitely use this with other entities, especially the master data entities.
However, for this specific entity the import fails just as it is about to finish. I have not dug very deep into the journal entities, but most of them seem to throw errors when data is re-imported into the same journal. The same applies when multiple threads import into the same journal. I had no choice but to wait for this entity to finish uploading via the standard batch process; thankfully, it went well after several hours of waiting.
Anyway, I will mark this as the answer, since in common scenarios your advice would be really helpful.
Hi André,
I'll try your suggestion and will get back to you if I notice improvements. Thank you.
Hi Ludwig,
I have imported them using Data management, with Excel as the file type.
Unfortunately, all of the lines throw errors if "Run business logic in insert or update method" is set to No. My only workaround at the moment is to set it to Yes and wait for the import to finish.
Hi Harvey,
When you go to the (Data management) Framework parameters, there is a tab with Entity settings. When you then click the button Configure entity execution parameters, you can set up a task count. This determines the number of batch tasks created per entity, so the work will be divided over several threads. This will improve the performance.
Hi Harvey Oroceo,
How did you import your invoice lines?
Did you use an Excel template or the DIXF?
Once the data were imported into your journal but could not be completely posted, did you try making use of the 'post and transfer' posting option, which posts all correct vouchers and transfers the ones with errors to a new journal?
Best regards,
Ludwig