Hi Experts,
We have a very large volume of data from our legacy system that needs to be uploaded to FinOps at go-live. The problem is the size of the data set: roughly 50k customers and vendors, 50k projects, 0.5 million sales orders, 0.5 million purchase orders, 0.1 million general journals, etc.
Our current approach to uploading the data is as follows:
We load the data from the legacy system into staging tables in SQL (we are on-premises), and from there we run our business logic in a runnable class to process the records into FinOps documents. This is taking a very long time, almost 2-3 seconds per document. Can someone share their opinion on how we can reduce this time, and what the strategy should be given such a large volume of data? A simplified sketch of our current job is below.
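For reference, this is roughly what the processing job looks like today. It is only a minimal sketch: LegacySalesOrderStaging and its fields are placeholder names standing in for our actual staging tables, and the real business logic per document is much more involved, but structurally it follows this record-by-record pattern.

```
// Minimal sketch of our current import job.
// LegacySalesOrderStaging and its fields are placeholder names for our
// actual staging tables; the real per-document logic is much larger.
internal final class LegacySalesOrderImportJob
{
    public static void main(Args _args)
    {
        LegacySalesOrderStaging staging;
        SalesTable              salesTable;

        // Row by row: each staging record becomes one sales order, with
        // its own transaction and its own round trips to the database.
        while select forupdate staging
            where staging.IsProcessed == NoYes::No
        {
            ttsbegin;

            // Create the order header from the staging record.
            salesTable.clear();
            salesTable.initValue();
            salesTable.CustAccount = staging.CustAccount;
            salesTable.initFromCustTable();
            salesTable.SalesId     = NumberSeq::newGetNum(
                SalesParameters::numRefSalesId()).num();
            salesTable.insert();

            // ... per-line creation, validation and other business logic
            //     runs here for each document ...

            // Flag the staging record as processed.
            staging.IsProcessed = NoYes::Yes;
            staging.update();

            ttscommit;
        }
    }
}
```

The actual logic per document is of course more complex than this, but every document goes through this same one-at-a-time loop, which is where the 2-3 seconds per document is being spent.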
Thanks in advance.