Hi all,
Every day the system receives 5 files of different types, each with more than 200k records. I want to import these files into Business Central (BC) and then combine them with another table to distribute the data they contain.
The files are in .csv format, with 10 columns each.
So far I have tried two ways of importing the files into a staging table, with a job queue that then reads the records and inserts the lines into a Price Staging table:
1. Reading the file with an XMLport and inserting the records into the Price Staging table, and
2. Since the columns differ between file types, using a Data Exchange Definition for field mapping and insertion into the file staging table. (A rough sketch of approach 1 is shown after this list.)
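For reference, this is roughly what approach 1 does today (a minimal sketch: the codeunit, the "Price Staging Line" table, and its fields are placeholder names, and only two of the 10 columns are shown):

codeunit 50100 "Import Price File"
{
    // Minimal sketch of approach 1; object and field names are placeholders.
    procedure ImportCsv(InStr: InStream)
    var
        PriceStaging: Record "Price Staging Line";
        CsvLine: Text;
        Values: List of [Text];
        EntryNo: Integer;
        UnitPrice: Decimal;
    begin
        while not InStr.EOS do begin
            InStr.ReadText(CsvLine);        // read one line of the CSV
            Values := CsvLine.Split(';');   // split into the 10 columns
            EntryNo += 1;

            PriceStaging.Init();
            PriceStaging."Entry No." := EntryNo;
            PriceStaging."Item No." := CopyStr(Values.Get(1), 1, MaxStrLen(PriceStaging."Item No."));
            if Evaluate(UnitPrice, Values.Get(2)) then
                PriceStaging."Unit Price" := UnitPrice;
            // ... map the remaining columns ...
            PriceStaging.Insert();          // one Insert call per record
        end;
    end;
}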
Neither method is efficient: the import takes more than 2 hours.
Is there a more efficient way to import?
P.S.: Not all lines imported into the staging table will end up inserted into the final table.