It will depend on your requirements. In case of an error, how do you want to roll back your transaction? Should everything be rolled back, or only the erroneous record?
Just for 5 records, I am not sure you will see any dramatic difference among the above 3 options. But batch insert will always be more efficient for a large number of records.
The database table structure is pretty flat, and I don't see many scenarios in which it would error out. In any case, if errors do occur, either rolling back one record or rolling back all of them would work.
I am confused by options 1 and 2. What do you mean by "a list contains 3 fields"? A list should contain multiple records, and each record will have "n" fields.
Let me first check whether I have your requirements right: you have n records in which Fields 1-3 have as many different values, and then Fields 4-5, which are "fixed", have the same value in every record, right?
If this is the case, I think this is easily handled by IS's "implicit mapping". In one MAP step, on pipelineOut, create Field4 and Field5 under ORG_DATA, then map the two fixed values to ORG_DATA/Field4 and ORG_DATA/Field5. Batch insert. Done.
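If you would rather do that step in a Java service than a MAP step, here is a minimal sketch of the same idea using IS's public IData API. The list name ORG_DATA and the field names Field4/Field5 come from the example above; the class and method names are just illustrative:

```java
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataUtil;

public class FixedFieldHelper {
    // Stamp the two fixed values onto every record of the list
    // before handing it to the batch-insert adapter service.
    public static void stampFixedFields(IData[] orgData,
                                        String field4, String field5) {
        for (IData rec : orgData) {
            IDataCursor c = rec.getCursor();
            IDataUtil.put(c, "Field4", field4);  // same value for all records
            IDataUtil.put(c, "Field5", field5);
            c.destroy();
        }
    }
}
```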
Here are some thoughts with respect to performance:
1. Use batch (local) insert for the list and use update (No) for the other 2 fields.
→ I've never used batch insert in wM, so I can't comment on this option.
2. Use appendToDocumentList to build the list of 5 fields, then use batch (local) insert for the list.
→ Having appendToDocumentList in a loop can be very expensive: each append copies the whole list into a new, larger array, so the total cost grows quadratically with the number of records. It's not recommended when processing thousands of records.
3. Loop over the list and insert (No) all 5 fields one record at a time.
→ I think this is the best option. LOOP over ORG_DATA, since it is a list, and map the static values Field4 and Field5 into the adapter service (see the sketch after this list).
The ART (Adapter Runtime) will keep the connection to the database open, so this will perform best. I can't think of any other option that would be faster.
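For reference, option 3 at the JDBC level amounts to reusing one prepared statement over a single open connection, which is roughly what the adapter does for you. A minimal plain-JDBC sketch, assuming a table ORG_DATA with columns FIELD1-FIELD5 (names taken from this thread's example, not from any real schema):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class RowByRowInsert {
    // Insert records one at a time over a single open connection,
    // reusing one prepared statement: roughly what LOOP + single-row
    // insert does when the adapter's connection stays open.
    public static void insertOneAtATime(Connection con, String[][] rows,
                                        String field4, String field5)
            throws SQLException {
        String sql = "INSERT INTO ORG_DATA"           // assumed table
                   + " (FIELD1, FIELD2, FIELD3, FIELD4, FIELD5)"
                   + " VALUES (?, ?, ?, ?, ?)";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            for (String[] row : rows) {      // row holds Field1-Field3
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.setString(3, row[2]);
                ps.setString(4, field4);     // fixed values
                ps.setString(5, field5);
                ps.executeUpdate();          // one round trip per record
            }
        }
    }
}
```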
If your document list grows to more than 10k records or so, I would recommend another way of doing it. Use a batch-size parameter, say 5000: do a batch insert of those 5000 records in one go, drop the list, start a new one, and repeat until everything is in the database.
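At the JDBC level, that chunking looks roughly like the sketch below. This is plain JDBC rather than the wM adapter, and the ORG_DATA table and FIELD1-FIELD5 columns are assumed from the earlier example:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class ChunkedInsert {
    // Batch-insert in chunks of batchSize so neither the in-memory
    // list nor the JDBC batch grows without bound.
    public static void insertInChunks(Connection con, String[][] rows,
                                      int batchSize) throws SQLException {
        String sql = "INSERT INTO ORG_DATA"           // assumed table
                   + " (FIELD1, FIELD2, FIELD3, FIELD4, FIELD5)"
                   + " VALUES (?, ?, ?, ?, ?)";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            int pending = 0;
            for (String[] row : rows) {          // row holds all 5 fields
                for (int i = 0; i < 5; i++) {
                    ps.setString(i + 1, row[i]);
                }
                ps.addBatch();
                if (++pending == batchSize) {    // flush every batchSize rows
                    ps.executeBatch();
                    pending = 0;
                }
            }
            if (pending > 0) {
                ps.executeBatch();               // flush the remainder
            }
        }
    }
}
```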
Remember that every time you append a document to a document list, a new array of size previous size + 1 is created and the existing data is copied into it. That drastically hurts performance and uses a lot of memory. Using a linked list instead of appendToDocumentList gives better results (see the sketch below).
This approach is only worthwhile when you have thousands of records to insert.
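To make the append cost concrete, here is a small Java sketch contrasting the grow-by-one copying that appendToDocumentList effectively does with accumulating records in a java.util.List and converting to an IData[] once at the end; the class and method names are illustrative:

```java
import java.util.ArrayList;
import java.util.List;

import com.wm.data.IData;

public class AppendCostDemo {

    // Growing an array by one element per append copies the whole
    // array each time: O(n^2) work overall for n appends. This is
    // effectively what appendToDocumentList does inside a loop.
    static IData[] appendSlow(IData[] list, IData rec) {
        IData[] bigger = new IData[list.length + 1];
        System.arraycopy(list, 0, bigger, 0, list.length);
        bigger[list.length] = rec;
        return bigger;
    }

    // Accumulating in a java.util.List and converting once at the
    // end is O(n) amortized and allocates far less garbage.
    static IData[] collectFast(Iterable<IData> records) {
        List<IData> buffer = new ArrayList<>();  // or LinkedList, as suggested
        for (IData rec : records) {
            buffer.add(rec);
        }
        return buffer.toArray(new IData[0]);
    }
}
```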