Multiple updates to DB

Hi All,
I have a flat file which may contain more than 20,000 lines, where each line represents a record that needs to be updated in the database. I parse each line, perform mapping, validation, etc., and then call the “execSQL” service to update the database. So I am calling execSQL 20,000 times in a loop. As expected, performance is very poor in this case.
Is there any way I can send a batch (say, 200) of update statements to the database at a time? Does the execSQL service allow us to do this?
Please give your valuable comments, and also suggest the best way to improve performance in this case.

Thanks in advance
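For readers finding this thread later: the general technique being asked about is statement batching, i.e. sending one parameterized statement with many parameter rows instead of one round trip per row. A minimal sketch of the idea, using Python's stdlib sqlite3 module purely as a stand-in for the real DB2/webMethods setup (the table and data here are hypothetical):

```python
import sqlite3

# In-memory database standing in for DB2; a table with a primary key
# "num" and a value column "val", like the one described in this thread.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ID (num INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO ID (num, val) VALUES (?, ?)",
                 [(i, "old") for i in range(5)])

# Slow pattern: one statement per parsed line (what the 20,000-call loop does).
for num in range(5):
    conn.execute("UPDATE ID SET val = ? WHERE num = ?", ("slow", num))

# Batched pattern: a single call carrying all the parameter rows at once.
rows = [("fast", num) for num in range(5)]
conn.executemany("UPDATE ID SET val = ? WHERE num = ?", rows)
conn.commit()
```

The same shape exists in JDBC as `addBatch`/`executeBatch`, which is what a custom Java service would typically use.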

Which DB are you using? And what operation are you performing with execSQL?

Use the pub.db:update service, which will batch-update all the rows.

But before that, make sure the structure of your document list (containing the records) exactly matches the field names of the table you are updating in the DB, and map the document list to $set (a parameter of the update service).

This will update all of your line items in a single call, whatever the count.


Hi Vinod,
We have DB2 here, and I am using an UPDATE statement in execSQL.


Well, each line in the flat file represents a unique record in the DB, which means each of my update statements would update only one row in the database. So I would have to call pub.db:update the same number of times as before, and performance would remain the same.
Let me know if I can send a bunch of update statements at a time using any of the built-in services in the WmDB package.


Hi Hari,

RMG's solution is correct. If you are parsing the flat file records/lines into records, they all get mapped to a record list. Calling the update service then updates the table with the records in that record list as one batch. There is no need to loop or branch; the mapping is direct. You do not need to call the service 20,000 times. Follow RMG's advice.


Following up on the bulk update: each line in your flat file represents a record in the same table in the database, right, Ramesh?

If you are trying to update multiple records in the same table, you can create a record list and then invoke the pub.db:update service to do a batch update.


Hi All,
I was not able to use the update service as you suggested.
Let me explain the situation:
say I have a table called ID, which has one primary key column named “num” and one more column called “val”.

The structure of my flat file is in this form:


I have 20,000 such lines, each of which is unique.
Can you please tell me what to specify in $criteria and $set for such a situation?
I would really appreciate any comments/suggestions.


I believe you can ignore $criteria (if it is required, specify the primary-key field criteria, which will update the rows that match), and map your document list (containing the 20,000 lines) directly to $set, which should work.

Have you tried this first? Please let us know…

I did map a document list to $set, but $set appears to be a document, not a document list. The Built-In Services Guide says $set is a document list, but when I invoke update and look at the pipeline contents under “Service In”, $set is displayed as a document. So when I map a document list to $set, $set is set only to the first document in the document list.

Can you please tell me the record-list structure (for the table I described in my previous message) that I need to map to $set?
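In plain SQL terms, the $criteria/$set split being discussed is just a keyed per-row update: $set carries the new column values and $criteria becomes the WHERE clause on the primary key, so each record must carry both. A sketch of that shape, again using stdlib sqlite3 as a stand-in (the dict keys matching the column names are the analogue of the document fields matching the table fields):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ID (num INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO ID (num, val) VALUES (:num, :val)",
                 [{"num": i, "val": "old"} for i in range(3)])

# Each parsed flat-file line becomes one record whose field names match
# the table's columns: "val" plays the $set role, "num" the $criteria role.
records = [{"num": 0, "val": "a"},
           {"num": 1, "val": "b"},
           {"num": 2, "val": "c"}]
conn.executemany("UPDATE ID SET val = :val WHERE num = :num", records)
conn.commit()
```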


Hi Hari,

Then use a LOOP step, specifying your document list as the input array. Inside the loop, call your update service and map the current document (the loop presents each element of the document list as a single document) to $set. Before that, connect and call startTransaction; after the updates, call commit and clearTransaction, then close. In the catch block, call rollback, clearTransaction, and close the DB connection.

Hope this works fine. Otherwise, write a Java service.
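The loop-plus-transaction shape described above can be sketched in stdlib Python (sqlite3 rather than Flow, so this is only an illustration: connect/startTransaction/commit/rollback map onto `connect`, the implicit transaction begin, `commit()`, and `rollback()`). A deliberately bad record shows why the rollback in the catch block matters:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ID (num INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO ID (num, val) VALUES (?, ?)",
                 [(i, "old") for i in range(3)])
conn.commit()

records = [(0, "a"), (1, "b"), (2, None)]  # last record is deliberately invalid

try:
    # startTransaction analogue: sqlite3 implicitly opens a transaction on
    # the first write, so all updates below form one unit of work.
    for num, val in records:
        if val is None:
            raise ValueError("bad record for num=%d" % num)
        conn.execute("UPDATE ID SET val = ? WHERE num = ?", (val, num))
    conn.commit()   # commit + clearTransaction analogue
except Exception:
    conn.rollback() # catch-block rollback: the first two updates are undone
```

After the failure, every row still holds its original value; without the rollback, the table would be left half-updated.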