I hope you will be able to help me out. We have a service that reads a flat file and then inserts the data from the flat file into the database. For each record the flat file adapter reads, the insert service is called. This works fine, but it takes 40 minutes to do 50,000 records, which is way too slow. Is there any way to read all the records and make one database call instead of one for each record?
Hi Seb,
First, check the database table for indexes, triggers, or anything similar, since they are the enemy of bulk loads. If any are present: drop the indexes and recreate them after the load; for triggers, disable them (if you can) or move that data processing to a separate step.
If none are present, you can follow this approach:
try
–open connection to database
–loop over files
----read a record
----insert the record into the database
–commit
–close connection
catch
–rollback
–close connection
This should help by avoiding opening and closing the connection for each record.
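To make this concrete, here is a rough plain-JDBC sketch of the pattern: one connection, one prepared statement reused for every record, and a single commit at the end. The table and column names (FLAT_FILE_DATA, REC_ID, REC_VALUE) and the Record class are just placeholders; inside Integration Server you would normally go through the JDBC adapter rather than raw JDBC.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class FlatFileLoader {

    // Inserts all records over one connection with a single commit.
    // Table/column names and the Record class are illustrative only.
    public static void load(String url, String user, String pwd, List<Record> records)
            throws SQLException {
        String sql = "INSERT INTO FLAT_FILE_DATA (REC_ID, REC_VALUE) VALUES (?, ?)";
        try (Connection con = DriverManager.getConnection(url, user, pwd);
             PreparedStatement ps = con.prepareStatement(sql)) {
            con.setAutoCommit(false);          // commit once at the end, not per record
            try {
                for (Record r : records) {
                    ps.setString(1, r.id);
                    ps.setString(2, r.value);
                    ps.addBatch();             // queue the insert instead of executing it now
                }
                ps.executeBatch();             // send all queued inserts together
                con.commit();
            } catch (SQLException e) {
                con.rollback();                // undo everything if any insert fails
                throw e;
            }
        }
    }

    // Minimal record holder for the sketch.
    public static class Record {
        final String id;
        final String value;
        Record(String id, String value) { this.id = id; this.value = value; }
    }
}
```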
Thanks for the reply. It looks like we need to use BatchInsertSQL to do the bulk database insert. Have you or anyone else used it before? I'm having some trouble getting it to work.
When connecting to the table we get this error, even though we don't have any illegal characters:
com.wm.app.b2b.server.ServiceException: [ART.114.243] Adapter Runtime (Metadata): Failed to run resourceDomainLookupValues service. Details provided in error log.
[ADA.1.319] Cannot get the list of table columns. "
(22025/1424) ORA-01424: missing or illegal character following the escape character
"
ORA-01424: missing or illegal character following the escape character
on tables.columnInfo [Ljava.lang.String;@1570717
java.lang.Exception: Adapter values are not available.
You need to change the transaction type to “LOCAL_TRANSACTION” in your JDBC connection settings to use bulk insert. That's the cause of the error, although the message is misleading.
Also, you may want to try different batch sizes to find the best performance. 200 records at a time worked best for me.
I guess you have all the issues resolved. In case you are still wondering about limiting the number of records inserted: basically, you control the number of records in the document list that you pass to the bulk insert service.
urs70, hehe… that is exactly what I am trying to figure out now. You said I can control the number of records in the doc list. Hmm… where do I set that again?
There is no simple option to set the number of records in a list; it has to be done programmatically. Here are some options:
In my case I just had to insert each line of a flat file into a table along with some other key fields, so I wrote Java services to return the flat file data one chunk of lines at a time. If you are interested in this approach I can try to dig up some of that code; a rough sketch of the idea follows the pseudocode below.
Since you mentioned flat file processing, you can also use the iterate option in convertToValues to read the whole file and return the records one at a time.
Or use convertToValues to create all the records, then use a loop to transfer, say, 200 records at a time to a second array and insert those using bulkInsert. This may not be a good approach if your files are big, though.
The logic will be something like this:
x = 200
Main Loop
–Loop x times or until no_more_records
----Logic to read/create records
–End loop
–Insert records into table
–If no_more_records, exit Main Loop
–Main Loop end
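Here is a rough Java sketch of that main loop: read up to 200 lines from the flat file, insert that chunk, and repeat until the file is exhausted. The bulkInsert method is just a hypothetical stand-in for whatever batch insert service you actually call (e.g. a flow service wrapping the JDBC adapter's BatchInsertSQL), and the mapping of each line to your record structure is left out.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class ChunkedLoader {

    private static final int BATCH_SIZE = 200;  // tune this; 200 worked well for me

    public static void process(String flatFilePath) throws IOException {
        try (BufferedReader reader = new BufferedReader(new FileReader(flatFilePath))) {
            List<String> batch = new ArrayList<>(BATCH_SIZE);
            String line;
            while ((line = reader.readLine()) != null) {
                batch.add(line);
                if (batch.size() == BATCH_SIZE) {
                    bulkInsert(batch);          // insert 200 records at a time
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                bulkInsert(batch);              // insert the final partial batch
            }
        }
    }

    // Hypothetical placeholder for the real batch insert service
    // (e.g. a flow service built on the JDBC adapter's BatchInsertSQL).
    private static void bulkInsert(List<String> records) {
        // map each line to the document list your insert service expects,
        // then invoke the batch insert with the whole list
    }
}
```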