Oracle insert running slow

Hi Guys,

I hope you will be able to help me out. We have a service that reads a flat file and then inserts the data from the flat file into the database. The insert service is called once for each record the flat file adapter reads. This works fine, but it takes 40 minutes to do 50,000 records, which is way too slow. Is there any way to read all the records and make one database call instead of one per record?

Any help greatly appreciated!!!

Thanks,
Seb

Guys,

Do you know how to insert, for example, 10 records at a time into Oracle?

We are struggling with performance when we insert them one by one.

Help Appreciated! :slight_smile:
Seb

Hi Seb,
first check the database table for indexes, triggers, and the like, since they are the enemy of massive loads. If any are there: drop the indexes and recreate them after the load; as for triggers, disable them (if you can) or move that data processing to a separate step.
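
For example, in plain JDBC the prep and cleanup around the load could look like this (just a sketch; the connection details, index, and trigger names are placeholders, not anything from your schema):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class BulkLoadPrep {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
             Statement st = con.createStatement()) {
            // Disable the trigger so the load skips per-row trigger work
            st.execute("ALTER TRIGGER MY_TABLE_TRG DISABLE");
            // Drop the index so Oracle does no index maintenance during the load
            st.execute("DROP INDEX MY_TABLE_IDX");

            // ... run the bulk load here ...

            // Rebuild the index and re-enable the trigger afterwards
            st.execute("CREATE INDEX MY_TABLE_IDX ON MY_TABLE (SOME_COLUMN)");
            st.execute("ALTER TRIGGER MY_TABLE_TRG ENABLE");
        }
    }
}
```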

If none of them are present, then you can follow this approach:

try
–open connection to database
–loop over the records in the file
----read a record
----insert the record into the database
–commit
–close connection
catch
–rollback
–close connection

this should avoid opening and closing a connection for each record.
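
In plain JDBC that flow looks roughly like this (only a sketch, assuming a simple comma-separated file and a made-up table MY_TABLE):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class SingleCommitLoad {
    public static void main(String[] args) throws Exception {
        Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
        con.setAutoCommit(false); // one transaction for the whole file
        try (BufferedReader in = new BufferedReader(new FileReader("records.txt"));
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO MY_TABLE (COL1, COL2) VALUES (?, ?)")) {
            String line;
            while ((line = in.readLine()) != null) {
                String[] fields = line.split(","); // assuming simple CSV records
                ps.setString(1, fields[0]);
                ps.setString(2, fields[1]);
                ps.executeUpdate(); // still one statement per record, but no per-record commit
            }
            con.commit(); // single commit at the end
        } catch (Exception e) {
            con.rollback(); // undo the partial load on any failure
            throw e;
        } finally {
            con.close();
        }
    }
}
```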

Let us know

Cheers,
Sandro

Thanks for the reply. It looks like we need to use the BatchInsertSQL service to do the bulk database insert. Has anyone used it before? We’re having some trouble getting it to work.

When connecting to the table we get this error, even though we don’t have any illegal characters:

com.wm.app.b2b.server.ServiceException: [ART.114.243] Adapter Runtime (Metadata): Failed to run resourceDomainLookupValues service. Details provided in error log.
[ADA.1.319] Cannot get the list of table columns. "
(22025/1424) ORA-01424: missing or illegal character following the escape character
"
ORA-01424: missing or illegal character following the escape character

on tables.columnInfo [Ljava.lang.String;@1570717

java.lang.Exception: Adapter values are not available.

Thanks,
Seb

Hi Seb,

You need to change the transaction type to “LOCAL_TRANSACTION” in your JDBC connection settings to use bulk insert. That’s the cause of the error, although the message is misleading.

Also, you may want to try different batch sizes to find the best performance; 200 records at a time worked best for me.

HTH,
Sridhar

Thanks! Got that resolved. Now we’ve run into two other issues:

  1. When the Array Document is linked to the BatchInput array document, for some reason only one record goes in. Shouldn’t the link load the entire array?
  2. How can I limit the number of records per insert?

Thanks for all the help!!!

We got it working. We were linking to the wrong document. We have redirected the data to a new document list and it seems to be working now.

S

Guess you have all issues resolved. In case you are still wondering about limiting the number of records inserted: basically, you control the number of records in the document list that you pass to the bulk insert service.

urs70, hehe… that is exactly what I am trying to figure out now. You said I can control the number of records in the doc list. Hmm… where do I set that again? :slight_smile:

Thanks,
S

seb,

There is no simple option to set the number of records in a list; it has to be done programmatically. Here are some options:

  1. In my case I just had to insert each line of a flat file into a table along with some other key fields, so I wrote Java services to return the flat file data one chunk of lines at a time. If you are interested in this approach I can try to dig up some of that code.

  2. Since you mentioned flat file processing, you can use the iterate option of the convertToValues service to read the whole file and return the records one at a time.

  3. Using convertToValues, create all the records. Then use a loop to transfer, say, 200 records at a time into a second array and insert those using bulkInsert. This may not be a good approach if your files are big, though.

The logic will be something like this

x = 200
Main Loop
–Loop x times or until no_more_records
----Logic to read/create records
–Loop end
–Insert records into table
–If no_more_records, exit Main Loop
Main Loop end
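
If it helps, here is a rough plain-Java version of that loop using JDBC batching (a sketch only; the file name, table, and columns are made up, and 200 is just the chunk size that worked for me):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ChunkedBatchInsert {
    static final int CHUNK = 200; // records per insert; tune for your setup

    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
             BufferedReader in = new BufferedReader(new FileReader("records.txt"));
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO MY_TABLE (COL1, COL2) VALUES (?, ?)")) {
            con.setAutoCommit(false);
            int pending = 0;
            String line;
            while ((line = in.readLine()) != null) {
                String[] fields = line.split(","); // assuming simple CSV records
                ps.setString(1, fields[0]);
                ps.setString(2, fields[1]);
                ps.addBatch();            // "Logic to read/create records"
                if (++pending == CHUNK) { // "Insert records into table"
                    ps.executeBatch();
                    pending = 0;
                }
            }
            if (pending > 0) {
                ps.executeBatch(); // flush the final partial chunk
            }
            con.commit();
        }
    }
}
```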

HTH,
Sridhar