How to split a Document List?

Hi.

I would like to know if there is a way to split a document list's records before sending them to a database.

We have some services that transport more than 50,000 records inside DocumentList objects, causing timeout errors when sending them to external services.

For example, if I have a document list with 5,000 records, I would like to send just 1,000 records at a time.

How can I do this?

Please see the attachment; I think it will help explain what I really want.

Thank you.

Can you use the JDBC adapter's BatchInsert SQL template for this DB workload, or a similar solution?
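
Outside the adapter, the same pattern is just plain JDBC batching; a minimal sketch, with made-up table, column names, and batch size for illustration:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;

public class BatchInsertExample {
    // Inserts records in chunks so a single huge batch does not time out.
    // Table and column names are placeholders for illustration only.
    public static void insertInBatches(Connection conn, String[][] records, int batchSize)
            throws Exception {
        String sql = "INSERT INTO customer (id, name) VALUES (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int count = 0;
            for (String[] rec : records) {
                ps.setString(1, rec[0]);
                ps.setString(2, rec[1]);
                ps.addBatch();
                // Flush every batchSize rows instead of sending everything at once.
                if (++count % batchSize == 0) {
                    ps.executeBatch();
                }
            }
            ps.executeBatch(); // flush any remaining rows
        }
    }
}
```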

HTH,
RMG

Hi RMG.

Yes, I can, and I have already used batchInsert. It runs fine.

But the amount of data I sometimes need to insert is more than 10,000 records, which causes timeout errors.

The database insert was just an example. This timeout error occurs most often when I need to send 10,000 records to a web service, or when I invoke a service that sends data to another Integration Server overseas.

That’s why I would like to learn how to split the records before sending them.

Hi,

First, you can use pub.list:sizeOfList to get the total count of the list. Then you can iterate with LOOP or REPEAT to break it into smaller lists. From there you can process each chunk further.
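
The chunking logic itself is straightforward; here is a rough plain-Java equivalent of what the LOOP/REPEAT would do (the array type and names are illustrative, not IS API):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ChunkExample {
    // Splits a document list (represented here as an Object[]) into sublists
    // of at most chunkSize elements, e.g. 5,000 records -> five chunks of 1,000.
    public static List<Object[]> split(Object[] documents, int chunkSize) {
        List<Object[]> chunks = new ArrayList<>();
        for (int start = 0; start < documents.length; start += chunkSize) {
            int end = Math.min(start + chunkSize, documents.length);
            chunks.add(Arrays.copyOfRange(documents, start, end));
        }
        return chunks;
    }

    public static void main(String[] args) {
        Object[] docs = new Object[5000];
        List<Object[]> chunks = split(docs, 1000);
        System.out.println(chunks.size() + " chunks"); // prints "5 chunks"
        // Each chunk can then be sent to the database or web service separately.
    }
}
```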

Hi,

when communicating between Integration Servers on different continents, you should consider decoupling them by using a messaging system like UM or Broker.

In Broker there was a feature called Gateway to route data between two Brokers, each connected to its local IS.
I am sure UM supports something similar, but I am not aware of the exact details, as we have not yet migrated to UM.

In such a case the messaging system takes care of the transport, and the IS does not encounter a timeout after handing the message off to the messaging layer.

Regards,
Holger

Hi, I echo Holger here. From a design perspective, the cleaner way to do this is to iterate over your documents, publish them to a JMS destination, and then configure a JMS trigger with Max Batch Messages set to 100, 200, or 500 (your preferred chunk size).
The consuming service then receives its input as a JMSMessage array with 100 or 500 documents in it. You can work on that chunked set of messages: either batch insert, or loop over them and call a WS, etc.
To make this even more efficient, you can configure Processing Mode to “Concurrent” and Max Execution Threads to some reasonable number (say “10”).
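
For what the publishing side boils down to, here is a rough plain-JMS sketch of sending one message per record (in IS you would normally do this with the built-in JMS send service inside a LOOP rather than coding it yourself; the connection factory and queue here are placeholders):

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;

public class PublishPerDocument {
    // Publishes each record as its own JMS message; the trigger's Max Batch Messages
    // setting then controls how many messages the consuming service receives at once.
    public static void publishAll(ConnectionFactory factory, Queue destination, String[] records)
            throws Exception {
        Connection connection = factory.createConnection();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(destination);
            for (String record : records) {
                TextMessage message = session.createTextMessage(record);
                producer.send(message);
            }
        } finally {
            connection.close();
        }
    }
}
```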