Large data extract

Hi,

I am performing a call to Baan from webMethods that returns a large volume of records in a Values object, and I am running into out-of-memory problems. Is there another way to handle such a large volume in webMethods? Please help me with a solution, as this is holding up my project.

Thanks & Regards,
Srivats

Srivats,

WM/IS can handle large documents using the NodeIterator/getNextNode services. Please go through the service documentation; it will help you move forward.

Search this site for the keyword “NodeIterator” and you will see a lot of results, since this problem has been covered in existing threads.
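To show the idea behind NodeIterator outside of webMethods: the point is to walk a large XML document one node at a time and discard each node after processing, so the whole tree never sits in memory. Here is a minimal sketch of that streaming principle using Python's standard-library `iterparse` as a stand-in for the webMethods services (the `<record>` element name is invented for the example):

```python
# Streaming-parse a large XML source element by element; each node is
# handed to the caller and then cleared, so memory use stays flat no
# matter how big the document is.
import io
import xml.etree.ElementTree as ET

def stream_records(xml_source):
    """Yield the text of each <record> element without building the full tree."""
    for event, elem in ET.iterparse(xml_source, events=("end",)):
        if elem.tag == "record":
            yield elem.text
            elem.clear()  # release the element's contents immediately

# Simulate a large document with an in-memory byte stream.
big_doc = io.BytesIO(
    b"<records>"
    + b"".join(b"<record>row-%d</record>" % i for i in range(5))
    + b"</records>"
)
print(list(stream_records(big_doc)))  # ['row-0', 'row-1', 'row-2', 'row-3', 'row-4']
```

The webMethods NodeIterator services work the same way at the flow level: you request the next node, process it, and let the previous one be garbage-collected.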

Goodluck,

Also review the LargeDoc configuration in the wmEDI and wmTN packages, as these are more specific to non-XML data. Search this site for the keyword “LargeDoc”.

We’ve done a test doing something similar, reading a couple hundred thousand records out of SAP and then processing the data. Yes, it’s doable, but it’s not at all straightforward. Remember that since everything is read into memory first (I assume it’s the same for the Baan adapter) and then processed, you probably need 2-3x the memory of your expected result set just to be able to do anything with it at all.

To complicate things a bit, different JVMs on different platforms have different limits on how much memory you can assign to Java; commonly around 1.8GB for 32-bit JVMs. You might have to move to a 64-bit JVM for more memory headroom, but 64-bit JVMs are not available for all platforms.
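For reference, the heap ceiling is set with the standard JVM flags. Where exactly you put these for Integration Server depends on your version and platform (typically the server startup script), so treat this as an illustrative config fragment; `MyServerMain` is a placeholder, not a real class:

```shell
# Illustrative only: raise the initial and maximum Java heap.
# On a 32-bit JVM you typically cannot go much beyond ~1.5-1.8GB.
java -Xms512m -Xmx1536m MyServerMain
```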

Hi,

I cannot use NodeIterator because this is not a large document/file on disk. It is like a select statement that returns a large volume of records.

Thanks & Regards,
Srivats

Srivats,

I am not familiar with Baan at all, but since you mention this is like a select statement the following might help.
In SQL you can limit the number of rows a query returns and specify an offset.
Example:
“SELECT * FROM mytable LIMIT 25,50” skips the first 25 rows and returns the next 50, i.e. rows 26-75 (in MySQL the syntax is LIMIT offset, row_count).

Maybe you can cap memory usage by limiting the maximum result size, and then retrieve all the results by increasing the offset each time you have handled the previous page?
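A minimal sketch of that paging loop, using Python's standard-library `sqlite3` as a stand-in for whatever database the Baan BOI queries (the `mytable` table and its columns are made up for the example):

```python
# Fetch a large result set in fixed-size pages with LIMIT/OFFSET, so only
# one page is in memory at a time.
import sqlite3

def fetch_in_pages(conn, page_size=100):
    offset = 0
    while True:
        rows = conn.execute(
            # ORDER BY a stable key so pages don't overlap or skip rows.
            "SELECT id, payload FROM mytable ORDER BY id LIMIT ? OFFSET ?",
            (page_size, offset),
        ).fetchall()
        if not rows:
            break
        yield rows            # hand one page to the caller, then drop it
        offset += page_size

# Build a small throwaway table and walk it in pages of 100.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytable (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO mytable (payload) VALUES (?)",
    [("row-%d" % i,) for i in range(250)],
)
total = sum(len(page) for page in fetch_in_pages(conn, page_size=100))
print(total)  # 250 rows, seen as pages of 100 + 100 + 50
```

The caller only ever holds one page of rows, so peak memory is bounded by the page size rather than the full result set.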

Again, I have no idea whether such a mechanism exists in Baan.

Regards,

Koen

Hi Koen,

The Baan BOIs that I call run select statements internally and simply return the large data set in a Values object. Unfortunately, their BOI calls return large volumes.

Thanks & Regards,
Srivats

Srivats,

How many times a day is the “large result” BOI being called?

Is it critical that the data move to the target within a few seconds of when the request is made?

This situation is a struggle because you may be on the boundary between the EAI and ETL architectural patterns.

The EAI “sweet spot” is small packages of heterogeneous messages delivered in near-real time.

The ETL “sweet spot” is large packages of homogeneous data delivered on a schedule.

webMethods fits the EAI pattern very well. Tools like Informatica are much better at handling the ETL pattern.

As is usually the case with any tool, you can force webMethods to handle a problem outside of its “sweet spot”. If you must do that here, note that webMethods “large document” handling is based on the assumption that you read data from a file, so that you don’t overwhelm the server’s memory.

For very large amounts of data, this means you would need to come up with some kind of intermediary to persist the data to a file. Beware! Moving a lot of huge “messages” through an EAI environment is asking for trouble.