Error in large document handling of flat files; works fine for a 1 MB file

Hi,

The scenario that currently works for flat files of about 1 MB is:
Pick up the file via file polling, check the user, and route the file to TN.
TN calls the service named in the preprocessing rule, which uses
wm.tn.doc:getContentPartData and pub.flatFiles:convertToValues, validates the data (validation needs the sum of the individual invoice amounts, finally checked against the file trailer), builds the invoice header and details, and then inserts the data into the database.
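For the trailer validation step, the usual trick with large files is to keep a running total while streaming record by record instead of holding everything in memory. A minimal sketch of that idea, assuming a hypothetical pipe-delimited layout with 'D' detail records and a 'T' trailer (not the poster's actual schema):

```python
# Hypothetical sketch: stream a flat file record by record, keeping a
# running count/total to compare against the trailer at the end.
# The 'D'/'T' pipe-delimited layout is assumed purely for illustration.
from decimal import Decimal
from io import StringIO

def validate_invoices(stream):
    total = Decimal("0")
    count = 0
    for line in stream:                       # one record at a time
        fields = line.rstrip("\n").split("|")
        if fields[0] == "D":                  # detail record: accumulate
            total += Decimal(fields[2])
            count += 1
        elif fields[0] == "T":                # trailer: compare totals
            return (count == int(fields[1])
                    and total == Decimal(fields[2]))
    return False                              # no trailer record found

sample = StringIO("D|INV1|100.50\nD|INV2|49.50\nT|2|150.00\n")
print(validate_invoices(sample))             # prints True
```

The same accumulate-then-compare pattern works inside a Flow loop: carry the running total in the pipeline across iterations and check it when the trailer record arrives.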

Right now, large documents (flat files) are failing at wm.tn.doc:getContentPartData.
The error is:
"Error retrieving content of content part ffdata of document 5fvl0m0v0h90ke4a000000h0".
Sometimes I get an NSRuntime error instead.

Can anyone guide me on how to handle large flat file documents (what to use in place of getContentPartData), and how to keep track of the values I am validating until the end of all the records?

The TN properties and Java heap sizes are given below. I tried changing tn.BigDocThreshold and setting JAVA_MIN_MEM = 1536m.

TN Properties:-
tn.BigDocThreshold = 1048576
tn.tspace.location = E:\TNLargeDocTemp
tn.tspace.max = 1342177280

Settings in server.bat:
JAVA_MIN_MEM=512m
JAVA_MAX_MEM=1536m

Hi Experts,

When I try to use ffIterator with pub.flatFiles:convertToValues, the loop repeats but does not give me the next node's values. Can anyone guide me on how to use ffIterator to get the next node of the flat file schema?

Sairao,

Please check this link, it will explain how to use ffIterator:
[url=wmusers.com]wmusers.com[/url]
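In outline, the pattern is: call pub.flatFiles:convertToValues once with iterate set to true, then keep calling it inside a REPEAT, passing the returned ffIterator back in each time, until ffIterator comes back null. A rough Python analogy of that loop (convertToValues is mocked here; this is not the real Flow API, just the control flow):

```python
# Rough analogy of the convertToValues/ffIterator loop.
# In a real flow service you call pub.flatFiles:convertToValues with
# iterate=true and feed the returned ffIterator back in on each pass.
def convert_to_values(data=None, ff_iterator=None):
    """Mock: returns (values, ffIterator); ffIterator is None at end of file."""
    it = ff_iterator if ff_iterator is not None else iter(data)
    try:
        record = next(it)          # hand back one parsed record
        return record, it
    except StopIteration:
        return None, None          # end of file: iterator is dropped

def process_all(records):
    out = []
    # First call: parse with iterate=true, get the first record + iterator.
    values, ff_iterator = convert_to_values(data=records)
    while values is not None:      # REPEAT until ffIterator is null
        out.append(values)         # map/validate/insert one record here
        values, ff_iterator = convert_to_values(ff_iterator=ff_iterator)
    return out

print(process_all(["rec1", "rec2", "rec3"]))   # prints ['rec1', 'rec2', 'rec3']
```

The key point is that each pass maps only the current record, so memory stays flat regardless of file size, and any running totals (e.g. for trailer validation) are carried in the pipeline across iterations.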

HTH,
RMG