Large File Handling

Hello Experts,

I have parsed a large flat file (140,000 records) to EDI format. The first time, it executed successfully, but on the second submission the server appears to crash during the mapping process. I have changed the settings below.

setenv.bat file:

set JAVA_MIN_MEM=1024M
set JAVA_MAX_MEM=4096M
set JAVA_MAX_PERM_SIZE=4096M

Extended settings in IS page:

watt.server.tspace.location=D:\FF
watt.server.tspace.max=52428800

Administration > Integration > B2B Settings > Configure Properties:

tn.xml.xqlThreshold=50000
tn.BigDocThreshold=1000000

Could you please suggest if I am missing anything?

Thanks,
Prem.

Prem – Your configuration looks okay to me. But since you say the issue persists, try running this processing when the server is under lighter load, as a number of other processes are continually consuming memory and other system resources. Also, try increasing the memory settings below and do some testing in Dev or QA, which may resolve your issue.

set JAVA_MIN_MEM=1024M
set JAVA_MAX_MEM=4096M
set JAVA_MAX_PERM_SIZE=4096M
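
For example, the maximum heap could be raised along these lines (8192M is purely illustrative; it assumes a 64-bit JVM and enough physical RAM on the host):

set JAVA_MAX_MEM=8192M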

Thanks,

Tweaking your memory settings addresses a symptom of the problem, not the root cause. It may get you past this particular instance, but if you receive a larger file in the future, or several of these large files at once, you will likely hit the same problem again.

If I had to guess, the root cause is that somewhere between the time you receive the file and the time you map its contents, the entire file is being pulled into memory at once, for example via a convertToValues call. Instead, pull only a chunk of the file into memory at a time by using the iterator option.
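
To illustrate the pattern, here is a minimal Java service sketch (the schema name MyPackage.schemas:myFlatFile is a placeholder, and the parameter names iterator/ffIterator/ffValues should be verified against the Flat File Schema documentation for your IS version). In Flow, the equivalent is a REPEAT around pub.flatFile:convertToValues with iterator set to true, passing ffIterator back in on each pass until it comes back null:

import com.wm.app.b2b.server.Service;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;
import com.wm.lang.ns.NSName;

public final class LargeFlatFileSketch {

    // Parse a large flat file one record at a time so the full document
    // is never materialized in memory.
    public static void processByIterator(Object ffData) throws Exception {
        Object ffIterator = null;
        boolean firstCall = true;

        while (true) {
            IData in = IDataFactory.create();
            IDataCursor ic = in.getCursor();
            if (firstCall) {
                IDataUtil.put(ic, "ffData", ffData);
                IDataUtil.put(ic, "ffSchema", "MyPackage.schemas:myFlatFile"); // placeholder
                IDataUtil.put(ic, "iterator", "true"); // ask for an iterator, not the whole file
                firstCall = false;
            } else {
                IDataUtil.put(ic, "ffIterator", ffIterator); // resume from the previous position
            }
            ic.destroy();

            IData out = Service.doInvoke(NSName.create("pub.flatFile:convertToValues"), in);
            IDataCursor oc = out.getCursor();
            IData ffValues = IDataUtil.getIData(oc, "ffValues"); // one record per pass
            ffIterator = IDataUtil.get(oc, "ffIterator");
            oc.destroy();

            if (ffValues != null) {
                // Map/transform this single record to EDI here, then let it
                // go out of scope so the garbage collector can reclaim it.
            }
            if (ffIterator == null) {
                break; // iterator exhausted: all records have been consumed
            }
        }
    }
}

The key point is that only one record is referenced at any time, so memory use stays flat regardless of file size.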

An additional option you could consider is to configure the splitOption in the TPA so that TN breaks up the file into individual transactions. That way your service doesn’t have to deal with the entire envelope at once.
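
For reference, this is controlled by the splitOption field in the EDITPA (field and values per the default WmEDIforTN EDITPA; double-check against your EDI Module version):

splitOption=Transaction

Valid values are Interchange, Group, and Transaction. With Transaction, TN persists and routes each transaction set as its own document, so your processing rule fires once per transaction instead of once per envelope.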

Percio