Large File Size question

Hi,
I am facing a few problems with large files.
Actually, I do not know whether files in the 1 MB-5 MB range can really be called large files here… but FWIW I am using that term.

I am trying to send files of various sizes from my box (acting as the client, running IS version 4.6) to TN.

Files under 1 MB go through without any problem.

For the 1 MB file I am able to send the file, and TN processes it, but it takes a long time. When I check the HTTP header for the status, I get an NSRuntime error; it looks like a memory issue or something similar. Can you please advise which parameter I need to change?

For files greater than 1 MB but under 5 MB, I encounter a runtime error while trying to do a bytesToString (after I read the whole file using getFile with loadAs set to bytes) or a recordToDocument.

Please advise how I should go about solving this problem.


Use getFile with loadAs set to stream (especially for larger files; you can make use of the bufferSize option as well). Also do the HTTP post as a stream, not as a string, and you will see the difference in processing. 1-5 MB is not a big deal.
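To see why streaming with a bufferSize helps, here is a plain-Java sketch (just an analogue, not the IS getFile service itself; the 32 KB buffer size is an illustrative value): a file of any size can be processed through a fixed-size buffer, so peak memory no longer grows with file size the way it does when you load everything as bytes.

```java
import java.io.*;

// Sketch: processing a payload through a bounded buffer instead of
// loading the whole file into memory at once.
public class StreamCopy {
    // Drains `in` to `out` using a fixed-size buffer; peak memory stays
    // at bufferSize regardless of how large the input is.
    public static long copy(InputStream in, OutputStream out, int bufferSize)
            throws IOException {
        byte[] buffer = new byte[bufferSize];
        long total = 0;
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = new byte[5 * 1024 * 1024]; // simulate a 5 MB file
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(payload), sink, 32 * 1024);
        System.out.println("copied " + copied + " bytes");
    }
}
```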


Thank you very much.

Another newbie question.
My current flow is:
getFile (as bytes) -> bytesToString -> map to my record -> recordToDocument -> HTTP to TN.

If I move from bytes/string to stream-based processing, do other things need to change on the client side and the TN side? I have never dealt with streams before.


Make sure you increase the maximum Java heap size in the server.bat file (or equivalent startup script) you use to launch Integration Server.
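After bumping the heap, a quick sanity check is to log the JVM's configured maximum. This is standard Java (`Runtime.maxMemory()`), not an IS-specific service; running something like it inside a Java service would confirm the new -Xmx value actually took effect:

```java
// Report the maximum heap the running JVM is allowed to use,
// to verify the startup-script change was picked up.
public class HeapCheck {
    public static long maxHeapMB() {
        return Runtime.getRuntime().maxMemory() / (1024 * 1024);
    }

    public static void main(String[] args) {
        System.out.println("Max heap: " + maxHeapMB() + " MB");
    }
}
```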

If the file that you are accessing is XML, the streaming support in the IS XML parser is very efficient. If your file sizes may grow much bigger than 5 MB, or you may have lots of these files being processed at the same time (say, end-of-quarter order processing), it may be worth getting comfortable with the streaming XML processing services in pub.web (NodeIterator, et al). See the WmSamples sample sample.complexMapping.largeDoc.
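For a feel of what streaming XML parsing buys you, here is a sketch using standard Java StAX (an analogue only; this is not the pub.web NodeIterator API): elements are visited one at a time as events, instead of materializing the whole document tree in memory, which is why it scales to large files.

```java
import javax.xml.stream.*;
import java.io.StringReader;

// Streaming-parse illustration: count matching elements without ever
// building the full document tree.
public class StreamingParse {
    public static int countElements(String xml, String localName)
            throws XMLStreamException {
        XMLStreamReader reader = XMLInputFactory.newInstance()
                .createXMLStreamReader(new StringReader(xml));
        int count = 0;
        while (reader.hasNext()) {
            // Each call advances to the next parse event; memory use is
            // bounded no matter how long the document is.
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && reader.getLocalName().equals(localName)) {
                count++;
            }
        }
        reader.close();
        return count;
    }

    public static void main(String[] args) throws XMLStreamException {
        String xml = "<orders><order id='1'/><order id='2'/><order id='3'/></orders>";
        System.out.println(countElements(xml, "order")); // prints 3
    }
}
```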

Also, if you are calling TN on the same IS, you do not have to do an HTTP post. You can call the TN receive service directly, which simply passes along the pipeline containing the reference to an XML node, rather than having the data loop through the network layers, which is what happens when you do a pub.client:http POST of an XML string into TN.


As Fred mentioned above, if your TN server is on the same IS, then use the routeXml or tn:receive service: convert the string to a node using stringToDocument and map that node to the service input.

For handling streams, just include one more step in your flow:

getFile (as stream) -> streamToBytes -> bytesToString -> stringToDocument -> documentToRecord -> map to my record -> recordToDocument -> stringToDocument -> routeXml to TN.
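In plain Java terms (an analogue only, not the IS built-in services), the streamToBytes step amounts to draining the InputStream into a byte array, after which the usual bytes-to-string conversion applies; the 8 KB buffer size here is an illustrative choice:

```java
import java.io.*;

// Rough analogue of the streamToBytes step: drain an InputStream into a
// byte array so it can be converted to a string and then to a document.
public class StreamToBytes {
    public static byte[] toBytes(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[8192];
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] bytes = toBytes(new ByteArrayInputStream("<doc/>".getBytes("UTF-8")));
        String s = new String(bytes, "UTF-8"); // the bytesToString step
        System.out.println(s); // prints <doc/>
    }
}
```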


Thank you very much Fred and RMG.

TN is not on the same IS. In the production environment it will actually be in a geographically different location.

It is highly unlikely that the file size will grow. The file is produced by one of our C++ programs, which generates files in the 1-5 MB range as desired by the user.

I will try the advised solutions and get back with results.
Thanks again for helping me out.


Got it. Let us know how it goes.

PS: Using streams is the better solution for processing large files.


Thanks for your help.
I have successfully completed the test.
I made the following changes, all of which were suggested by RMG and Fred. Thank you very much.
Increased the Java maximum heap to 512 MB.
Read the file as a stream and converted it to bytes before converting it to a document.

In addition, I increased the net.timeout parameter to 5 minutes.

I have a question, though…

I am using the $Xml variable to post the data over HTTP…
Is this the right way to HTTP the data?

One thing I noticed is that the IS on the partner side is still in the HTTP call while on TN the data has already been completely processed…

Any ideas as to why this would be happening?

Thanks for all your help.


The global variable is $xmldata, not $Xml. Also set the headers/Content-Type value to text/xml before doing the HTTP post.
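To illustrate the Content-Type point outside of Flow (this is plain java.net code, not pub.client:http; the URL path and the throwaway local test server exist only to make the example self-contained), the header is set on the request before the XML body is written:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.*;
import java.net.*;

// Sketch of posting an XML string with Content-Type: text/xml set up front.
public class XmlPost {
    public static int postXml(String url, String xml) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml"); // before the body
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(xml.getBytes("UTF-8"));
        }
        return conn.getResponseCode();
    }

    public static void main(String[] args) throws IOException {
        // Throwaway local receiver that accepts the post only when the
        // Content-Type header is text/xml.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/receive", exchange -> {
            String ct = exchange.getRequestHeaders().getFirst("Content-Type");
            exchange.sendResponseHeaders("text/xml".equals(ct) ? 200 : 400, -1);
            exchange.close();
        });
        server.start();
        int status = postXml(
                "http://localhost:" + server.getAddress().getPort() + "/receive",
                "<order id=\"1\"/>");
        server.stop(0);
        System.out.println("HTTP status: " + status); // prints HTTP status: 200
    }
}
```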


I am sorry. The variable I used is $xmldata, and I have set the header Content-Type to text/xml. I just did not write it correctly here. Sorry again.


Great, take it easy.

Thanks a lot.