Any problems with setting a large Java heap size?

I have an I/O adapter that needs to process a file once per day. The file ranges anywhere from 1MB to 100MB. Once a week it's probably around 50MB, and around holidays (Thanksgiving) it will be closer to the 100MB limit.

For the adapter, I've set the Java heap size to 768m, and it will process a large file. But since the large files occur only about 20% of the time, am I tying up resources by setting the heap size this large? We are running all enterprise server software on Solaris 2.6/2.8.
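One thing worth checking before worrying about waste: on HotSpot JVMs, -Xmx is only a ceiling, and the heap is typically committed as it grows (starting from -Xms), so a 768m cap generally does not pin 768MB of RAM on the small-file days. A quick sketch you can run inside the adapter's JVM to see the cap versus what is actually committed (the class name here is just for illustration):

```java
public class HeapCheck {
    // Converts bytes to whole megabytes for readable output.
    static long mb(long bytes) {
        return bytes / (1024 * 1024);
    }

    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory() reflects the -Xmx ceiling; totalMemory() is
        // what the JVM has actually committed so far.
        System.out.println("max heap (-Xmx cap): " + mb(rt.maxMemory()) + " MB");
        System.out.println("committed so far:    " + mb(rt.totalMemory()) + " MB");
        System.out.println("free within that:    " + mb(rt.freeMemory()) + " MB");
    }
}
```

Launch it with `java -Xmx768m HeapCheck` and compare the two numbers; if "committed so far" stays well below the cap on ordinary days, the large -Xmx is mostly idle headroom rather than consumed memory.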


In the file poll configured operation's Options tab, check the Batch output check box and enter a batch line/hierarchy count, say 10 or 50. This way you can process very large files without getting an out-of-memory error, but processing a 100MB data file will still take a long time.
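The idea behind the Batch output option is that only one batch of records is held in memory at a time instead of the whole file. A minimal sketch of that pattern in plain Java, assuming a line-oriented file and a hypothetical processBatch handler (the real adapter would publish each batch downstream):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class BatchedFileReader {
    // Reads lines in groups of batchSize so at most one batch is in
    // memory at a time; returns the size of each batch processed.
    static List<Integer> readInBatches(BufferedReader in, int batchSize)
            throws IOException {
        List<Integer> batchSizes = new ArrayList<>();
        List<String> batch = new ArrayList<>(batchSize);
        String line;
        while ((line = in.readLine()) != null) {
            batch.add(line);
            if (batch.size() == batchSize) {
                processBatch(batch);
                batchSizes.add(batch.size());
                batch.clear(); // release the batch before reading more
            }
        }
        if (!batch.isEmpty()) { // leftover lines at end of file
            processBatch(batch);
            batchSizes.add(batch.size());
        }
        return batchSizes;
    }

    // Hypothetical per-batch handler; stands in for the adapter's output step.
    static void processBatch(List<String> batch) {
        System.out.println("processed " + batch.size() + " lines");
    }
}
```

With a batch count of 50, a 100MB file still takes many iterations, which is why the processing is slow even though the memory footprint stays small.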

Hope this helps.