I am getting a flat file from an FTP server and converting it into a document using the pub.flatFile:convertToValues service. This works well for small files up to 2 MB. But when I get a large file of 50 MB, the service executes for 2 hours and finally throws an OutOfMemory exception. I have implemented large file handling for convertToValues, i.e. I am parsing records one by one. I have around 3 lakh (300,000) records in the file.
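For context, here is a minimal Java sketch of the principle that record-by-record (iterate) parsing is meant to follow: only one record is ever held in memory, so heap usage stays flat regardless of file size. The class and method names are illustrative, not the webMethods API.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;

// Hypothetical illustration of streaming one record at a time instead of
// loading the whole flat file into memory, which is the idea behind
// pub.flatFile:convertToValues with iterate=true.
public class StreamingRecordProcessor {

    // Reads records (here simplified to lines) one by one. Each record is
    // processed and then discarded, never accumulated in a collection.
    public static long processRecords(Reader source) throws IOException {
        long count = 0;
        try (BufferedReader reader = new BufferedReader(source)) {
            String record;
            while ((record = reader.readLine()) != null) {
                handleRecord(record); // process and drop; do not keep a reference
                count++;
            }
        }
        return count;
    }

    private static void handleRecord(String record) {
        // placeholder for the real per-record mapping/transformation logic
    }
}
```

If instead every parsed record is kept reachable (for example, appended to a list or left in the pipeline), the heap grows linearly with the file and a 50 MB file with 300,000 records can easily exhaust a 2 GB heap once document-object overhead is counted.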
On the Statistics page, IS memory gradually grows from 20% to 100% during the 2 hours of execution time, and it comes down after the out-of-memory error.
Also, my CPU utilization goes to 100% in the first few minutes of execution, and it remains above 100% for the rest of the time.
When I enabled verbose GC, I found that GC is running almost every second, and every GC run is for the young generation. Is this normal behaviour?
And this GC activity starts as soon as I start my IS server.
I have MIN and MAX memory set to 2048 MB.
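As a sanity check that the configured 2048 MB actually took effect, the standard `java.lang.Runtime` API can report the heap limits the JVM is really running with (a small diagnostic sketch, not webMethods-specific):

```java
// Reports the heap sizes the running JVM actually has, so you can verify
// that settings such as -Xms2048m / -Xmx2048m were applied.
public class HeapReport {

    // Maximum heap the JVM will attempt to use, in megabytes.
    public static long maxHeapMB() {
        return Runtime.getRuntime().maxMemory() / (1024 * 1024);
    }

    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("Max heap (MB):   " + maxHeapMB());
        System.out.println("Total heap (MB): " + rt.totalMemory() / (1024 * 1024));
        System.out.println("Free heap (MB):  " + rt.freeMemory() / (1024 * 1024));
    }
}
```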
Any comments on this are highly appreciated!
I would focus less on what GC is doing, as it is very unlikely to be the cause of your problem, assuming you are running a current HP JVM.
I would focus instead on ensuring that you are consistently dropping unused documents and variables, so that only the minimum number of copies is held in memory. You might also have someone look over your large file handling logic to ensure you are using it correctly.
Thanks for your reply.
I have dropped all the unused variables; I will also have someone review my code and will get back to you.
But I was still wondering: is it OK if GC runs every other second?
The JVM runs GC when it needs to do so. Lots of posts here that point to articles on how GC works, but you’ll want to read HP’s docs on how (or if) to tune their JVM.
Be happy that the GC routine runs as often as you see it. In the older JVMs, it was necessary to force a garbage collection using a java service and a scheduler. You will see many argue over whether a service should call GC, however, it has always worked for me.
With regards to your out of memory exception. This is because you have not set up the large file handling properly. Dig around on this site and look through the documentation carefully, and you will find the answer that you need. It may involve some preprocessing of the file or some kind of file handler that you create for your system.
Happy New Year
Are you doing large file handling when Reading the input file, but still creating the whole output in memory? Just wondering.
I have set the iterate parameter of the convertToValues service to true.
That's all I have done, and then I am checking the ffIterator value inside a REPEAT step. Is there any specific parameter in the convertToValues service, or anywhere else, that I need to change to configure large flat file handling?
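For comparison, the REPEAT-over-ffIterator pattern can be modelled in plain Java as a loop that pulls exactly one record per pass and lets it become unreachable before the next pass. This is a hypothetical analogy using `java.util.Iterator`, not the webMethods API; the comments note where the Flow equivalents sit.

```java
import java.util.Iterator;

// Hypothetical model of the Flow REPEAT step that loops over ffIterator:
// one record per pass, processed and then released.
public class IteratorLoopSketch {

    public static int consume(Iterator<String> ffIterator) {
        int processed = 0;
        while (ffIterator.hasNext()) {           // Flow: REPEAT until ffIterator is null
            String record = ffIterator.next();   // Flow: convertToValues returns the next record
            process(record);                     // Flow: MAP the record to the target document
            processed++;
            // Flow: drop ffValues here so the pipeline keeps no copy of the record.
            // If the mapped output is appended to one growing document list instead,
            // memory grows with every pass and the streaming gains are lost.
        }
        return processed;
    }

    private static void process(String record) {
        // placeholder for per-record mapping
    }
}
```

A common cause of out-of-memory errors even with iterate=true is exactly the case flagged in the loop comment: the input is streamed, but the whole output document is still built up in memory.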
I am attaching my sample code for large flat file handling.
Please let me know if there is any hole in there that may be causing the problem.
largeFlatFileTest.zip (8.9 KB)