We are trying to retrieve data from an XML file stored on our file server through a File Polling port.
The service that the port invokes reads the stream, converts it to bytes and then to a string, and writes a copy of the data to an archive folder using the ‘PutFile’ service. It then converts the string into a document using the ‘xmlStringToXMLNode’ and ‘xmlNodeToDocument’ services, after which it processes the XML.
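In plain Java terms, the stream-to-bytes-to-string part of that service is roughly equivalent to the sketch below (this is only an illustrative analogue, not the actual Flow service; the class and method names are made up). Note the comment about memory, which is relevant to the problem described later:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class StreamToString {
    // Read an input stream fully into a byte array, then decode it to a string.
    static String readStreamAsString(InputStream in) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            buf.write(chunk, 0, n);
        }
        // This holds the entire file in memory at once, which is why
        // large files can push the JVM toward OutOfMemoryError downstream.
        return new String(buf.toByteArray(), StandardCharsets.UTF_8);
    }
}
```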
This works fine when the file size is small (around 2 MB), but with large files (24 MB) the process fails.
Is there any size limitation on the files for File Polling Port?
Can someone help me out with this?
How does the process fail? What are the error messages?
What is the error you are getting in the logs?
And if you are using a file polling port to poll .xml files, did you set content-type=text/xml on the file polling configuration page? That way you don't need the stream-to-bytes and bytes-to-string steps; the node object will appear directly in the pipeline, and you can use xmlNodeToDocument and continue with your requirement.
Actually, the XML file contains a number of credit reports, which we loop over and store one by one into the DB.
When the size is small the files go through, but at around 11 MB and above the process fails after storing the file in the archive folder: none of the reports are stored in the DB, and the files stay in the work folder of the File Polling configuration.
Do you have any idea on this?
There is no issue with the configuration of the file polling folder, I guess. We are in fact using xml/plain as the content type, and everything works for smaller files (tested up to 10 MB). But for large files (11 MB and above) the same doesn't work, and nothing is logged in my server log or error log.
However, the same thing on my teammate's machine throws an OutOfMemoryError.
We need to handle files up to 50 MB, so I hope there is no size limitation as such.
Here are the server log entries for my server:
pub.xml:xmlNodeToDocument java.lang.reflect.InvocationTargetException: OutOfMemoryError
com.wm.lang.flow.FlowException is coming in the root service's catch block.
To get rid of OutOfMemory exceptions and handle large files in IS, you could use the NodeIterator functionality (the getNextNode service) to process the XML file node by node. That way IS does not keep the whole document in memory and processes one node at a time, which matters especially since you said you loop over the data and insert it into the DB.
Please see the documentation for these services and search this forum on them; you will find various posts discussing this kind of problem.
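The NodeIterator approach has a direct analogue in plain Java: streaming the XML with StAX so that only one element is materialized at a time instead of the whole tree. The sketch below is illustrative only (the element name "CreditReport" is an assumption based on the description above; it is not the IS service itself):

```java
import java.io.InputStream;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;

public class ReportStreamer {
    // Process each <CreditReport> element without loading the whole
    // document into memory, similar in spirit to NodeIterator/getNextNode.
    static int processReports(InputStream in) throws XMLStreamException {
        XMLStreamReader reader =
                XMLInputFactory.newInstance().createXMLStreamReader(in);
        int count = 0;
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && "CreditReport".equals(reader.getLocalName())) {
                count++; // here you would map the element and insert into the DB
            }
        }
        reader.close();
        return count;
    }
}
```

Because the reader only ever holds the current event, peak memory stays roughly constant regardless of file size, which is the property the NodeIterator services give you inside IS.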
I tried NodeIterator and getNextNode, but even that doesn't seem to work. We no longer get the OutOfMemory error, but the file just sits in the work directory under the file polling root directory.
Smaller files, though, work fine as usual.
Could you have a look at the HTML view of the code we have used and see if there is anything we are missing here?
I could finally make the whole thing run; thanks for the input on NodeIterator.
There was also a problem in the xmlStringToXMLNode step, where we were passing the string as input to get the node and the process was failing.
Instead, I passed the byte array in the $filedata field and everything fell into place.
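For what it's worth, a plain-Java illustration of why passing the raw bytes can help: decoding a byte array into a String materializes a second full-size copy of the payload as char data (2 bytes per char), so skipping the string step avoids that extra copy at peak. The class and method below are made up for illustration:

```java
import java.nio.charset.StandardCharsets;

public class ExtraCopy {
    // Decoding a byte[] into a String creates a second copy of the
    // payload as a char[] (2 bytes per char), on top of the original bytes.
    static long decodedCharBytes(byte[] data) {
        String s = new String(data, StandardCharsets.UTF_8);
        return s.length() * 2L; // approximate heap cost of the char data
    }
}
```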
Thanks a lot,
Sorry for the delayed response, somewhat busy yesterday…
Anyway, glad to know it is working…