While processing a 250 MB file, pub.flatfile:convertToString is consuming all available memory and causing a server crash. Is there a way to avoid that? Is there an alternative that avoids loading the whole string into memory at once?
pub.flatfile:convertToString causing server crash while handling a large file due to memory utilization
For convertToValues, one can use iterate = ‘true’ and the ffIterator inputs to process one record at a time (we’ve written a helper that can load X records at a time for convenience).
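To make that concrete, here is a minimal Java-service-style sketch of the iterate pattern, assuming the documented pub.flatfile:convertToValues parameters (ffData, ffSchema, iterate, ffIterator); the schema name, file path, and per-record handler are placeholders, not part of any product API:

```java
// Hypothetical IS Java service sketch: parse a large flat file one record
// at a time via pub.flatfile:convertToValues with iterate=true.
import com.wm.app.b2b.server.Service;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;
import com.wm.lang.ns.NSName;
import java.io.FileInputStream;
import java.io.InputStream;

public final class FlatFileStreaming {

    public static void processLargeFile(String path) throws Exception {
        try (InputStream ffData = new FileInputStream(path)) {
            Object ffIterator = null;
            while (true) {
                IData in = IDataFactory.create();
                IDataCursor c = in.getCursor();
                if (ffIterator == null) {
                    // First call: hand over the stream and request iteration.
                    IDataUtil.put(c, "ffData", ffData);
                    IDataUtil.put(c, "ffSchema", "MyPkg.schemas:orders"); // placeholder
                    IDataUtil.put(c, "iterate", "true");
                } else {
                    // Subsequent calls: pass the iterator back to get the next record.
                    IDataUtil.put(c, "ffIterator", ffIterator);
                }
                c.destroy();

                IData out = Service.doInvoke(
                        NSName.create("pub.flatfile:convertToValues"), in);
                IDataCursor oc = out.getCursor();
                IData ffValues = IDataUtil.getIData(oc, "ffValues");
                ffIterator = IDataUtil.get(oc, "ffIterator");
                oc.destroy();

                if (ffValues != null) {
                    handleRecord(ffValues); // per-record logic; keep no global list!
                }
                if (ffIterator == null) {
                    break; // per the Flat File docs, ffIterator is null once all input is parsed
                }
            }
        }
    }

    private static void handleRecord(IData record) {
        // placeholder: map/transform/route one record, then let it go out of scope
    }
}
```

The important part is that each ffValues document goes out of scope after handleRecord returns, so the JVM never holds more than one record's worth of parsed data at a time.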
But for convertToString, this implies you already have a large IS document loaded in memory. Perhaps that can be adjusted as well. What is the source of that data?
If you specify an outputFilename, the data will be written there instead of to memory. If that data needs to be sent somewhere other than as a file, you can open that file as a stream and use that stream to send it via HTTP or another transport.
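For the second half of that (sending the written file onward as a stream), here is a plain-JDK sketch; the chunk size and POST semantics are arbitrary choices, and inside IS you might instead pass the open FileInputStream to pub.client:http as the stream input:

```java
// Sketch: after convertToString has written its output to outputFilename,
// open that file as a stream and send it without buffering the whole payload.
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public final class SendFileAsStream {

    public static void post(String filePath, String targetUrl) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(targetUrl).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setChunkedStreamingMode(64 * 1024); // stream in 64 KB chunks, no full buffering

        try (InputStream in = new FileInputStream(filePath);
             OutputStream out = conn.getOutputStream()) {
            byte[] buf = new byte[64 * 1024];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n); // copy chunk by chunk; memory use stays flat
            }
        }
        int status = conn.getResponseCode();
        if (status >= 400) {
            throw new IllegalStateException("HTTP post failed with status " + status);
        }
    }
}
```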
You can also use convertToString in a loop to convert one record at a time (see the sketch below). You might need to adjust the FF schema slightly. Loop over the record list of the IS document; for each record, convert it to a string and write that string wherever it needs to go, just don't put it all into some variable/structure that retains everything in memory.
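A sketch of what that loop might look like as a Java service, again assuming the documented convertToString inputs (ffValues, ffSchema) and its string output; recList, the output path, and the schema name are placeholders for your own structure:

```java
// Sketch of the record-at-a-time variant: call pub.flatfile:convertToString
// for each record and append the result straight to a file writer, so nothing
// accumulates in memory.
import com.wm.app.b2b.server.Service;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;
import com.wm.lang.ns.NSName;
import java.io.BufferedWriter;
import java.io.FileWriter;

public final class RecordAtATime {

    public static void writeRecords(IData[] recList, String outPath) throws Exception {
        try (BufferedWriter w = new BufferedWriter(new FileWriter(outPath))) {
            for (IData record : recList) {
                IData in = IDataFactory.create();
                IDataCursor c = in.getCursor();
                // Wrap the single record the way the (possibly adjusted) FF schema expects.
                IDataUtil.put(c, "ffValues", record);
                IDataUtil.put(c, "ffSchema", "MyPkg.schemas:orderRecord"); // placeholder
                c.destroy();

                IData out = Service.doInvoke(
                        NSName.create("pub.flatfile:convertToString"), in);
                IDataCursor oc = out.getCursor();
                String s = IDataUtil.getString(oc, "string");
                oc.destroy();

                w.write(s); // write and forget: never collect the strings in a list
            }
        }
    }
}
```

The key point is the write-and-forget at the end of the loop: each converted string is flushed to the writer and never appended to a growing buffer.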
The overall challenge here is simply to avoid holding the entire data set in memory. As @reamon mentioned, the iterator for FF services is your friend.
When converting your logic to this streaming approach, you will likely need a few iterations before all "load-everything-into-memory" points are removed, because this is a case of the chain being only as strong as its weakest link. In other words: if 99% of your logic uses a streaming approach, a single invocation of the wrong service will still ruin everything by trying to load the entire data set into memory.