Process ContentStream on-the-fly issues

I’ve created an FTP port on the Integration Server that points to a package I have created.
The package takes the contentStream and sends it to a process that needs to consume the stream as it comes in via the FTP port. I currently use the following service to read n bytes at a time from the stream so they can be processed:

- pub.io:read

Once I’ve read the n bytes, I do some processing on them and return to read another n bytes from the stream. This works only for part of the file: the process fails to read and process bytes after a certain amount, which I haven’t been able to pin down yet. I was wondering if anyone has ideas on how to get this working so that I don’t have to read the entire stream into memory before processing.

I’ve tried casting the content stream to a BufferedInputStream, etc., with no luck.

_brett.

So, is the read under a Repeat step? Are you also using pub.io:close after the read, or at the end of the Repeat over the n bytes?

HTH,
RMg

Yes, the pub.io:read is nested under a Repeat step.

I have a branch so that if pub.io:read returns -1, I’ve hit the end of the file or stream and exit the Repeat loop.
Then pub.io:close is called.
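For reference, here is what that loop logic looks like in plain Java. This is just a sketch to illustrate the pattern, not the pub.io implementation; the class and method names are made up. The key detail is that read() may return fewer than n bytes per call (a "short read") even when the stream isn't finished, so only -1 means end of stream, and you must process only the count actually returned.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ChunkReader {
    // Mirrors the repeat-until-minus-1 loop described above:
    // read up to n bytes per pass, process them, stop on -1, then close.
    public static int consume(InputStream in, int n) throws IOException {
        byte[] buf = new byte[n];
        int total = 0;
        int count;
        while ((count = in.read(buf, 0, n)) != -1) {
            // process exactly 'count' bytes here, not 'n' --
            // count can be anywhere from 0 to n on any given pass
            total += count;
        }
        in.close(); // equivalent of pub.io:close after the loop
        return total;
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream(new byte[10000]);
        System.out.println(consume(in, 1024)); // prints 10000
    }
}
```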

So after that part of the file, does it return -1 or null?

Do you see any info in the logs during that time?

HTH,
RMG

Know that in Java it is very common for a stream to provide only a limited set of bytes per read; you have to keep reading to get the next part. And if you use ‘streamToBytes’ and ‘bytesToString’ anywhere, it is very possible your stream gets closed underneath you. Try using a Java service that takes the stream as input and writes to a file directly.
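A minimal sketch of what such a Java service body might do, assuming the stream arrives as a pipeline input (the class and method names here are illustrative, not an Integration Server API): drain the stream straight to a file in one call, so nothing between chunks can close it and the whole payload never sits in memory.

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamToFile {
    // Copies the stream to a file in 8 KB chunks inside a single call.
    // Returns the total number of bytes written.
    public static long copy(InputStream in, File out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        try (OutputStream os = new FileOutputStream(out)) {
            int count;
            while ((count = in.read(buf)) != -1) {
                os.write(buf, 0, count); // write only the bytes actually read
                total += count;
            }
        }
        return total;
    }
}
```

Since the entire copy happens inside one service invocation, no other flow step gets a chance to close or consume the stream mid-transfer.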

Chris