Reading a large file as a stream using getFile

Hi Experts,

We are reading a large flat file (5-10 GB) as a stream using the pub.file:getFile service and parsing it with convertToValues using its iterator option.

I have a few questions about the behavior we observed:

a) While convertToValues was parsing, we deleted the big file, yet parsing continued through the last record. I expected that we would not be able to delete the file in the first place, and that even if the deletion succeeded, convertToValues would return an error.
b) We use TN large file handling and observed the same behavior. Is this expected?
c) Since we could delete the file while parsing continued, the file contents must still be available somewhere for the code to process. Where are those contents kept, and how can we control that?

How does this work under the hood, and is there anything else I need to worry about?

Regards
Sunny

Hi Sunny,

You should delete the file only AFTER convertToValues has completed and access to the stream is no longer needed.
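
For illustration, here is a minimal plain-Java sketch of that ordering (the file path is hypothetical); inside Integration Server the same principle applies to the stream returned by pub.file:getFile:

```java
import java.io.BufferedReader;
import java.nio.file.Files;
import java.nio.file.Path;

public class CloseThenDelete {
    public static void main(String[] args) throws Exception {
        Path file = Path.of("/tmp/large-flat-file.dat"); // hypothetical path

        // try-with-resources guarantees the stream is closed before deletion
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            String record;
            while ((record = reader.readLine()) != null) {
                // parse each record here; this loop stands in for the
                // convertToValues iterator loop in your flow service
            }
        }

        // Only now, with every handle released, remove the file
        Files.delete(file);
    }
}
```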

Regards,
Holger

The behavior may depend on the operating system. A deletion can be recorded as pending but not actually carried out until all handles to the file have been released. I believe I have observed this under Windows.
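
A small self-contained Java probe (using a throwaway temp file) makes the platform difference visible; the printed outcome will vary by OS:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class DeleteWhileOpenProbe {
    public static void main(String[] args) throws IOException {
        File file = File.createTempFile("probe", ".dat");
        try (FileOutputStream out = new FileOutputStream(file)) {
            out.write(new byte[] {42, 43, 44});
        }

        try (FileInputStream in = new FileInputStream(file)) {
            // On Unix, unlinking an open file normally succeeds at once;
            // on Windows the open handle usually blocks it, so delete()
            // returns false until the stream is closed.
            System.out.println("delete while open: " + file.delete());

            // Reads from the already-open stream keep working either way
            System.out.println("first byte after delete attempt: " + in.read());
        }
        file.delete(); // clean up in case the first delete was refused
    }
}
```

On Windows the outcome also depends on how the handle was opened: a handle opened with the FILE_SHARE_DELETE flag permits a pending delete, which matches the deferred behavior you describe.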

Hi,

That is the norm on Unix too: deleting the file merely removes its directory entry (the unlink), and only when all handles to the file have been closed are its inode and data blocks returned to the free pool. That is why the parser could keep reading to the last record after the delete.
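
To make that concrete, here is a small Java demonstration (again with a throwaway temp file) of what unlinking an open file looks like on a Unix system:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class UnlinkKeepsData {
    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("unlink-demo", ".dat");
        Files.write(file, "record1\nrecord2\nrecord3\n".getBytes());

        try (InputStream in = Files.newInputStream(file)) {
            Files.delete(file); // removes the directory entry only
            System.out.println("visible in directory: " + Files.exists(file));

            // The inode and data blocks live on until this handle closes,
            // so the whole file still streams back intact
            byte[] contents = in.readAllBytes();
            System.out.println("bytes still readable: " + contents.length);
        }
        // Once the stream above is closed, the kernel frees the inode
        // and returns the blocks to the free pool.
    }
}
```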

Best regards,