Hi Experts,
We are reading a large flat file (5-10 GB) as a stream using the pub.file:getFile service and parsing it with convertToValues in iterator mode.
I have a few questions based on what I observed:
a) While parsing with convertToValues, we deleted the big file, but parsing continued until the last record. I expected that we would not be able to delete the file in the first place, and that even if the delete succeeded, convertToValues would return an error.
b) We used TN large file handling and observed the same behavior. Is this expected?
c) If we were able to delete the file and parsing still continued, the file contents must have been available somewhere for the code to process. Where are those contents kept, and how can we control that?
How does this whole thing work, and is there anything else I need to worry about?
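For reference, here is a minimal standalone Java sketch that reproduces what we saw, outside of Integration Server. It assumes a Linux/Unix host, where deleting a file only unlinks its directory entry; the inode and its data remain readable until the last open handle is closed. (On Windows, the delete would normally fail while the stream is open.) The file name and contents are just placeholders:

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class UnlinkWhileReading {
    public static void main(String[] args) throws IOException {
        // Placeholder for the large flat file
        Path p = Files.createTempFile("flatfile", ".txt");
        Files.write(p, "record1\nrecord2\n".getBytes());

        try (InputStream in = Files.newInputStream(p)) {
            // Delete the file while the stream is still open.
            // On Linux/Unix this removes the directory entry only;
            // the open descriptor keeps the data alive.
            boolean deleted = Files.deleteIfExists(p);
            System.out.println("deleted=" + deleted);

            // Reading still succeeds from the already-open stream
            BufferedReader r = new BufferedReader(new InputStreamReader(in));
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```

If this is indeed what is happening, it would explain why convertToValues kept returning records: the stream opened by pub.file:getFile still points at the (now unlinked) data.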
Regards
Sunny