I am converting a large IS document, which consists of many child documents and document lists, to a string using the pub.xml:documentToXMLString service, and then writing that string to a file.
This works fine with smaller documents, but while processing a large IS document it throws an out-of-memory error.
Most of the large-document processing support, both in TN and outside it, is for inbound documents. Even TN has to hold the whole document in memory at some point when sending an outbound document.
The only thing I can think of off the top of my head (with only one cup of coffee in me this morning) is to break the document into sections by running documentToXMLString against individual complex nodes within the document (one at a time), append each resulting string to a file, and create the file in small chunks that way.
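To make that concrete, here is a minimal plain-Java sketch of the idea (not the IS flow API): each section is converted to a string separately and appended to the output file, so only one section's string is in memory at a time. The section strings here are stand-ins for the per-node documentToXMLString results.

```java
import java.io.IOException;
import java.io.Writer;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.List;

public class ChunkedXmlWriter {
    // Append each already-converted section string to the target file.
    // In a real flow you would produce and write one section at a time
    // instead of collecting them all in a list first.
    public static void writeSections(Path target, List<String> sections) throws IOException {
        try (Writer out = Files.newBufferedWriter(target, StandardCharsets.UTF_8,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND)) {
            for (String section : sections) {
                out.write(section);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path target = Files.createTempFile("large-doc", ".xml");
        writeSections(target, List.of("<root>", "<child>a</child>", "</root>"));
        System.out.println(Files.readString(target)); // prints <root><child>a</child></root>
    }
}
```

Note that the file is opened in append mode, so repeated calls (one per section) keep extending the same file without ever assembling the full XML string in memory.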
Various threads have discussed the IS large-file-handling capabilities… it's worth checking those threads, or searching this forum for the keyword "node iterator".
As said above, you can handle large documents using a stream or a node iterator. The document-to-string conversion fails because the document exceeds the maximum string size.
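The node-iterator approach on IS uses the proprietary pub.xml services, so as an illustration of the same technique here is a sketch using standard Java StAX: the parser streams over the XML one event at a time instead of materializing the whole tree, which is what lets it cope with documents too large to hold in memory. The `<record>` element name is just an assumption for the example.

```java
import java.io.Reader;
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;

public class NodeIteratorSketch {
    // Stream over the XML and count <record> elements without building
    // the whole document in memory (analogous to an IS node iterator,
    // which hands you one node at a time).
    public static int countRecords(Reader xml) throws XMLStreamException {
        XMLInputFactory factory = XMLInputFactory.newFactory();
        XMLStreamReader reader = factory.createXMLStreamReader(xml);
        int count = 0;
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && "record".equals(reader.getLocalName())) {
                count++;
            }
        }
        reader.close();
        return count;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<root><record/><record/><record/></root>";
        System.out.println(countRecords(new StringReader(xml))); // prints 3
    }
}
```

In a real service you would process each node as you reach it (e.g. convert it and append it to the output file) rather than just counting, so memory use stays bounded by the size of a single node.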