Handling Large XML doc created within IS

Hi,
I have seen documentation on handling large XML docs by streaming them into IS and manipulating them with the various built-in services, but what if the large XML doc is created in IS? We will be receiving around 20,000 records (130 columns) from a select statement against our ERP. We were then going to put those into an XML doc using IS. I am concerned about memory issues with this. We will then post to TN using the nodeIterator services, etc., but we would also like to receive the whole document back so that we can deliver it to external customers. First, am I correct to anticipate memory problems in IS, both at XML creation and when we get the doc back from TN? Secondly, is there a more sensible way of doing this?
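For scale, here is my rough back-of-envelope estimate of what the document might weigh (the bytes-per-field numbers are just guesses, the real averages could be quite different):

```java
// Rough sizing only; 20 bytes per value and 30 bytes of tag markup are guesses.
public class SizeEstimate {
    public static void main(String[] args) {
        long records = 20000;       // rows from the select statement
        long columns = 130;         // fields per row
        long bytesPerValue = 20;    // guessed average field size
        long tagOverhead = 30;      // guessed bytes of <tag></tag> markup per field

        long rawData = records * columns * bytesPerValue;                  // ~50 MB of raw values
        long asXml = records * columns * (bytesPerValue + tagOverhead);    // ~125 MB of XML text

        System.out.println("raw values ~" + rawData / (1024 * 1024) + " MB, "
                + "XML text ~" + asXml / (1024 * 1024) + " MB");
        // In-memory string/document representations are typically a few times
        // larger than the serialized size, so hundreds of MB of heap seems plausible.
    }
}
```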
Thank you in advance.

Kirk,

I believe you are on the right path. Use the node iterator services in IS, and also configure the TN large-file handling capability so that processing is fast when TN delivers the document to the external world and the document is not kept in memory but stored on disk (tspace). Even when you retrieve the document from TN using the getContentPartData service, it returns the data in chunks, and TN uses the configured threshold to decide whether a document counts as large or small.
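To illustrate the idea of never holding the whole payload in memory, here is a plain-Java sketch of chunked stream processing (ordinary JDK I/O, not the actual TN API; check the TN built-in services reference for the exact getContentPartData inputs):

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

// Copies a large document chunk by chunk instead of loading it all at once;
// this is the same principle TN large-file handling relies on.
public class ChunkedCopy {
    public static void main(String[] args) throws Exception {
        InputStream in = new FileInputStream("largeDoc.xml");    // e.g. content retrieved as a stream
        OutputStream out = new FileOutputStream("delivered.xml");
        byte[] buffer = new byte[8192];                          // only 8 KB in memory at a time
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);                          // forward the chunk (deliver, transform, etc.)
        }
        in.close();
        out.close();
    }
}
```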

Please search this site using the keyword “largefile handling”; you will find various threads that discuss this process in detail, and they should help a lot.

HTH,
RMG.

RMG,
Thank you very much. I am still wondering whether I will have any memory problems when I first convert the records received from my ERP into an XML doc. Does anyone have any thoughts on this?

Kirk,

What you may have to do is loop through the records and map the records/fields into your XML document structure. When executing the service, just run it, and also try increasing the Developer tool's memory heap: the setting is in bin/developer.bat, so look for the -mx128M option and increase it to 512M or above. Another good option is to schedule the service; and for testing the mapping, convert the XML document structure to an xmldata string and write it to a file (though 20,000 records is not that huge).
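Or, if you end up writing the XML yourself in a Java service rather than building one big string in memory, a streaming writer keeps the footprint flat. A rough plain-JDK sketch using javax.xml.stream (StAX, Java 6+); the element names and the fixed "value" are placeholders, not your actual fields:

```java
import java.io.FileOutputStream;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamWriter;

// Streams records straight to a file as XML instead of assembling the whole
// document in memory first; memory use stays flat regardless of row count.
public class StreamRecordsToXml {
    public static void main(String[] args) throws Exception {
        XMLStreamWriter xml = XMLOutputFactory.newInstance()
                .createXMLStreamWriter(new FileOutputStream("records.xml"), "UTF-8");
        xml.writeStartDocument("UTF-8", "1.0");
        xml.writeStartElement("Records");

        for (int row = 0; row < 20000; row++) {          // 20,000 rows from the select
            xml.writeStartElement("Record");
            for (int col = 0; col < 130; col++) {        // 130 columns per row
                xml.writeStartElement("Field" + col);    // made-up element names
                xml.writeCharacters("value");            // real code would pull the actual field value here
                xml.writeEndElement();
            }
            xml.writeEndElement();
        }

        xml.writeEndElement();
        xml.writeEndDocument();
        xml.close();
    }
}
```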

HTH,
RMG

RMG,
Thank you again for all your help!
Kirk

Glad to help…