Mapping large XML files: XSLT or nodeIterator?


We will map large XML files into other large XML files. The files we handle can be up to 50 MB, and are likely to grow in the future. We are thinking of doing this either by using the nodeIterator approach in a flow service, or by creating XSLs and running the mapping with XSLT.

Has anyone run XSLT on files this large? How does XSLT performance compare to nodeIterator? My primary concern is memory usage, since we are running out of memory more and more often these days…

Appreciate any input on this and otherwise using XSLT,



We do a lot of XSLT mapping and have also handled large EDI and XML files, with very few performance issues, but I am sure XSL mapping will be much faster than doing the entire mapping inside webMethods flows.

Of course, NodeIterator does the job for handling large files in IS; I have seen files above 30 MB run fine.
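The idea behind the nodeIterator approach is to walk the document one node at a time instead of loading it all into memory. Outside of IS, the same streaming principle can be sketched with the JDK's StAX parser; the `<order>` element name here is made up for illustration, and the real IS services (e.g. `pub.xml:getXMLNodeIterator`) work on a parsed node rather than a raw stream:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;

public class IterateOrders {

    // Visits <order> elements one at a time without building a full DOM,
    // analogous in spirit to iterating nodes with IS's nodeIterator.
    public static int countOrders(String xml) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        XMLStreamReader reader = factory.createXMLStreamReader(new StringReader(xml));
        int count = 0;
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && "order".equals(reader.getLocalName())) {
                count++; // each <order> would be mapped and then discarded here
            }
        }
        reader.close();
        return count;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<orders><order id=\"1\"/><order id=\"2\"/><order id=\"3\"/></orders>";
        System.out.println(countOrders(xml)); // prints 3
    }
}
```

Because each node is processed and released before the next one is read, memory usage stays roughly constant regardless of file size — which is exactly why this pattern copes with 30 MB+ files.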



Thanks RMG,

Is it possible to say anything in general about memory usage for the XSLT mapping vs. nodeIterator approach? Which do you think is more efficient?

Also, have you used graphical development tools like XML Spy for creating the XSLs? Do you know if the XSLs generated by such tools are efficient, or do they become messy like using FrontPage for creating HTML…

Can you see any benefits in development time using XSLT instead of the NodeIterator approach? What about reuse and debugging?


“…I am sure XSL mapping will be much faster than doing the entire mapping inside webMethods flows”

What observations or data is this statement based on?

For XSL tools, you may consider Xselerator, XSL Designer/Debugger, or XML CookTop.

XSLT in general performs worse on large files than on small ones, but carefully written XSLT on a 50 MB file should not be an issue at all.
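For reference, an XSLT mapping can be driven entirely from the JDK via `javax.xml.transform`, streaming the input and output rather than materializing result trees in memory. This is a minimal sketch, and the `<order>`-to-`<invoice>` mapping and element names are invented for illustration:

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.StringReader;
import java.io.StringWriter;

public class XslMapper {

    // A deliberately simple stylesheet: rename <order> to <invoice>,
    // keeping only the amount. Avoiding expensive patterns (e.g. "//"
    // descendant searches) is part of writing XSLT that scales.
    private static final String XSL =
        "<xsl:stylesheet version=\"1.0\" "
      + "xmlns:xsl=\"http://www.w3.org/1999/XSL/Transform\">"
      + "<xsl:output method=\"xml\" omit-xml-declaration=\"yes\"/>"
      + "<xsl:template match=\"order\">"
      + "<invoice><xsl:value-of select=\"amount\"/></invoice>"
      + "</xsl:template>"
      + "</xsl:stylesheet>";

    public static String map(String xml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(XSL)));
        StringWriter out = new StringWriter();
        // StreamSource/StreamResult let the processor read and write
        // serially instead of holding both trees as DOMs.
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(map("<order><amount>42</amount></order>"));
    }
}
```

Note that most XSLT 1.0 processors still build an in-memory tree of the *input*, so for very large files the stylesheet's access patterns (and the processor chosen) matter more than the transform API itself.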

I have no comparison data with nodeIterator.

Another consideration before choosing either approach (apart from performance) should be maintainability!

XSLT tends to be more complex.