I am trying to load a large amount of data into the Tamino server. I have the schema defined and everything, but Tamino's memory usage seems very inefficient: my computer thrashes like crazy when I try to load an XML file of around 50MB, although a 12MB file worked fine.
Buffer pool size: 64MB
I would be very grateful if somebody could tell me what my problem is: is it because of the limit on my machine or is it the Tamino server?
BTW: I have 384MB on my machine.
Your massload problem probably occurs because of insufficient space on your machine.
The temporary working space needed is approximately 10 times the size of the raw data, i.e. for 50 MB of XML data you need ~500 MB of temporary working space. The Tamino massloader can handle files of 500MB and more; however, your machine has to provide enough working space. Please make sure that enough space is available at "Tamino Manager --> your database --> Location Settings --> Temporary Working Location".
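To make the rule of thumb concrete, here is a small sketch (in Python, not part of Tamino) that estimates the required temporary working space and checks whether the filesystem holding the Temporary Working Location has enough free room. The 10x factor and the function names are assumptions for illustration only.

```python
import shutil

def required_temp_space(xml_size_bytes, factor=10):
    """Estimate temp working space for a massload: ~10x the raw XML size."""
    return xml_size_bytes * factor

def has_enough_space(xml_size_bytes, temp_location=".", factor=10):
    """Check the free space on the filesystem that holds `temp_location`
    (point this at Tamino's Temporary Working Location)."""
    free = shutil.disk_usage(temp_location).free
    return free >= required_temp_space(xml_size_bytes, factor)

# A 50 MB document needs roughly 500 MB of working space:
mb = 1024 * 1024
print(required_temp_space(50 * mb) / mb)  # -> 500.0
```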
Well, I have gigs of free space on my hard drive, and I made sure that the working location is on that hard drive…
The current version of Tamino builds a DOM when handling each document. If you have one doc of 50MB, it will use "a lot" of memory (just try loading the doc in MS Internet Explorer!).
I have heard rumours that Tamino v4 will improve the loading of big documents.
BTW if your doc actually is 50Mb, have you considered breaking it down into smaller docs ?
Thanks a lot for the reply, Finn. Actually that’s exactly what I did, but it would be kind of hard for me to break down a doc that’s about 500MB in size… So I guess there is actually no way out of this now…
Could you tell us which version of Tamino and which operating system you use? Also, is 384MB the physical amount of memory on the machine, or the total memory available?
Software AG (UK) Ltd.
I am running Tamino 220.127.116.11 on Win2000 Pro with 384MB RAM and a 550MHz PIII CPU.