Failed to load a 134M file

My database’s buffer pool size is set to 800M.
My machine's physical memory is 2G, which is set in the shxmmx parameter.

I tried to load this 134M file using inoxplorer (I modified the command with the Java options -Xms1024m -Xmx2048m), but it still gives me a java.lang.OutOfMemoryError after a long time running, almost 2 hours.

I also tried to run a Java app to load the file, also with high -Xms/-Xmx settings, and after 20 minutes it gives me the error "transaction is taking too much time" and fails.

Does anyone have experience loading such big files into Tamino? I actually have thousands of files of this size that need to be loaded.


XPlorer is not really tuned to handle such large files. Please use inoload (the Tamino Data Loader) instead.



Thanks for the suggestion. I tried using inoload to load a 50M file, and that worked fine. However, when I tried to load my 134M file again, after about 20 minutes I got a "memory allocation failed" error like the following:

<ino:message ino:returnvalue="8501" 1" ino:docname="medline06n0341"><ino:messagetext ino:code="INOXME8501">Memory allocation fai
<ino:message ino:returnvalue="0">
<ino:messagetext ino:code="INODCI6565">Data loading completed, number of documents processed 1, loaded 0, rejected 1</ino:mess
<ino:messagetext ino:code="INODCI6564">Session 89 ended</ino:messagetext>
<ino:messagetext ino:code="INODCI6546">Elapsed time: 271 second(s), data processed: 123.70 MB</ino:messagetext>
<ino:message ino:returnvalue="6578">
<ino:messagetext ino:code="INODCE6578">Tamino Data Loader finished with errors at 2006-07-20T09:49:52</ino:messagetext>


Unfortunately Tamino has some problems with "big" files. As you have already noticed, the performance is not the best and the memory consumption is quite high. My suggestion is to refactor the document structure to make the individual documents smaller. It is hard to give a number for the optimal document size because it depends on various parameters, but as a rule of thumb I would suggest a document size no bigger than 10 MB.
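One way to get documents down to that size is to split each large file into smaller ones before loading, grouping the repeating record elements into chunks. Here is a minimal sketch using Python's streaming parser; the element names `MedlineCitation` / `MedlineCitationSet` are assumptions based on the file name in the log above, so adjust them (and the size limit) to your actual document structure:

```python
# Sketch: stream a large XML file and regroup its record elements into
# smaller documents, each roughly under a size limit (~10 MB by default).
# "MedlineCitation" and "MedlineCitationSet" are assumed names -- change
# them to match the real record element and wrapper element of your files.
import os
import xml.etree.ElementTree as ET

def split_xml(src_path, out_dir, record_tag, wrapper_tag,
              max_bytes=10 * 1024 * 1024):
    """Write groups of <record_tag> elements from src_path into numbered
    files in out_dir, flushing a new file once max_bytes is reached."""
    os.makedirs(out_dir, exist_ok=True)
    chunk, size, count, written = [], 0, 0, []
    for _event, elem in ET.iterparse(src_path, events=("end",)):
        if elem.tag == record_tag:
            data = ET.tostring(elem, encoding="unicode")
            chunk.append(data)
            size += len(data)
            elem.clear()  # release the parsed subtree to keep memory flat
            if size >= max_bytes:
                written.append(_flush(out_dir, wrapper_tag, chunk, count))
                chunk, size = [], 0
                count += 1
    if chunk:  # write whatever is left over
        written.append(_flush(out_dir, wrapper_tag, chunk, count))
    return written

def _flush(out_dir, wrapper_tag, chunk, count):
    """Wrap the collected records in wrapper_tag and write one part file."""
    path = os.path.join(out_dir, "part%04d.xml" % count)
    with open(path, "w", encoding="utf-8") as f:
        f.write("<%s>\n" % wrapper_tag)
        f.writelines(chunk)
        f.write("</%s>\n" % wrapper_tag)
    return path
```

Each resulting part file can then be loaded with inoload individually; since your 50M test loaded fine, parts of around 10 MB should go through without the memory allocation error.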

Best Regards,