Cannot insert byte[] bigger than ~2kb into BLOB

Hi wM Users!

I have a strange problem when trying to insert a byte array into a database: the insert fails whenever the byte array exceeds roughly 2000 bytes.

I use an InsertSQL adapter service (against an Oracle database), where the input column for the BLOB has BLOB as the column type, BLOB as the JDBC type, and byte array as the input field type.
I use pub.string:stringToBytes to transform a string (the string content of an XML file) into a byte array. This byte array is the input for the InsertSQL service.
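For reference, the conversion that pub.string:stringToBytes performs is essentially a plain String-to-byte[] encoding step. A minimal sketch in plain Java (the class name and the choice of UTF-8 are my own assumptions, not part of the service):

```java
import java.nio.charset.StandardCharsets;

public class XmlToBytes {
    // Encode an XML string into a byte array with an explicit charset.
    // pub.string:stringToBytes does something equivalent; pinning the
    // charset avoids surprises from platform-default encodings.
    public static byte[] toBytes(String xml) {
        return xml.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String xml = "<order><id>42</id></order>";
        byte[] data = toBytes(xml);
        // For pure-ASCII XML, the byte count equals the character count.
        System.out.println(data.length);
    }
}
```

Note that the resulting byte length, not the string length, is what counts against the adapter's size limit.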

Inserting works fine up to a size of ~2000 bytes. Beyond that I get the following Oracle error message back:

ORA-03120: two-task conversion routine: integer overflow

However, I do not really know what to do with the description of that error (see:

Does anyone know how I can set the buffer length? Or is there any other remedy for this problem?


You have to convert the XML string to an InputStreamReader, and then the insert will work. Write a simple Java service that converts the string to an InputStreamReader. It should look something like this…

InputStream is = new ByteArrayInputStream(xmldata.getBytes());
InputStreamReader clob_output = new InputStreamReader(is);

BTW, I did this for a CLOB; I'm not sure whether it works for a BLOB.

Hi salvo!

Sorry for not responding earlier; however, we have now found a resolution:
Previously we used the JDBC Thin Driver from the library for JDBC connections to Oracle. This is an old version and seems to have this size restriction problem.
We replaced it with a newer version of the Oracle JDBC driver library, namely ojdbc14.jar (for use with JDK 1.4 and 1.5).
Now we no longer have the size restriction problem.

We had already wanted to update the driver some time ago. However, the reason we kept the old library in use was that when we tried to update, we ran into a problem with TIMESTAMP data fields.
We currently cannot reproduce that error situation. This scares us a little, since it may pop up some day and then we may have double trouble.

Does anyone know about such an issue with the ojdbc14.jar library?

kindest regards,

You’ll want to read up on the Oracle driver’s handling of TIMESTAMP columns. The default mapping has changed over various driver versions. Search the 'net for "Oracle JDBC driver" and "timestamp" to find the information.
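To make the pitfall concrete: if a driver hands a TIMESTAMP column back as a java.sql.Date (as some older Oracle drivers defaulted to), fractional seconds are silently truncated to millisecond precision or lost entirely. A minimal standalone illustration, independent of any driver (values and class name are my own):

```java
import java.sql.Date;
import java.sql.Timestamp;

public class TimestampMapping {
    public static void main(String[] args) {
        // A TIMESTAMP value with nanosecond precision.
        Timestamp ts = Timestamp.valueOf("2008-05-14 13:45:30.123456789");
        System.out.println(ts.getNanos()); // prints 123456789

        // Round-tripping through the millisecond-based getTime() value,
        // as happens when the driver maps the column to java.sql.Date,
        // keeps only the millisecond part of the fraction.
        Date d = new Date(ts.getTime());
        Timestamp truncated = new Timestamp(d.getTime());
        System.out.println(truncated.getNanos()); // prints 123000000
    }
}
```

So depending on which mapping the driver version defaults to, code that compares or stores sub-second values can start behaving differently after a driver upgrade.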

Thanks Reamon!

I did that and found the following:
So it's officially documented as a problem.
The solution I followed was none of the ones mentioned on that page: I updated to ojdbc5.jar (after downloading it from
At the moment I am using it on our test system and have not experienced any problems yet. Let's hope it stays this way.