Hi wM Users!
I have a strange problem when trying to insert a byte array into a database: if the size of the byte array exceeds approximately 2000 bytes, the insert fails.
I use an InsertSQL adapter service (against an Oracle database), where the input column for the BLOB has BLOB as column type, BLOB as JDBC type, and byte array as input field type.
I use pub.string:stringToBytes to transform a string (the string of an XML file) into a byte array. This byte array is the input for the InsertSQL service.
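For context, pub.string:stringToBytes is essentially a String.getBytes call with the chosen encoding. A minimal sketch in plain Java (the XML payload and the UTF-8 encoding are made up for illustration):

```java
public class StringToBytesDemo {
    public static void main(String[] args) throws Exception {
        // placeholder XML payload; real documents are much larger
        String xml = "<order><id>42</id></order>";
        // equivalent of pub.string:stringToBytes with an explicit encoding
        byte[] bytes = xml.getBytes("UTF-8");
        // the round trip back to a string gives the original document
        System.out.println(new String(bytes, "UTF-8"));
    }
}
```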
Inserting works fine up to a size of ~2000 bytes. Above that I get the following Oracle error message back:
ORA-03120: two-task conversion routine: integer overflow
However, I do not really know what to do with the description of that error (see: [URL]Oracle Error Code Detail).
Does anyone know how I can set the buffer length? Or is there any other remedy for this problem?
You have to convert the XML string to an InputStreamReader, and then the insert should work. Write a simple Java service that converts the string to an InputStreamReader. It should look something like this…
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.InputStreamReader;

InputStream is = new ByteArrayInputStream(xmldata.getBytes());
InputStreamReader clob_output = new InputStreamReader(is);
BTW, I did this for a CLOB; I’m not sure if it would work for a BLOB.
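For completeness, here is a self-contained sketch of the string → stream → reader round trip described above (the xmldata value is a placeholder; for a BLOB you would bind the raw InputStream instead of wrapping it in a Reader):

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.InputStreamReader;

public class StreamRoundTrip {
    public static void main(String[] args) throws Exception {
        String xmldata = "<order><id>42</id></order>"; // placeholder XML
        // bytes of the XML document, with an explicit charset
        InputStream is = new ByteArrayInputStream(xmldata.getBytes("UTF-8"));
        // character view of the stream, e.g. for binding to a CLOB column
        InputStreamReader clobInput = new InputStreamReader(is, "UTF-8");
        // read it back to verify the round trip
        BufferedReader br = new BufferedReader(clobInput);
        System.out.println(br.readLine());
    }
}
```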
Sorry for not responding earlier; however, we have now managed to find a resolution:
Formerly we used the JDBC Thin driver from the library classes12.zip for JDBC connections to Oracle. This is an old version and seems to have this size-restriction problem.
We replaced classes12.zip with a newer version of the Oracle JDBC driver library, i.e. ojdbc14.jar (for use with JDK 1.4 and 1.5).
Now we no longer have this size restriction problem.
We had already wanted to update the driver some time ago. However, the reason classes12.zip was still in use was that, when we tried to update, we encountered a problem with TIMESTAMP data fields.
We currently cannot reproduce that error situation. This scares us a little, since it may pop up some day, and then we may have double trouble.
Does anyone know about such an issue with the ojdbc14.jar library?
You’ll want to read up on the Oracle driver’s handling of TIMESTAMP columns. The default handling has changed over various versions. Search the ’net for “Oracle JDBC driver” and “timestamp” to find the information.
I did that and found the following:
So it’s officially documented as a problem.
And the solution that I followed was none of the ones mentioned on the page above. I updated to ojdbc5.jar (after downloading it from http://www.oracle.com/technology/software/tech/java/sqlj_jdbc/htdocs/jdbc_111060.html).
At the moment I am using it on our test system, and I have not yet experienced any problems. Let’s hope it stays this way.