How to handle a Blob field of a table using the built-in DB services

I'd like to know if any of you have experience handling a Blob field of an Oracle table (insert, select, and update) using the built-in DB services.

I am trying to insert a binary file into a Blob column called PICTURE. I have created a DB alias with all the connection information. My simple flow service looks like this:

(1) pub.file:getFile: map the service output parameter body.bytes to a pipeline output parameter called PICTURE, defined as an object type.

(2) pub.db:execSQL: this is generated using Server Administrator's "generate from table" option. It sets dbParamType to OTHER for PICTURE.

When I run it, it works fine with a small file (< 1 K), but once I use a bigger file (30+ K) I always get "java.sql.SQLException: Io exception: Connection aborted by peer: socket write error".

I tried changing the dbParamType to other types, but then I always get an Invalid Column exception.

I'd like to know if any of you have had a similar experience using the built-in services, or whether I need to write a Java service to handle this case.

Hi Kevin,

I have had a similar experience while working with CLOBs, which would be similar to the BLOBs you are using. There are a number of ways to manipulate CLOBs in an Oracle table.

You can write a stored procedure in Oracle that performs the insert, select, or update (remembering that they would have to be separate procedures). The stored procedure can be called by invoking pub.db:call from the WmDB package.
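For what it's worth, here is a rough standalone JDBC sketch of what calling such a stored procedure amounts to. The procedure name INSERT_PICTURE, its signature, the PHOTOS table, and the connection details are all invented for illustration; pub.db:call would be doing the equivalent parameter binding for you inside the flow service.

import java.io.*;
import java.sql.*;

public class CallBlobProcedure {
    public static void main(String[] args) throws Exception {
        // Hypothetical Oracle thin-driver connection details.
        Class.forName("oracle.jdbc.driver.OracleDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");

        // Read the file that should end up in the PICTURE column.
        File file = new File(args[0]);
        byte[] bytes = new byte[(int) file.length()];
        DataInputStream in = new DataInputStream(new FileInputStream(file));
        in.readFully(bytes);
        in.close();

        // INSERT_PICTURE(p_id IN NUMBER, p_picture IN BLOB) is an invented procedure
        // that would wrap the LOB handling on the database side.
        CallableStatement cs = conn.prepareCall("{ call INSERT_PICTURE(?, ?) }");
        cs.setInt(1, 1);
        cs.setBytes(2, bytes);
        cs.execute();

        cs.close();
        conn.close();
    }
}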

Another method is to write your own Java services that connect directly to the database and perform the insert, select, and update. This is the method I used. Let me know if you need a hand with the Java services should you choose to write your own.
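To give you an idea, a hand-written Java service for this is mostly boilerplate around plain JDBC. The sketch below shows roughly the shape such a service could take; the service name, table, column names, and connection details are made up, so treat it as a starting point rather than the exact code, and check the IData/IDataUtil calls against your IS version.

// Imports go in the service's shared section in Developer.
import com.wm.data.*;
import com.wm.app.b2b.server.ServiceException;
import java.sql.*;

public static final void insertPicture(IData pipeline) throws ServiceException {
    // Pull the inputs off the pipeline; "picture" would be the bytes mapped
    // from pub.file:getFile's body.bytes output.
    IDataCursor cursor = pipeline.getCursor();
    String id = IDataUtil.getString(cursor, "id");
    byte[] picture = (byte[]) IDataUtil.get(cursor, "picture");
    cursor.destroy();

    Connection conn = null;
    try {
        Class.forName("oracle.jdbc.driver.OracleDriver");
        // Hypothetical URL/credentials; a real service would read these from configuration.
        conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO PHOTOS (ID, PICTURE) VALUES (?, ?)");
        ps.setString(1, id);
        ps.setBytes(2, picture); // for large files, use a streaming LOB approach instead
        ps.executeUpdate();
        ps.close();
    } catch (Exception e) {
        throw new ServiceException(e.toString());
    } finally {
        try { if (conn != null) conn.close(); } catch (SQLException ignore) { }
    }
}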

I suspect that a file over 30 K exceeds what the parameter type can hold, which is why it threw the exception. I'm not sure why it surfaced as an IO exception, though; I'd have to investigate a bit more.

Hi Alex,

Thanks for your reply. Which Oracle driver do you use? I am using the Oracle thin driver, and I suspect it may be a driver problem. I haven't tried the OCI driver yet, but on Oracle's site (Technical Resources | Oracle) they mention that accessing a Blob column with the thin driver is slower, though they don't mention any size issue.

I did write a standalone application using JDBC to insert into the Blob column. I tried two approaches based on information I found on the web. The first is to use PreparedStatement.setBinaryStream() to set the parameter for the Blob column. When I tried this, I got exactly the same problem I encountered with the webMethods execSQL service.
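A minimal sketch of that first approach, assuming an invented PHOTOS (ID, PICTURE) table and made-up connection details:

import java.io.*;
import java.sql.*;

public class BlobInsertStream {
    public static void main(String[] args) throws Exception {
        Class.forName("oracle.jdbc.driver.OracleDriver");
        // Hypothetical connection details.
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");

        File file = new File(args[0]);
        FileInputStream in = new FileInputStream(file);

        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO PHOTOS (ID, PICTURE) VALUES (?, ?)");
        ps.setInt(1, 1);
        // Bind the file contents as a binary stream; this is the call that fails
        // for larger files over the thin driver.
        ps.setBinaryStream(2, in, (int) file.length());
        ps.executeUpdate();

        ps.close();
        in.close();
        conn.close();
    }
}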

The second approach is a proprietary one that uses Oracle's BLOB class directly. It involves inserting an empty Blob first using Oracle's empty_blob() SQL function, then selecting that record for update. ResultSet.getBlob() retrieves the column, which can be downcast to Oracle's BLOB class (oracle.sql.BLOB). That class implements an extra method, getBinaryOutputStream(), which returns an OutputStream; I can write the file to this OutputStream and finally commit. This method works fine with files of any size. Because of this, I suspect the driver even more: possibly the thin driver does not handle setBinaryStream() well for Blob columns, but I am not quite sure.
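A minimal sketch of that second approach, again against an invented PHOTOS (ID, PICTURE) table; it needs Oracle's driver classes on the classpath so that oracle.sql.BLOB resolves:

import java.io.*;
import java.sql.*;

public class BlobInsertOracle {
    public static void main(String[] args) throws Exception {
        Class.forName("oracle.jdbc.driver.OracleDriver");
        // Hypothetical connection details.
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:ORCL", "scott", "tiger");
        conn.setAutoCommit(false);

        Statement stmt = conn.createStatement();
        // 1. Insert a placeholder row holding an empty BLOB locator.
        stmt.executeUpdate("INSERT INTO PHOTOS (ID, PICTURE) VALUES (1, empty_blob())");

        // 2. Re-select the row FOR UPDATE to lock it and get the locator back.
        ResultSet rs = stmt.executeQuery(
                "SELECT PICTURE FROM PHOTOS WHERE ID = 1 FOR UPDATE");
        rs.next();
        oracle.sql.BLOB blob = (oracle.sql.BLOB) rs.getBlob(1);

        // 3. Stream the file into the BLOB through Oracle's proprietary output stream.
        OutputStream out = blob.getBinaryOutputStream();
        InputStream in = new FileInputStream(args[0]);
        byte[] buf = new byte[blob.getBufferSize()];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        in.close();
        out.close();

        // 4. Commit only after the stream has been closed.
        conn.commit();
        rs.close();
        stmt.close();
        conn.close();
    }
}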

So it appears that writing a Java service is the way to go, or I have to give the OCI driver a try to see if it works with the execSQL service.

We used the SQLJ driver and concocted our own BLOB insert method.

Ray, were you able to use the built-in webMethods database connection pool?

I’d like to hear more about what your team did. Thanks.

We handled the connection ourselves, through the custom SQL-J package, opening and closing each connection. Not very efficient, but neither is moving BLOBs back and forth. We actually wrote CLOBs to Oracle rather than BLOBs, but the principles are the same.

We tried to use the built-in WM stuff in every way possible but could not get it to cooperate. One of the guys on the team is a Java genius, and he wrote the .jar file in about 4 days.

I can’t take the credit for the work on this one, but I know how it was built.