I have this table:
CREATE TABLE SCOTT.san_clob_blob
(
  id         NUMBER NOT NULL,
  clob_field CLOB,
  CONSTRAINT PK_san_clob_blob PRIMARY KEY (id)
);
I have created this PL/SQL Stored Proc in Oracle9i:
CREATE OR REPLACE PROCEDURE san_clob_insert (rid IN NUMBER, clob_data IN CLOB) AS
  dest_var CLOB;
BEGIN
  SELECT clob_field INTO dest_var FROM san_clob_blob WHERE id = rid FOR UPDATE;
  DBMS_LOB.APPEND(dest_var, clob_data);
  COMMIT;
END;
/
I then created a JDBC Adapter Service for the above stored proc, gave the PARAM JDBC Type as NUMERIC for rid and CLOB for clob_data, and made both input types java.lang.String.
But when I try to insert bulk clob data (a string of, say, 100 KB) through this service, I get the error "ORA-01460: unimplemented or unreasonable conversion requested".
When I execute the same service with a small amount of clob data (a string such as "This is a sample string"), it gets appended perfectly.
Passing data to a stored proc has size limitations. Passing more than 32k will fail (I think the exact limitation depends on a number of factors). You'll need to resort to using Java services to store data that is larger than that. Here's an overview of an approach:

1. Change the stored proc to create an empty clob when clob_field is null.
2. In your service, call your adapter service, passing null for clob_field.
3. In your service, call a new Java service that selects the record for update, gets a clob locator, then writes the clob data.
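A minimal sketch of step 3 in plain JDBC (this assumes a JDBC 3.0-capable Oracle driver so that Clob.setString is available; the table and column names come from this thread, everything else is illustrative, and modern try-with-resources syntax is used for brevity):

```java
import java.sql.*;

public class ClobAppendService {
    // Clob positions are 1-based, so appending starts at length + 1.
    static long appendPosition(long clobLength) {
        return clobLength + 1;
    }

    /**
     * Select the row FOR UPDATE to lock it and obtain a CLOB locator,
     * then write the data through the locator and commit once.
     */
    static void append(Connection con, long id, String data) throws SQLException {
        con.setAutoCommit(false); // FOR UPDATE needs an open transaction
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT clob_field FROM san_clob_blob WHERE id = ? FOR UPDATE")) {
            ps.setLong(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) {
                    throw new SQLException("no row with id " + id);
                }
                Clob clob = rs.getClob(1);
                // write through the locator instead of binding a huge String
                clob.setString(appendPosition(clob.length()), data);
            }
        }
        con.commit();
    }
}
```

The key point is that the large data is written through the CLOB locator on the database side, so it never has to pass through a single oversized bind variable.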
I forgot to mention that I have already created an INSERT Adapter Service (named adsInsertClob) which creates a new record with an EMPTY_CLOB assigned to the clob field. Then I do exactly what you have described: in the Java service jsReadInsertFileInChunks I read a file in chunks and process each chunk, and before the first chunk I INSERT an EMPTY_CLOB. After that I use the adsAppendClob Adapter Service (the JDBC Adapter Service that corresponds to the stored proc above) and pass the chunks to its clob_data parameter.

However, when I pass large data (I found the breaking point to be 10907 bytes), it fails with the ORA-01460 error I mentioned earlier.
Just for additional info: what I am trying to do is read a huge txt file (say 10 MB) in chunks of 500 KB. For each chunk I do some processing and then insert it into a database (clob). I have to do this in chunks, including inserting/appending the clob field in the database.
I also tried creating a Custom SQL JDBC Adapter Service with the SQL as:

declare
  dest_var CLOB;
begin
  select clob_field into dest_var from san_clob_blob where id = ? for update;
  DBMS_LOB.APPEND(dest_var, ?);
  commit;
end;

I gave the JDBC Input Types as NUMERIC and CLOB respectively. Even this works for a small chunk of data but fails for larger chunks, as in the earlier case, with the same error code (ORA-01460).
But strangely, it did work the first time I tried it with 100 KB of data!
You will not be able to use a JDBC adapter service for this. You’ll need to do the select for update and then store the clob data within the Java code. The JDBC adapter doesn’t provide support for chunking data.
The downside is that you’ll need to configure the JDBC connection info somewhere that your Java service can get it. One approach is to define a WmDB connection and then read that configuration to get the connection info for your service.
As mentioned previously, I am performing the chunking in the Java service jsReadInsertFileInChunks, as you suggest. Each chunk of data (say 500 KB) now needs to be APPENDED (after the first chunk has been INSERTED) using an Adapter Service. That is, I have to invoke the Adapter Service from the Java service (jsReadInsertFileInChunks) after I have processed a chunk.

The chunking logic is working perfectly fine and I have been able to save (append) the chunks to a flat file, but the same approach is not working in the case of a database. The requirement is to store (append) the processed chunks in the database (clob field). Most recently, I tried to do this using a Custom SQL JDBC Adapter Service with the custom SQL as:
declare
  dest_var CLOB;
begin
  select clob_field into dest_var from san_clob_blob where id = ? for update;
  DBMS_LOB.APPEND(dest_var, ?);
  commit;
end;

The JDBC Input Types are NUMERIC and CLOB respectively.
When I “invoke” the above Custom SQL Adapter Service by passing the processed chunk to the clob_data input parameter, the same works only for small amounts of data chunks. However, it fails for larger chunks with the ORA-01460 error.
I think you may need to stop trying to use a JDBC adapter service to write the chunks: the number of selects and commits you'll do will make this a very slow operation. Do all the clob work within your Java service: do one select, read a chunk of the file, write it to the clob, repeat until EOF, then commit. This approach also has a chunk size limitation (I think there is a call you can make to get the max chunk size), but you'll have just one select/commit pair instead of a bunch.
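That "one select, one streaming write, one commit" approach might look like the following sketch. It assumes a JDBC 3.0-capable driver (Clob.setCharacterStream); the table and column names come from this thread, the rest of the names are illustrative:

```java
import java.io.*;
import java.sql.*;

public class ClobFileLoader {
    /** Copy everything from in to out in fixed-size chunks; returns chars copied. */
    static long copyChars(Reader in, Writer out, int chunkSize) throws IOException {
        char[] buf = new char[chunkSize];
        long total = 0;
        int n;
        while ((n = in.read(buf)) > 0) {
            out.write(buf, 0, n); // per-chunk processing would go here
            total += n;
        }
        return total;
    }

    /** One SELECT ... FOR UPDATE, stream the whole file into the CLOB, one COMMIT. */
    static void loadFile(Connection con, long id, File file)
            throws SQLException, IOException {
        con.setAutoCommit(false);
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT clob_field FROM san_clob_blob WHERE id = ? FOR UPDATE")) {
            ps.setLong(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) throw new SQLException("no row with id " + id);
                Clob clob = rs.getClob(1);
                // append after the existing content (CLOB positions are 1-based)
                try (Reader in = new BufferedReader(new FileReader(file));
                     Writer out = clob.setCharacterStream(clob.length() + 1)) {
                    copyChars(in, out, 32 * 1024);
                }
            }
        }
        con.commit();
    }
}
```

Because the chunk loop runs against the CLOB locator's output stream rather than against separate adapter-service calls, the row is selected and locked once and committed once, instead of once per chunk.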
I am also facing the same problem. I am using version 6.0.1, Oracle 9i, and thin drivers.

I need to insert CLOB data into a DB table, so I used java.sql.Clob as the input type to the adapter service. I need to convert XML data (the output of documentToXMLString) to CLOB data. Please provide a Java service for converting the data to insert into the DB table. My requirement is to use thin drivers, not OCI drivers.
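For reference, here is a plain-JDBC sketch of inserting a large XML string into a CLOB column with the thin driver, by binding it as a character stream rather than as a String. The table and column names come from this thread and the rest is illustrative; note that very old thin driver versions may still impose their own size limits, so test against your driver:

```java
import java.io.*;
import java.sql.*;

public class ClobInsertService {
    /** Wrap a string (e.g. the output of documentToXMLString) as a character stream. */
    static Reader toReader(String s) {
        return new StringReader(s);
    }

    static void insert(Connection con, long id, String xml) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO san_clob_blob (id, clob_field) VALUES (?, ?)")) {
            ps.setLong(1, id);
            // Binding as a stream avoids pushing the whole value through a
            // plain VARCHAR bind, which is what triggers ORA-01460 for large data.
            ps.setCharacterStream(2, toReader(xml), xml.length());
            ps.executeUpdate();
        }
    }
}
```

This needs no OCI-specific classes; setCharacterStream is standard JDBC and works with the thin driver.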