Using JDBC 4.2 against tables with >100 fields

We have a number of cases where we need to build configured operations against tables that have more than 100 (in some cases more than 300) fields. In a prior project I was unable to get the adapter to introspect these large tables. Even after adding memory to the adapter and to EI, and after trying to add a single field, save, then autofill, one field at a time, we have never been able to successfully introspect these large schemas.

A workaround we used previously was to build the configured operation in VI, then take the broken component and build the WHERE clauses, etc., in EI. That approach, however, is not working for the largest schema (389 fields).

Has anyone had any luck with, or found a resolution for, similar issues? Watching the JDBC adapter in debug mode, it appears to be doing a very laborious traversal of the full schema for each element, so time seems to be the crucial factor. Even extending the timeout in EI didn't resolve it.
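One avenue I've been considering, sketched below, is to bypass the adapter's introspection entirely: standard JDBC lets you pull all column names for a table in a single `DatabaseMetaData.getColumns` call, and from that list you can generate the SELECT text yourself rather than relying on autofill. This is only a rough sketch, not the adapter's own mechanism; the class and method names (`WideTableSelect`, `columnNames`, `buildSelect`) are hypothetical.

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.ResultSet;
import java.util.ArrayList;
import java.util.List;

public class WideTableSelect {

    // One metadata call for the whole table, instead of a per-field traversal.
    // Hypothetical helper; table name matching may be case-sensitive per driver.
    static List<String> columnNames(Connection conn, String table) throws Exception {
        DatabaseMetaData md = conn.getMetaData();
        List<String> cols = new ArrayList<>();
        try (ResultSet rs = md.getColumns(null, null, table, "%")) {
            while (rs.next()) {
                cols.add(rs.getString("COLUMN_NAME"));
            }
        }
        return cols;
    }

    // Build the SELECT statement text directly from the column list,
    // appending an optional WHERE clause supplied by the caller.
    static String buildSelect(String table, List<String> cols, String where) {
        StringBuilder sql = new StringBuilder("SELECT ");
        sql.append(String.join(", ", cols)).append(" FROM ").append(table);
        if (where != null && !where.isEmpty()) {
            sql.append(" WHERE ").append(where);
        }
        return sql.toString();
    }
}
```

With 389 fields, the generated statement is long but cheap to build; the expensive per-element introspection never happens.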

Any help would be excellent!

Sincerely,

Scott

Basically, I've moved away from using the SELECT operation for that very reason.