Is it possible to populate high-volume, large files into a DB?


I have a scenario where XML files of about 50 KB (transformed from flat files) arrive in very high volume, say every 5 minutes, and need to be populated into the database.

Is it possible to populate XML files of that size directly into the database without issues (such as adapter failure when handling high volume or large files), or should I consider dumping them to a file share and scheduling webMethods later to populate those files into the database?

The idea of the file share is just to make sure no files are lost in case there is a problem with the JDBC Adapter handling high-volume files.

I appreciate your valuable suggestions.



If you are trying to store the whole XML file (as a string) in the DB, try storing the XML in a BLOB column. As for the high transaction volume (hitting the DB every 5 minutes), that depends on the connection pooling (max/min connections) configured for your backend.

So invoking the DB within that 5-minute window is not a big deal.
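The whole-file-as-BLOB approach can be sketched outside webMethods in a few lines of Python (SQLite is used here as a stand-in database; the table and column names are hypothetical):

```python
import sqlite3

# In-memory SQLite as a stand-in; in practice this would be the pooled
# backend connection managed by the JDBC Adapter.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE xml_store (id INTEGER PRIMARY KEY, doc BLOB)")

# The whole XML document goes into a single BLOB column, unparsed.
xml_bytes = b"<order><id>42</id></order>"  # ~50 KB in the real scenario
conn.execute("INSERT INTO xml_store (doc) VALUES (?)", (xml_bytes,))
conn.commit()

# Read it back to confirm the round trip preserves the bytes.
(stored,) = conn.execute("SELECT doc FROM xml_store WHERE id = 1").fetchone()
print(stored == xml_bytes)  # True
```

Storing the file as an opaque BLOB keeps the insert cheap regardless of the document's internal structure; the trade-off is that the DB can no longer query individual fields.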

Just some thoughts,


Actually, the XML files are more than 350 KB, and I am trying to store the XML as rows/columns, not as a string, in the DB. Does that have any effect on invoking the DB (hitting it every 5 minutes)?

Suggestions please.


I believe IS should have no problem invoking the DB with that file size (hitting the DB every 5 minutes is not a big deal).
Also make sure your flow is designed so that your mapping, transformation, record looping, and business rules follow best practices, and that server tuning is in place so processing runs smoothly.

Just some thoughts,

One of the things people tend to do is loop over every record (in-array = XML, out-array = DB). That tends to eat up CPU. IS does a lot of "implicit looping", and the DB operations can be done in bulk: pub.db:insert can insert multiple rows, and if you're using the JDBC Adapter there's a BatchInsert template. Avoid explicit loops as much as possible, since they are more costly, and row-by-row inserts greatly hinder DB performance.
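The batch-versus-row-by-row point above can be illustrated outside webMethods with the Python DB-API, where `executemany` plays roughly the role of the JDBC Adapter's BatchInsert (SQLite as a stand-in; table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id TEXT, amount REAL)")

# All parsed records from the XML file, mapped to rows up front.
rows = [("A-1", 10.0), ("A-2", 20.5), ("A-3", 7.25)]

# One bulk call instead of an explicit loop issuing one INSERT per row;
# the driver sends the whole batch, cutting per-row round-trip overhead.
conn.executemany("INSERT INTO orders (order_id, amount) VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 3
```

The same idea in raw JDBC would use `PreparedStatement.addBatch()` followed by a single `executeBatch()`; either way, the win comes from amortizing the per-statement overhead across many rows.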