Batch loads / file feeds performance

I am considering file feeds/batch loads of flat files into our DB, to be read by webMethods, with our middle tier applying some business logic to the data so we don't duplicate validation procedures, etc.

My question is:

a) Has anyone tried this, and what was the performance like? Could a few tens of thousands of records be handled this way in a reasonable amount of time (i.e. 30-60 minutes)?
b) What is the best practice in this case?


I have done a couple of batch integrations with webMethods. You can achieve your requirements well within that timeframe. It also depends on how you design your application, e.g. pub/sub, and whether any transformation is required.

You can find a document on ETL processing in the GEAR materials that provides a bit more information.



I have worked on a couple of webMethods flat file batch integrations and didn't see any performance issues. This kind of integration does require database tuning, though.
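As a rough illustration of why tens of thousands of records are well within that timeframe, here is a minimal sketch of a staged batch load: parse a delimited flat file and insert the rows with a batched statement rather than one round trip per record. This is not webMethods code; it uses Python's stdlib sqlite3 as a stand-in for the target DB, and the table/field names are made up for the example. The same pattern (parse, then batch insert into a staging table, then let the middle tier validate) applies whatever the actual stack is.

```python
import sqlite3
import time

# Hypothetical pipe-delimited flat file content; in practice this
# would be streamed from the feed file on disk.
records = [f"{i}|customer_{i}|2024-01-{i % 28 + 1:02d}" for i in range(50_000)]

conn = sqlite3.connect(":memory:")  # stand-in for the real target DB
conn.execute("CREATE TABLE staging (id INTEGER, name TEXT, load_date TEXT)")

start = time.time()
# Batched insert: one prepared statement, many parameter rows,
# instead of 50,000 individual INSERT round trips.
rows = (line.split("|") for line in records)
conn.executemany("INSERT INTO staging VALUES (?, ?, ?)", rows)
conn.commit()
elapsed = time.time() - start

count = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
print(f"loaded {count} rows in {elapsed:.2f}s")
```

Even on modest hardware this loads 50,000 rows in well under a second locally; against a remote DB the batching (and commit frequency) is what keeps the total run inside the 30-60 minute window.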

Mike, could you point out the GEAR document that discusses "ETL processing" in webMethods?



It's in the GEAR 6 docs, maybe in the whitepaper/tools section.