Notification using Enterprise versus just using IS

Suppose a client has an application that inserts rows into an Oracle database. They want to send this data to a third party as XML for validation, but are happy with sending it once a day. They don't anticipate that any other app will be interested in this new data.

What are some of the issues you can think of regarding using Enterprise Server with DB notification versus just creating a trigger to insert into a buffer table and having IS query this database once a day, transform to XML, send to the partner, and flag the record? I've got a fairly good idea of how I would do this with only IS and possibly TN. I'm thinking about guaranteed delivery issues, errors in adapters vs. errors in flow services, complexity, scalability, performance, etc.

Assuming Enterprise Server were used, does the Broker send the notification to IS for transformation and delivery? Does a component subscribe to the notification events and create the XML? Who bundles up the events for one daily delivery? I would appreciate a description of a typical flow between all the components.
Thx

1) Using the B-E bridge (the B2B-Enterprise bridge), the Broker can invoke an IS service, and that service can be coded to do the transformation and delivery to the partners.
2) Rather than writing a component that subscribes to the event, you can configure things so that the event triggers the IS service, which does the work of XML manipulation, routing, etc.

I would recommend using IS. This keeps the number of components down. When you upgrade to 6.0, you can convert to using the DB adapter (if necessary) with minimal impact to the existing implementation.

  1. Create a buffer table.
  2. Add a trigger to the table being monitored to write the key (rowid or whatever makes sense for the data) of the inserted/updated/deleted record to the buffer table (see the SQL sketch after this list).
  3. Add a scheduled service to IS to poll the buffer table.
  4. For each row in the buffer table, the scheduled service passes the key to a transformation service.
  5. The transformation service reads the data from the table for the provided key, creates the proper XML and sends it on its way (via TN perhaps).
  6. Update/delete the entry in the buffer table so it isn’t processed again.
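
To make steps 1 and 2 concrete, here is a minimal SQL sketch. It assumes a hypothetical source table named ORDERS with a primary key ORDER_ID; all table and column names are placeholders, not something from the original post.

```sql
-- Step 1: buffer table holding just the keys of changed rows
-- (names are illustrative placeholders).
CREATE TABLE orders_buffer (
    order_id    NUMBER                      NOT NULL,
    change_type VARCHAR2(1)                 NOT NULL, -- 'I', 'U', or 'D'
    created_at  DATE        DEFAULT SYSDATE NOT NULL,
    processed   CHAR(1)     DEFAULT 'N'     NOT NULL
);

-- Step 2: trigger on the monitored table that records the key of
-- each inserted/updated/deleted row in the buffer table.
CREATE OR REPLACE TRIGGER orders_notify
AFTER INSERT OR UPDATE OR DELETE ON orders
FOR EACH ROW
BEGIN
    IF DELETING THEN
        INSERT INTO orders_buffer (order_id, change_type)
        VALUES (:OLD.order_id, 'D');
    ELSIF UPDATING THEN
        INSERT INTO orders_buffer (order_id, change_type)
        VALUES (:NEW.order_id, 'U');
    ELSE
        INSERT INTO orders_buffer (order_id, change_type)
        VALUES (:NEW.order_id, 'I');
    END IF;
END;
/
```

With these two pieces in place, the scheduled service in steps 3 through 6 only ever touches ORDERS_BUFFER with plain SELECT, UPDATE, and DELETE statements, which keeps the IS side simple.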

Or if you want to do batch-style processing:

1 & 2 as above.
3. Add a scheduled service that invokes your transformation service.
4. The transformation service reads all rows in the buffer table.
5. The transformation service creates the XML file and sends it.
6 as above.
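
The SQL side of the batch variant might look like the following, reusing the placeholder ORDERS_BUFFER schema from the sketch above (CUSTOMER_ID and AMOUNT are made-up detail columns):

```sql
-- Step 4: read every pending row in one pass, joining back to the
-- source table for the full record. Rows that were deleted from
-- ORDERS come back with NULL detail columns.
SELECT b.order_id, b.change_type, o.customer_id, o.amount
FROM   orders_buffer b
LEFT   JOIN orders o ON o.order_id = b.order_id
WHERE  b.processed = 'N'
ORDER  BY b.created_at;

-- Step 6: flag the batch only after the XML has been sent.
-- Note the window between the SELECT and this UPDATE: a row that
-- arrives in between would be flagged without ever being sent.
-- The sequence-number idea below closes that gap.
UPDATE orders_buffer
SET    processed = 'Y'
WHERE  processed = 'N';
```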

You may want to add some sort of sequence number to both the buffer table and on the transformation service side. This can be used to avoid duplicates in the event of failures that could occur after the data has been read and sent but before the buffer table was updated.
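
One way to implement that, again using the placeholder schema from the sketches above (everything named here is an assumption, not part of the thread): number each buffer row from an Oracle sequence, have the transformation service persist the highest number it has successfully delivered, and skip everything at or below it on the next run.

```sql
-- Unique, monotonically increasing number for every buffer row.
CREATE SEQUENCE orders_buffer_seq;
ALTER TABLE orders_buffer ADD (seq_no NUMBER);

-- The trigger assigns the next value as it writes each buffer row,
-- e.g. for the INSERTING branch:
--   INSERT INTO orders_buffer (order_id, change_type, seq_no)
--   VALUES (:NEW.order_id, 'I', orders_buffer_seq.NEXTVAL);

-- The service reads only rows newer than the last confirmed send
-- (:last_sent_seq is persisted on the IS side after each successful
-- delivery), so a failure between "sent" and "flagged" cannot make
-- the same rows go out twice.
SELECT b.order_id, b.change_type, b.seq_no
FROM   orders_buffer b
WHERE  b.seq_no > :last_sent_seq
ORDER  BY b.seq_no;
```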

These steps mostly mimic what the DB notification does. When you upgrade to 6.0 you can replace steps 1-3 with the DB adapter (I'm assuming the adapter will not require publishing an event to the Broker and can instead do all its work within IS; someone correct me if I'm wrong).

For step 2 described by nani, there is still a component with a subscription: the B-E Package. ES almost always uses pub/sub (the "deliver" model is rarely used, if ever, for anything other than replies). Of course, 6.0 eliminates the B-E Package, but then an IS component registers the subscription.

HTH.

Will: what did you end up doing for this need?

Thanks Rob. Let’s just say that for my original question the need for this never materialized. ;o)

However, we are doing what you mentioned on my current contract: reading from a shadow table, inserting the records into SAP, and then deleting the records, all without ES Broker/adapter components.

Will