Publish and subscribe

Hi All,

I need a small clarification on a scenario regarding publish and subscribe. Right now I'm working on webMethods 6.1.

I have selected some records from a table, mapped them to a publishable document, and published the document. After that I created a trigger to subscribe to the published records. In the subscribing service I need to insert the records into another table, so I created an insert adapter service and inserted the published records into the table.
Is my approach correct, or will it become a performance issue? If so, is there another way to do it?
Please clarify my doubt.

Thanks in advance,
RKK

RKK,
Well to be completely honest there are about a million different ways to do what you are doing in the Integration Server. Typically when we approach a pub/sub situation we do the following:

  1. Extract your source data via whatever mechanism fits: JDBC notification, FTP, web service, scheduled job, TN … whatever works.

  2. Publish that data to your Broker in its raw format (note: it is possible, especially in the case of web services, that your data may already be in a canonical format).

  3. Have a subscribing service retrieve the document via a trigger and format the data into your canonical document.

  4. Publish your new canonical document.

  5. Have your subscription services pick up the document via their triggers and convert the data into the format they need for the target system.

This pattern can be repeated for most pub/sub situations. I would argue that it can and should be used in point-to-point situations as well. By putting in this canonical layer you have isolated your source system from your target system and made it easy to add more subscribers later on.
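The five steps above can be sketched with a toy in-memory broker. This is only an illustrative analog, not webMethods code: the topic names, record fields, and canonical schema here are all made up, and in a real integration the Broker, triggers, and adapter services play these roles.

```python
# Minimal in-memory sketch of the canonical pub/sub pattern.
# All topic names and field names below are hypothetical.
from collections import defaultdict

class Broker:
    """Toy stand-in for the webMethods Broker: topics -> subscriber callbacks."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, document):
        for handler in self.subscribers[topic]:
            handler(document)

broker = Broker()
inserted = []  # stands in for the target table

# Step 3: subscriber that converts the raw source record into a canonical doc
def to_canonical(raw):
    canonical = {"customerId": raw["cust_id"], "name": raw["cust_name"].title()}
    broker.publish("canonical.customer", canonical)  # step 4: republish canonical

# Step 5: subscriber that maps the canonical doc into the target system's format
def load_target(doc):
    inserted.append({"ID": doc["customerId"], "NAME": doc["name"]})

broker.subscribe("raw.customer", to_canonical)
broker.subscribe("canonical.customer", load_target)

# Steps 1-2: extract a source row and publish it in raw format
broker.publish("raw.customer", {"cust_id": 42, "cust_name": "john smith"})
print(inserted)  # [{'ID': 42, 'NAME': 'John Smith'}]
```

The point of the extra hop is visible even in the toy: a second target system only needs one more `subscribe("canonical.customer", ...)` call and never has to know anything about the source format.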

Performance: The Integration Server and the Broker are both high-performance components. Where you might run into trouble is publishing really large documents to the Broker. I would suggest using a more event-driven approach instead of batching them up, if that is possible; in your case this could be accomplished using JDBC notifications. The degree of transformation and data enrichment the data goes through will also have an effect on performance. When properly constructed, the Integration Server and Broker can handle millions of transactions per day on a relatively small platform.
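The batching-vs-events distinction can be shown in a couple of lines. This is a hypothetical sketch, not adapter code: the row contents are invented, but it shows why the event-driven style keeps each published document at a bounded size while the batch style grows with however many rows changed.

```python
# Hypothetical changed rows picked up from a source table.
changed_rows = [{"id": i, "qty": i * 10} for i in range(1, 6)]

# Batch style: one document carrying every changed row.
# Its size grows with the backlog, which is where Broker trouble starts.
batch_doc = {"rows": changed_rows}

# Event style (what a JDBC insert notification gives you):
# one small document per row, each the same bounded size.
event_docs = [{"row": row} for row in changed_rows]

print(len(batch_doc["rows"]))  # 5 rows in a single document
print(len(event_docs))         # 5 documents of one row each
```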

Of course you could do the entire extract, transform, and load in a single flow service. But then again, if you do it that way, you probably didn't need to purchase the Integration Server and Broker to start with. It's a pretty expensive batch tool.

markg
http://darth.homelinux.net

Hi Mark,

Thanks for the clarification.

I have set up a notification on the table. When the notification reports any changes to the table, I select all the records, but I publish a single record at a time, and while subscribing I insert that particular record.
So there is no possibility of getting a large document.
Once again thanks for the clarification.

Thanks & Regards,
RKK

RKK wrote:
“…table then i’m selecting all the records…there is no possibility of getting a large document.”

Are you sure? What happens when, for some reason, the integration layer is down for an extended period of time? How many rows will be waiting in the table to be published? Depending on volumes, it could indeed be possible to create a large document. Just thought I’d mention this scenario so you don’t get caught by surprise.
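One defensive option for that outage scenario is to cap the size of each published document regardless of how big the backlog has grown. A minimal sketch, assuming you select the backlog into a list and pick a chunk size yourself (the names and the 1000-row limit are illustrative, not a webMethods setting):

```python
def chunk(rows, size):
    """Split a backlog of rows into bounded-size publishable chunks."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

# e.g. 2500 rows accumulated while the integration layer was down
backlog = list(range(2500))

# Publish at most 1000 rows per document, however large the backlog is.
documents = chunk(backlog, 1000)

print([len(d) for d in documents])  # [1000, 1000, 500]
```

Each chunk would then be published as its own document, so the worst case after an extended outage is many medium-sized documents rather than one huge one.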