I am involved in an integration where the source system packages different xml transaction types into a batch. There are multiple subscribers to the individual transaction types. The two options I have are:
- Option 1: Publish the blob from the source system to the broker as a single event. A trigger is created for every target system that wants to subscribe to a transaction type within the blob. Each trigger invokes a single service that loops through the xml transactions in the blob and calls different services to map the individual transaction types to the target transaction format.
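For illustration, option 1 might look like the following sketch, in plain Python rather than webMethods flow. Every name here (the batch layout, map_order, map_invoice) is hypothetical; the point is just one service receiving the whole blob and dispatching in memory:

```python
# Hypothetical sketch of option 1: one subscriber service gets the whole
# batch blob and loops through it, dispatching each transaction to a
# type-specific mapping function. All names and the XML layout are
# illustrative, not the actual document types.
import xml.etree.ElementTree as ET

def map_order(txn):
    # Map an <order> transaction to the target format (illustrative).
    return {"type": "order", "id": txn.get("id")}

def map_invoice(txn):
    # Map an <invoice> transaction to the target format (illustrative).
    return {"type": "invoice", "id": txn.get("id")}

MAPPERS = {"order": map_order, "invoice": map_invoice}

def handle_batch(blob_xml):
    """Loop through the batch in memory and map each transaction type."""
    results = []
    for txn in ET.fromstring(blob_xml):
        mapper = MAPPERS.get(txn.tag)
        if mapper:  # silently skip types this subscriber doesn't handle
            results.append(mapper(txn))
    return results

batch = "<batch><order id='1'/><invoice id='2'/></batch>"
print(handle_batch(batch))
```

The trade-off is that every subscriber's service re-parses the whole blob, even for transaction types it does not care about.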
- Option 2: Publish the blob to the broker, with a trigger that calls a service to loop through it and publish the individual transaction types back to the broker. One trigger and service is created to subscribe to each individual transaction type on the broker. These services map the xml to the target transaction format.
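Option 2 could be sketched like this, again in plain Python with a toy in-memory broker standing in for the real message broker. The Broker class, document names, and handlers are all illustrative:

```python
# Hypothetical sketch of option 2: a splitter service republishes each
# transaction in the batch as its own document type, and one subscriber
# per type does the mapping. The Broker class is a toy stand-in.
import xml.etree.ElementTree as ET
from collections import defaultdict

class Broker:
    """Toy pub/sub broker: routes a published doc to its subscribers."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, doc_type, handler):
        self.subscribers[doc_type].append(handler)

    def publish(self, doc_type, payload):
        for handler in self.subscribers[doc_type]:
            handler(payload)

broker = Broker()
received = []

# One trigger/service pair per transaction type (illustrative handlers).
broker.subscribe("order", lambda xml: received.append(("order", xml)))
broker.subscribe("invoice", lambda xml: received.append(("invoice", xml)))

def split_and_publish(blob_xml):
    """Splitter service: republish each transaction under its own type."""
    for txn in ET.fromstring(blob_xml):
        broker.publish(txn.tag, ET.tostring(txn, encoding="unicode"))

split_and_publish("<batch><order id='1'/><invoice id='2'/></batch>")
print(received)
```

Here the blob is parsed once, and each mapping service only ever sees the transaction type it subscribed to, at the cost of an extra publish round-trip per transaction.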
So the questions are: is one trigger/service/document type better than having multiple triggers/services/document types, one per transaction type?
Is looping through an xml blob in memory better than publishing individual transaction types to the broker (which causes more disk I/O)?
Is publishing a blob as a single event, rather than as individual transactions, a good way of implementing pub/sub?
Thanks,
Mow