Hi All,
Here is the situation; I am not sure which webMethods components are best suited for it:
A source system calls a webMethods interface based on certain conditions, expecting it to publish the event to the target system. The events are expected to sit in a queue and be available as and when the target system needs to pick them up. The queue will need to hold a certain number of events at all times, around 1,000, with a worst case of 10,000 (in case of a backlog or a target system server shutdown).
Is this possible in webMethods?
If yes, then which webMethods components are required (Broker or MQ or something else)?
Broker and IS. webMethods does not have a product with “MQ” in the name.
There is a concepts guide and a component overview doc that can be very helpful in understanding the wM product line. Browse around on the wM site (or the Advantage site if you have access).
Thank you much to both of you for the quick response.
Broker was my first thought as well, but how would this case be handled in Broker?
“The events are expected to be in a queue, and be available as and when target system needs to pick them up”…
This describes precisely the primary function of the Broker. This is what it does. Documents are published to the Broker, the Broker places the document on one or more subscribing queues. The documents remain in those queues until the adapter (IS in the usual case) picks them up.
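To make the pattern Rob describes concrete, here is a minimal in-memory sketch of Broker-style pub-sub. This is purely conceptual (the class and method names are illustrative, not the real Broker API): documents are published, copied onto each subscriber's queue, and remain queued until the subscriber picks them up.

```python
from collections import deque

class MiniBroker:
    """Conceptual sketch of Broker-style pub-sub (not the real Broker API):
    each subscriber gets its own queue, and published documents stay
    queued until that subscriber picks them up."""

    def __init__(self):
        self.queues = {}  # subscriber name -> deque of documents

    def subscribe(self, subscriber):
        self.queues.setdefault(subscriber, deque())

    def publish(self, document):
        # A published document lands on every subscriber's queue.
        for q in self.queues.values():
            q.append(document)

    def pick_up(self, subscriber):
        # Documents remain queued until the subscriber retrieves them.
        q = self.queues[subscriber]
        return q.popleft() if q else None

broker = MiniBroker()
broker.subscribe("targetSystem")
broker.publish({"event": "orderCreated", "id": 1})
broker.publish({"event": "orderCreated", "id": 2})
# Nothing is lost while the target system is away; it pulls when ready.
first = broker.pick_up("targetSystem")
```

The key point of the sketch is the last step: the queue absorbs the backlog, and the subscriber (normally an IS trigger) drains it at its own pace.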
Hi Rob,
That's absolutely right about the Broker functionality. In the usual case, once a document is published to the Broker it is picked up by a trigger, which executes the service defined in it. When the triggered service completes successfully, the document is deleted from the Broker queue.
But in our case the target system needs to pull the documents published to the Broker, perhaps through a webMethods service that the target system invokes as and when it requires the documents (rather than the documents being processed by the trigger defined for that publishable document type). In the usual pub-sub case, the trigger runs automatically as soon as the document is published to the Broker.
Broker is not really meant to be used that way. It is not a long-term storage mechanism (yes, it persists documents, but it is not meant to store data like a database), and it is not designed to be queried on a pull basis by an external source. You could probably rig something up with an external Broker client, but even then I don't think that is the best architecture. Without an active subscription (i.e., a trigger, in IS terms) the documents have nowhere to go, hence no queue. Although you could do something with the Broker API, again I am not sure that is the best way to go in this case.
There are probably a hundred other ways to do this; a couple:
1. Have the trigger fire a service that populates a database table, and have the external system poll that table.
2. Have the trigger fire a service that stores the data on a remote file system, and have the external system pull the data from there.
3. Modify the external app so that it can receive a push instead of having to poll. Is there any reason you can't do that? Is the app not up all the time? Is there other processing involved that has to complete first?
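Option 1 (a staging table the target system polls) can be sketched as follows. This is a simplified illustration using SQLite; the table name, column names, and helper functions are all assumptions, not anything from the webMethods product. In a real setup the `enqueue` step would live in the trigger-fired IS service (e.g. via a JDBC adapter service), and the target system would run the poll query on its own schedule.

```python
import sqlite3

# Staging-table pattern: the trigger-fired service inserts rows,
# the target system polls for NEW rows and marks them DONE.
# All names here (event_queue, enqueue, poll_batch) are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE event_queue (
    id      INTEGER PRIMARY KEY AUTOINCREMENT,
    payload TEXT NOT NULL,
    status  TEXT NOT NULL DEFAULT 'NEW')""")

def enqueue(payload):
    # Called by the trigger-fired service when a document is published.
    conn.execute("INSERT INTO event_queue (payload) VALUES (?)", (payload,))
    conn.commit()

def poll_batch(limit=100):
    # Called by the target system as and when it wants events.
    rows = conn.execute(
        "SELECT id, payload FROM event_queue WHERE status = 'NEW' "
        "ORDER BY id LIMIT ?", (limit,)).fetchall()
    if rows:
        conn.executemany(
            "UPDATE event_queue SET status = 'DONE' WHERE id = ?",
            [(r[0],) for r in rows])
        conn.commit()
    return [r[1] for r in rows]

enqueue("orderCreated:1")
enqueue("orderCreated:2")
batch = poll_batch()
```

Because the rows persist until the poll marks them DONE, a backlog of 1,000 to 10,000 events (or a target-system outage) is handled naturally by the database rather than by the Broker.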