Ordering in webMethods

Architecture question:
Does webMethods have any mechanism to control the order of document processing in a more refined way? Say you want orders from the same customer to be processed in the sequence they were produced, but you don’t care about the relative order for two different customers.

Of course you could build the integration to process all documents serially, which would assure the right order not only for each customer but throughout the entire system. Serial processing could, however, be unacceptable for performance reasons. The ideal would be to process documents in parallel, with only documents relating to the same entity (customer) processed serially.

Are there any built-in mechanisms in wM6 that can do it?

In the old Enterprise Server ATC there used to be something called Level which could be used to do it. But what about wM6?

TIA for your thoughts.

I haven’t thought this through completely, but it’s possible that using TN rules can do what you want. Set up a rule for each trading partner and configure the rule to run synchronously. That will serialize each partner’s docs while still allowing docs from multiple partners to be processed at the same time (roughly). You’ll have to make sure that the entire processing path completes within that synchronous thread, however. For example, if you use TN delivery services after you’ve processed/transformed a doc, then you’ll hit the same race condition problem, since delivery services are threaded.

If you’re not using TN already, getting just this benefit probably isn’t worth the effort.

My .02

Rob

Thanks Rob for your comments.

In fact I was thinking about an architecture without TN. The architecture I meant was an EAI architecture involving IS, IS adapters and the Broker.
Let’s say you have an application (e.g. a CRM system) and a webMethods adapter that publishes notifications when some details of a customer change. This notification goes to a backend system (say a Billing system), which processes the message to synchronize the customer info in its database.

Would it be possible in webMethods to process the notifications for one customer serially while processing notifications for different customers in parallel?

Thanks.

The Broker queues can be configured to control ordering, but the Broker can only control the ordering of messages when they come from the same publisher. So as long as your notification processing in IS is serialized/ordered for a given customer, the handling by the Broker will maintain the order.

You may need to control the backend too, to make sure a race condition isn’t introduced there.

Yes, I know that the Broker can order by publisher. But if I send out the notifications from the source application using a webMethods adapter, they will all come from the same publisher. In other words, I would have to process all requests serially, which might not be acceptable for performance reasons. If I set ordering to NONE on the Broker, then the order is not maintained.

What I was hoping webMethods could help with is processing the notifications serially for a given entity (say a Customer in the CRM system) while processing in parallel for different entities (different customers). And this would still have to be possible with a single webMethods publisher, e.g. an adapter.

Thanks for your thoughts.

Fenek,

Can you set up multiple brokers, all subscribing to the same document but each with a customer-specific filter?

Source -> Broker -> Trigger_1 -> mainProcessingService
Source -> Broker -> Trigger_2 -> mainProcessingService
Source -> Broker -> Trigger_3 -> mainProcessingService

Each trigger would process each customer’s documents serially, but the main processing service would be running in parallel as long as your customer load is evenly distributed.

Jesse’s comment raises a point I hadn’t considered: which direction things are flowing may make a difference.

IS can connect to only one broker. So stuck there.

Can multiple brokers push things to a single IS? I haven’t had much hands-on with 6 so I don’t know.

Since IS can connect to just one Broker (a serious flaw IMO), and IS uses a connection pool to manage ALL communication to the Broker, it would seem that IS is always viewed as one publisher, and thus serialization is the only solution if document order is important at all. Hosting multiple adapters on a single IS is hampered yet again.
:-(

I am not sure whether the thread has drifted toward overly complex solutions. The way I see it, the problem is to introduce parallelism on the consumption side. So if I created multiple triggers with conditions on customer name ranges, e.g. A-J, K-R, S-Z (or more depending on the need for throughput), wouldn’t that solve the problem by creating separate client queues while maintaining order within each queue?
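
Just to illustrate the bucketing outside of any webMethods API, here is a rough plain-Java sketch (this is not actual trigger filter syntax, and the ranges are only the example ones above):

// Route each document to one of a few client queues based on the first
// letter of the customer name. All documents for the same customer land
// in the same bucket, so order is kept per customer, while different
// buckets can be consumed in parallel.
public class RangeBucket {
    // 0 for names A-J, 1 for K-R, 2 for everything else (S-Z in the example).
    static int bucketFor(String customerName) {
        char c = Character.toUpperCase(customerName.charAt(0));
        if (c >= 'A' && c <= 'J') return 0;
        if (c >= 'K' && c <= 'R') return 1;
        return 2;
    }

    public static void main(String[] args) {
        System.out.println(bucketFor("Anderson")); // 0
        System.out.println(bucketFor("Kowalski")); // 1
        System.out.println(bucketFor("Smith"));    // 2
    }
}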

Regards
Ashok

Another point to be considered…
Taking the same CRM & Billing system example:
Would the Billing system have the ability to consume data in parallel for different customers?
If yes, then any of the approaches above will help.
If no, then that capability needs to be added in the receiving application if possible. Otherwise performance still gets hampered at the receiving end.

Ashok, going from Broker to IS, yes, order would be maintained via multiple Broker queues (you would need to ensure order is maintained appropriately during IS processing). Going from one or more IS adapters to the Broker, one must process the docs serially, as this is the only mechanism IS has for guaranteeing order.

Of course I could be mistaken so someone please jump in if I am.

Thanks for all your inputs to the discussion.
I see that many of you probably come from the B2B world. I don’t think the solutions posted so far will be suitable for EAI integration, say the CRM & Billing scenario. Creating multiple brokers and filters might work for a small number of customers, but imagine I am a telco operator with 5 million customers… What then??
Let me present the problem in more detail. Let’s say we have a CRM system and an IS adapter for it. In the CRM system we have 5 million customers. Then we have several backend systems: an SCE system (say SAP), a billing system and a mobile network activation platform.
The problematic scenario I see could happen when a customer places an order through the CRM system. The order completion process may take a couple of days. This would be a long-lived process, which would go to the SCE system to, say, order a phone. After the order is executed in SCE, the process continues, creates an account in the billing system and finally activates the customer on the mobile network.
Now what if the same customer wants to place another order while the previous one is still being executed? Of course you could limit your CRM system functionality and disallow the customer from placing another order, but that is too much of a limitation. The desired behavior would be to allow the customer to place another order and have the middleware layer assure the correct execution of the orders. Say the second order is to add some extra mobile services (call barring, etc.). To execute that order, one needs to send it only to the mobile activation platform.

This scenario shows that queuing in the target application is not a suitable solution. The network activation platform will not know about the first order if it is still stuck in the SCE system. It is the middleware layer that needs to maintain the order.

To summarize the problem: the source application is a single IS publisher. It has a huge number of customers. A customer may place an order without waiting for completion of the previous order. Each order triggers a process whose execution may involve several different target applications, say applications A, B and C, or only application C.

IMHO, webMethods does not support this kind of solution. To make it work, one would need to build some kind of locking mechanism in the middleware layer. I was thinking of trying to use the Xref and latching wM tables in this specific, customized way.
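
Conceptually I mean something like the following (a plain-Java sketch, all names made up; within one IS it would serialize per customer, but a real solution would have to keep the lock state in a shared table, e.g. the Xref/latch tables, so it holds across servers):

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.locks.ReentrantLock;

// Sketch only: serialize processing per customer key inside one JVM.
public class CustomerLocks {
    private final ConcurrentHashMap<String, ReentrantLock> locks = new ConcurrentHashMap<>();

    public void processSerially(String customerId, Runnable work) {
        ReentrantLock lock = locks.computeIfAbsent(customerId, k -> new ReentrantLock());
        lock.lock();        // blocks only if the SAME customer is already being processed
        try {
            work.run();     // documents for other customers proceed in parallel
        } finally {
            lock.unlock();
        }
    }
}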

What are your thoughts?

Thanks again for your opinions.

Looks like what you are proposing are long-running processes. You are right, this is not a normal pub/sub situation. I would recommend looking at the Process Modeler. It is capable of running these long-lived processes with multiple paths/filters/branches. The processes can run in parallel or single-threaded depending on your model, so the incoming message can take a different path depending on your criteria. The state of the process is maintained in the database, although it does use messaging to communicate state changes.

We haven’t used it a lot here, but we have talked to some of wM’s bigger customers who have. They have used it to do pretty much exactly what you are talking about.

Mark, thanks for your comments.
I was in fact planning to use wM process models and the Modeler. But I still believe that the Modeler does not support my requirements with regard to this ordering.
You pointed out that I was looking for a solution for long-running processes. In fact I pictured the scenario with long-running processes because it makes the problem easier to describe. But the issue still exists in short-lived processes, say “near real-time” synchronization; it is just much less likely for these race conditions to happen.
Thanks again.

Ashok, this is exactly the way we implemented it in our integration solutions. Define a key field for your document (e.g. Customer_Nr). For a numerical key field you can set up ten triggers that filter by the last digit of the number using regular expressions in the trigger filter (triggers for digits 0-9).
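
To make the last-digit idea concrete (plain Java regex here as an illustration, not the Broker filter syntax itself, and 4711 is just a made-up Customer_Nr):

import java.util.regex.Pattern;

// One pattern per final digit of the numeric key; each one would back a
// separate trigger's filter, so every document for a given customer matches
// the same trigger and per-customer order is preserved.
public class LastDigitFilter {
    public static void main(String[] args) {
        String customerNr = "4711";
        for (int digit = 0; digit <= 9; digit++) {
            if (Pattern.matches(".*" + digit + "$", customerNr)) {
                System.out.println("Customer " + customerNr + " -> trigger/queue " + digit);
            }
        }
    }
}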

I just wonder why parallel processing of documents like this is not a built-in feature of webMethods. You should be able to define a key field for a document that you trigger on, and webMethods would make sure that documents with the same key are handled by the same thread.

Sequential processing is very slow for triggers (a maximum of 400-500 documents per minute) even if the service that receives the document does nothing. Therefore we often rely on parallel processing.

Fenek,

The problem definition has changed a bit in the last few posts.

The need for related messages to be ordered in a long-running transaction cannot be handled by middleware. It is the business process that will govern how the SYSTEMS should behave: for example, when the related second order comes to the phone application, should it check for an existing order, and what should the application do if the original order has not yet reached the target application? On the other hand, what should happen if the second order is an unrelated one? All of these are specific to business needs, and no middleware can ever automate such intelligence.

That is my opinion.
Regards
Ashok

Ashok,

I did not change the definition of the problem. I just described a scenario of a long running process because it is easier to picture the problem. But the problem still persists with short-lived processes.

I often hear people say that it should be the middleware’s responsibility to assure the correct order of execution of related messages, where related messages are those pertaining to the same instance of an entity, say the same customer.

I agree with Ulf that it would be nice to have a mechanism in the middleware to define business objects and their keys, and have the middleware assure that two related messages are processed serially while unrelated messages can go in parallel.

Thanks for your comments.

Ulf wrote:

“I just wonder why parallel processing of documents like this is not a built-in feature of webMethods. You should be able to define a key field for a document that you trigger on, and webMethods would make sure that documents with the same key are handled by the same thread.”

There is a way to do it with TN. TN can easily discern different attributes from docs and process them as needed.

Fenek wrote:
“I see that many of you probably come from the B2B world.”

Not sure what you mean by that. This problem isn’t any different whether you consider EAI or B2B and the supposed differences between the two (which I contend are no longer meaningful). If IS is to be the adapter run-time environment, then you’re constrained to its capabilities. TN can facilitate what you need to do, based on your description. However, it may not scale well to millions of customers either (lots of TN profiles, lots of processing rules, blech) without some custom handling at some point. The old-style ES adapters really didn’t provide a solution for this at all, other than multiple adapters. I don’t think the ATC and its level controls would have addressed this particular scenario.

If you can’t use TN for some reason, you might think about writing a “TN Lite” piece that performs as Ulf described: dispatching docs to company-specific threads (which you will have to spawn and manage) to maintain doc order while in the IS environment, which in turn will publish to the Broker, which will keep order intact as well.
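
Untested, but here is a bare-bones sketch of that dispatcher idea in plain Java (the pool size, key handling and names are arbitrary; a real IS implementation would have to play nicely with the server’s own thread management):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// "TN Lite" sketch: hash the company/customer key onto a fixed set of
// single-threaded workers. Everything for one key goes to the same worker,
// so per-key order is preserved while different keys run in parallel.
public class KeyedDispatcher {
    private final ExecutorService[] workers;

    public KeyedDispatcher(int parallelism) {
        workers = new ExecutorService[parallelism];
        for (int i = 0; i < parallelism; i++) {
            workers[i] = Executors.newSingleThreadExecutor();
        }
    }

    public void dispatch(String customerKey, Runnable publishToBroker) {
        int slot = Math.floorMod(customerKey.hashCode(), workers.length);
        workers[slot].submit(publishToBroker);  // same key -> same worker -> serial
    }

    public void shutdown() {
        for (ExecutorService w : workers) w.shutdown();
    }
}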

Hope this at least triggers some additional thoughts toward a solution.

Hi Rob,

Thanks for the thoughts. When I mentioned the “B2B world” I meant that the solutions posted here (like creating trigger filters for customers) would be good in B2B scenarios where the number of customers (trading partners) is relatively low compared to millions of users of mobile phones.

Anyway, I see that without TN the solution requires some custom development, like the “TN Lite” you mentioned or the tricky use of Xref tables to perform the locking that I thought of.

Thanks for all your inputs.

Can someone explain to me how I can configure the order of processing of documents?
I’ve serialized all my triggers, but I don’t know the processing order of my documents, and it’s important for me.

Thanks for your answer.

Document order depends on many things. Broker can keep documents in order when published from a single publisher. When documents are published to the Broker from multiple publishers, order is more or less undefined.

If Broker is not involved in your solution, we’d need to know what components are being used to provide more useful guidance. The general rule: if order is important the solution cannot have parallel processing at any stage along the integration path. That means avoiding the use of multiple threads within IS. If you’re using Trading Networks, you’ll need to avoid the use of async processing rules and the TN delivery services.