Sequencing of Interfaces.

Hello All,

I have a requirement where I need to control the order of wM processing. We have 3 flow services (say A, B, C) which receive different data from a trading partner (TP) via AS2.

Once a document is received, the processing rule publishes a Broker document; the target IS subscribes to it and posts it to SAP.

Now, the requirement is that when a TP sends both A and B data, wM should process A first and, only after a successful posting to SAP, start processing B.

How can I achieve this?

We’re using the pub-sub model.

Thanks,
David.

How does the TP send the data? Do you receive the different data files in the same AS2 transmission or in parallel transmissions?

Is the trigger set to serial or concurrent mode? Also, is the IS clustered?

HTH,
RMG

Hi RMG,

Thank you for the prompt response.

We get the data via AS2/HTTPS. It’s different XML data for each of the 3 interfaces, but they all use the same connection. All 3 interfaces use the pub-sub approach; they do not use the same Broker document type or trigger.

Thanks,
David.

Curious to know: what happens if the SAP processing fails? Do you still want data B processed in sequence or not? And what if B comes first and then A?
Also, what kind of business data is flowing? I’m trying to understand whether there is a logical reason why the data must be processed in sequence.

By the way, though I am not sure, you might need to look at the “queuing” concept if there is a solution there.
I will continue to look for possible solutions!

Hi PRP,

Thanks for your response. Yes, it’s a business rule: they want the data in sequence. The data is related to financial postings, so they want it processed in order.

To answer your question: if data B comes alone, SAP should reject it since A hasn’t arrived. But from the wM point of view, we want to make sure that if both A and B come, A goes first, and only after a successful posting should wM process the second one.

The Integration flow is this way:

Routing Instance --> Source Instance --> Broker --> Target Instance --> SAP

All 3 interfaces use different Broker document types and triggers. They are independent of each other at present.

Any suggestions are much appreciated. This is very important for me.

I have suggested that the TP send the data at intervals of 1 hour each, but this still doesn’t solve the problem, even when A comes first and B arrives an hour later.

If A fails but B succeeds, then it is not in sequence for SAP. They want A to be successful; only then should it process B.

Hi RMG,

The processing mode is serial for the triggers, and the Broker document type is set to guaranteed delivery.

One approach may be to create a single trigger with a join condition. IS will hold the received docs until all three have arrived, then kick off your trigger service.

I haven’t done this so can’t be more prescriptive.

Another approach:

  1. As each doc type arrives, write it to a DB/file/whatever.
  2. Periodically, have a service wake up to process the collected docs. It would group the docs as needed and send them to SAP in order (see the sketch after this list).
  3. When mismatches occur (you have an A but no corresponding B, or vice versa), either wait for the missing doc to show up, or keep track of time and notify someone when an “orphan” has existed for longer than desired.
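
To make step 2 concrete, here is a minimal sketch of such a sweeper in plain Java. Everything in it is an assumption for illustration: the PENDING_DOCS table, its columns, and the postToSAP() helper are hypothetical stand-ins for whatever persistence and SAP-posting mechanism you actually use.

```java
import java.sql.*;
import java.util.*;

/**
 * Sketch of the "collect then sweep" approach (step 2 above).
 * Assumptions: a PENDING_DOCS table with columns CORRELATION_ID,
 * DOC_TYPE ('A'|'B'|'C'), PAYLOAD, STATUS; postToSAP() wraps the
 * real SAP posting logic.
 */
public class DocSweeper {

    // Desired processing order within each correlation group.
    private static final List<String> ORDER = Arrays.asList("A", "B", "C");

    public static void sweep(Connection con) throws SQLException {
        // Load all unprocessed docs, grouped by correlation id.
        Map<String, Map<String, String>> groups = new LinkedHashMap<>();
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT CORRELATION_ID, DOC_TYPE, PAYLOAD FROM PENDING_DOCS " +
                "WHERE STATUS = 'N' ORDER BY CORRELATION_ID");
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                groups.computeIfAbsent(rs.getString(1), k -> new HashMap<>())
                      .put(rs.getString(2), rs.getString(3));
            }
        }

        for (Map.Entry<String, Map<String, String>> g : groups.entrySet()) {
            // Post strictly in A, B, C order; stop the chain on the first
            // missing doc or failure so B never overtakes A.
            for (String type : ORDER) {
                String payload = g.getValue().get(type);
                if (payload == null) break;           // orphan: wait for next sweep
                if (!postToSAP(type, payload)) break; // failed: retry next sweep
                markDone(con, g.getKey(), type);
            }
        }
    }

    // Placeholder for the real SAP posting (e.g. the existing target IS
    // flow); returns true only on a successful posting.
    private static boolean postToSAP(String docType, String payload) {
        return true;
    }

    private static void markDone(Connection con, String corrId, String type)
            throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "UPDATE PENDING_DOCS SET STATUS = 'Y' " +
                "WHERE CORRELATION_ID = ? AND DOC_TYPE = ?")) {
            ps.setString(1, corrId);
            ps.setString(2, type);
            ps.executeUpdate();
        }
    }
}
```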

One other possibility: ask the TP if it can send all the needed data in 1 document. Reassembling the relationship between different documents will be a consistent source of production support issues.

Hi Reamon,

Thank you for your reply.

I am considering using the JOIN conditions in triggers. Let me know your opinion.

In my case, we have 3 different subscribers for the 3 document types that are going to arrive. So I am thinking of publishing a Broker document back (some status document) if process A is successful.

Then I’ll include this status document in the JOIN condition for the process B trigger, so that process B kicks off only when both documents are available. If process B is successful, it will publish the same status document back.

And I’ll include this document along with the other Broker document in the process C trigger JOIN, so process C won’t kick off until both documents arrive.
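
For the publish step, a flow service would simply call the built-in pub.publish:publish after the SAP post succeeds. As a rough illustration, here is the equivalent as a Java service; the document type name MyApp.docs:ProcessStatus, its fields, and the "interface" pipeline input are hypothetical, not anything from this thread.

```java
// Hedged sketch: in Designer this would be the body of a Java service
// invoked after a successful SAP post.
import com.wm.data.*;
import com.wm.app.b2b.server.Service;
import com.wm.app.b2b.server.ServiceException;
import com.wm.lang.ns.NSName;

public class StatusPublisher {

    public static final void publishStatus(IData pipeline) throws ServiceException {
        // Assumed pipeline input: which interface just finished (e.g. "A").
        IDataCursor pc = pipeline.getCursor();
        String iface = IDataUtil.getString(pc, "interface");
        pc.destroy();

        // Build the status document body (hypothetical fields).
        IData status = IDataFactory.create();
        IDataCursor sc = status.getCursor();
        IDataUtil.put(sc, "interface", iface);
        IDataUtil.put(sc, "result", "SUCCESS");
        sc.destroy();

        // Input for the built-in WmPublic service pub.publish:publish.
        IData input = IDataFactory.create();
        IDataCursor ic = input.getCursor();
        IDataUtil.put(ic, "documentTypeName", "MyApp.docs:ProcessStatus");
        IDataUtil.put(ic, "document", status);
        ic.destroy();

        try {
            // The process B trigger can then include MyApp.docs:ProcessStatus
            // in its AND join alongside B's own document type.
            Service.doInvoke(NSName.create("pub.publish:publish"), input);
        } catch (Exception e) {
            throw new ServiceException(e);
        }
    }
}
```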

My only question: after the JOIN time expires, the Broker document loses the data, correct? Also, what do you think of this approach?

Thanks.

I’m not sure what happens when the join time expires. You’ll want to review the documentation and/or try it out.

The relative complexity would have me concerned. I would try to make changes so that all the data is in one document rather than trying to manage the timing of the processing. It may seem like this should be a simple thing to enforce, but given the uncertainty of the arrival of 3 different documents, I’d really try to get the TP to send a single doc with all the needed data.

Hi,

Here is my suggestion:

As I understand the post, flow service B should send its data to the SAP system only after flow service A has successfully sent its data; otherwise, the SAP system won’t accept the B data.

In that case, is there a unique number that identifies the incoming documents? If yes, when flow service A receives the data, insert the unique number into a database table with the process status ‘N’; once the data is successfully sent to SAP, update the status to ‘Y’. When flow service B receives data with the same unique number, check the database table to see whether flow service A processed successfully, and only then send the B data to SAP.

If flow service B receives data and no A data has arrived at all, it can either throw an error back to the sending system saying the prerequisite data has not been received, or wait until the A data is received and processed.
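
As a rough illustration of this status-table check, here is a minimal Java/JDBC sketch. The DOC_STATUS table and its columns are assumptions for the example, not something from this thread; the actual SAP posting and error handling would live in your existing services.

```java
import java.sql.*;

/**
 * Sketch of the status-table gate described above. Assumes a hypothetical
 * DOC_STATUS table (DOC_NUMBER, INTERFACE, STATUS) where A's subscriber
 * writes 'N' on arrival and updates to 'Y' after a successful SAP post.
 */
public class SequenceGate {

    /** Returns true if interface A has already posted this document number. */
    public static boolean isAProcessed(Connection con, String docNumber)
            throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT STATUS FROM DOC_STATUS " +
                "WHERE DOC_NUMBER = ? AND INTERFACE = 'A'")) {
            ps.setString(1, docNumber);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() && "Y".equals(rs.getString(1));
            }
        }
    }

    /** Called from B's subscriber before posting to SAP. */
    public static void processB(Connection con, String docNumber, String payload)
            throws SQLException {
        if (isAProcessed(con, docNumber)) {
            // Safe to post B's payload to SAP here.
        } else {
            // A is missing or not yet successful: either throw an error back
            // to the sending system, or park the document and retry later.
        }
    }
}
```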

Hope this is helpful for you.