Combining the results of multiple transactions within one envelope

Dear All,

I have a case now that looks a bit hard for me to handle in webMethods, and any idea/suggestion/solution would be appreciated.

We are doing EDI to IDoc/flat-file conversions in our company, and now I have a case where our customer sends multiple UNH-UNT messages within one EDI envelope (UNB-UNZ pair), but I need to produce only one output file (a self-defined, in-house flat file).
With WM's default recognize and submit services, the envelope is split into as many transactions as there are UNH-UNT pairs, and during submission our model/mapping of course runs on every transaction separately, producing as many output files as there are input transactions within the envelope.
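Schematically (segment tags only, with placeholder values - this is just an illustration, not real data) the inbound interchange looks like this:

```
UNB+UNOA:2+SENDER+RECEIVER+240101:1200+ICREF001'    interchange header
UNH+1+INVRPT:D:96A:UN'                              message 1
   ...data segments...
UNT+<segment count>+1'                              message 1 trailer
UNH+2+INVRPT:D:96A:UN'                              message 2
   ...data segments...
UNT+<segment count>+2'                              message 2 trailer
UNZ+2+ICREF001'                                     interchange trailer (2 messages)
```

The default recognition turns each UNH-UNT pair into its own transaction document, hence one output file per message.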

For the time being I see only two possibilities to solve this: one, modify the original EDIString at the very first stage and combine the messages together (I don't like this kind of solution); the other, produce the output file and, if the next transaction has the same sender/receiver/IC number, append its output to the same file (this one is dangerous and also not nice) :frowning:

However much I dig through the WM documentation, I cannot yet find an official solution, if one exists.

Could you please share some remarks/ideas if you have also run into this case or have experience with it?

Thanks in advance!
Bye,
Balazs

You should be able to define a partner-specific TPA for this customer with the splitOption value set to interchange. This will keep TN from splitting up your envelope into individual documents. You can then have your processing rule be associated with the envelope document and call your customized mapping service to create the flat file. HTH,

Tim

Also, what is your processing rule's document type? Set it to the envelope document type (X12 Envelope, or the EDIFACT equivalent in your case) and the triggered service's pipeline will have the editn_env variable. Use this for further parsing, loop on the UNB/UNH (multiples), then map to FF and append to your flat file, which gives you the single output file (with multiple records) that you are expecting - see the sketch below…

HTH,
RMG
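For illustration, here is a minimal standalone Java sketch of the loop-and-append idea. The class and method names are made up, the segment splitting is deliberately naive (no UNA/release-character handling), and mapToFlatFile() stands in for the real convert/mapping services - this is a sketch of the logic, not an IS implementation:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

/** Loops over all UNH-UNT messages of one interchange and appends every
 *  mapped record to a single output file. */
public class EnvelopeToSingleFile {

    public static void main(String[] args) throws IOException {
        String edidata = new String(Files.readAllBytes(Paths.get(args[0])),
                                    StandardCharsets.ISO_8859_1);
        Path outFile = Paths.get(args[1]);

        List<String> outputRecords = new ArrayList<>();
        List<String> currentMessage = null;

        // EDIFACT segments are terminated by an apostrophe (UNA and release
        // characters are ignored here for brevity).
        for (String raw : edidata.split("'")) {
            String seg = raw.trim();
            if (seg.startsWith("UNH")) {
                currentMessage = new ArrayList<>();   // a new message starts
            }
            if (currentMessage != null) {
                currentMessage.add(seg);
            }
            if (seg.startsWith("UNT") && currentMessage != null) {
                outputRecords.add(mapToFlatFile(currentMessage)); // placeholder mapping
                currentMessage = null;
            }
        }

        // Every message of the interchange ends up in the same output file.
        Files.write(outFile, outputRecords, StandardCharsets.ISO_8859_1);
    }

    /** Placeholder for the real message-to-in-house mapping. */
    private static String mapToFlatFile(List<String> messageSegments) {
        // e.g. build one comma-delimited record per message from a few fields;
        // the real logic lives in the mapping service.
        return String.join(",", messageSegments.get(0).split("\\+"));
    }
}
```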

Another approach is to do the same sort of batching that is described for EDI. In this case EDI for TN would still split the transaction sets and your mapping would produce each output IDoc. Use rules to queue the IDoc documents and then have a periodic process to combine whatever has been queued into a single IDoc (this may be non-trivial) and send it.

Of course this isn’t a good option if the transaction sets within the group must be kept together–but I’d be surprised if that’s the case.

Another thing to keep in mind is large document handling. Depending on the type of transaction sets you’re processing and the nature of the business, they can be quite large. When processing a complete EDI group you may need to take care to avoid loading the entire thing into memory.
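Purely to illustrate the point, a tiny sketch that streams the interchange segment by segment instead of reading the whole thing at once (IS/TN has its own large-document handling that would normally be used; this only shows the general idea):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Paths;
import java.util.Scanner;

/** Reads an EDIFACT file one segment at a time, so only the current
 *  UNH-UNT message ever needs to be held in memory. */
public class SegmentStream {

    public static void main(String[] args) throws IOException {
        try (Scanner segments = new Scanner(Paths.get(args[0]),
                                            StandardCharsets.ISO_8859_1.name())) {
            segments.useDelimiter("'");               // EDIFACT segment terminator
            while (segments.hasNext()) {
                String seg = segments.next().trim();
                if (seg.startsWith("UNH")) {
                    // start collecting a new message; between UNH and UNT only
                    // the current message has to be kept, never the whole file
                }
                // ... process / map the segment ...
            }
        }
    }
}
```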

If it is indeed necessary to keep the transaction sets together, you may prefer to leave the split option at the transaction level, but configure your processing rules for the affected sender/receiver so that the envelope document is processed (converting the transactions within it to the flat file) and the individual transaction documents are ignored. This would still allow you to have easy visibility of each transaction in TN.

Hi All,

Thanks for your responses. I am going to think through the possibilities you advised.
However, it is important to keep the transaction sets together in the output files as well, so I think queuing and batching cannot be an option here. But maybe the splitOption together with a specific processing rule can be a solution.

Thanks again! Will do a post if I have a working solution!

Bye,
Balazs

Is it “important” or is it “necessary?” It is rarely the case that documents really must be kept together–batching a group of orders introduces an artificial grouping but doesn’t make the orders related to each other. I bring this up just to make sure you end up with a solution that balances process need with technical need.

Hi All,

Sorry for the late response… working like hell (like all of us, I think)…

Rob (thanks for your remarks!), it was necessary to keep the mentioned data together, because in my case these are INVRPT/SLSRPT EDIFACT message types; the customer just does not send them as one UNH-UNT, but as several within one envelope.
Our WM implementation is configured to split and process incoming envelopes per transaction (in the case of EDIFACT this means one UNH-UNT pair). For a big INVRPT or SLSRPT file it is not acceptable to map it to in-house (comma-delimited) text files per transaction and deliver it this way, because the user can get hundreds of files (and hundreds of warning mails).
Changing the default inbound TPA of this Trading Partner so that it no longer splits per transaction is not an option either, because it would affect the partner's other message types (and all of our mappings are based on transaction processing instead of envelope processing).

So, the solution I chose and implemented is database-based; in a few words: process-store-group-deliver.

First - while the data is still an envelope - I save the header data into a table (e.g. senderID, ICnr, NrOfTransactions, Status). After that, during the processing of each transaction I save the output into another (detail) table, keyed by senderID+ICnr and also carrying the transactionNr (sequence number within one interchange).
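To make the store step concrete, here is a minimal JDBC sketch. The table and column names (EDI_ENV_HEADER, EDI_ENV_DETAIL, etc.) are made up for illustration, and in IS this would normally sit behind JDBC adapter services rather than raw JDBC:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

/** Sketch of the "store" step of process-store-group-deliver. */
public class EnvelopeStore {

    /** Called once per interchange, while the data is still an envelope. */
    public static void saveHeader(Connection con, String senderId, String icNr,
                                  int nrOfTransactions) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO EDI_ENV_HEADER (SENDER_ID, IC_NR, NR_OF_TRANSACTIONS, STATUS) "
                + "VALUES (?, ?, ?, 'RECEIVED')")) {
            ps.setString(1, senderId);
            ps.setString(2, icNr);
            ps.setInt(3, nrOfTransactions);
            ps.executeUpdate();
        }
    }

    /** Called once per mapped transaction, keyed by senderId + icNr + sequence nr. */
    public static void saveDetail(Connection con, String senderId, String icNr,
                                  int transactionNr, String mappedOutput) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO EDI_ENV_DETAIL (SENDER_ID, IC_NR, TRANSACTION_NR, OUTPUT_DATA) "
                + "VALUES (?, ?, ?, ?)")) {
            ps.setString(1, senderId);
            ps.setString(2, icNr);
            ps.setInt(3, transactionNr);
            ps.setString(4, mappedOutput);
            ps.executeUpdate();
        }
    }
}
```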

Another service is scheduled to run every X hours; it first builds a list from the header table, then loops over the list, takes out the detail data for the given senderID+ICnr, and does the grouping and the delivery to the destination server.
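Roughly, the scheduled group-and-deliver service does the following (same illustrative table names as above; deliver() stands in for the actual transport to the destination server):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

/** Sketch of the scheduled "group and deliver" step. */
public class GroupAndDeliver {

    public static void run(Connection con) throws SQLException {
        try (PreparedStatement headers = con.prepareStatement(
                "SELECT SENDER_ID, IC_NR FROM EDI_ENV_HEADER WHERE STATUS = 'RECEIVED'");
             ResultSet rs = headers.executeQuery()) {

            while (rs.next()) {
                String senderId = rs.getString("SENDER_ID");
                String icNr = rs.getString("IC_NR");
                // (a completeness check against NR_OF_TRANSACTIONS could be added here)

                // Collect the mapped output of every transaction of this
                // interchange, in the original sequence.
                List<String> records = new ArrayList<>();
                try (PreparedStatement details = con.prepareStatement(
                        "SELECT OUTPUT_DATA FROM EDI_ENV_DETAIL "
                        + "WHERE SENDER_ID = ? AND IC_NR = ? ORDER BY TRANSACTION_NR")) {
                    details.setString(1, senderId);
                    details.setString(2, icNr);
                    try (ResultSet drs = details.executeQuery()) {
                        while (drs.next()) {
                            records.add(drs.getString("OUTPUT_DATA"));
                        }
                    }
                }

                deliver(senderId, icNr, records);   // one combined file per interchange

                try (PreparedStatement upd = con.prepareStatement(
                        "UPDATE EDI_ENV_HEADER SET STATUS = 'DELIVERED' "
                        + "WHERE SENDER_ID = ? AND IC_NR = ?")) {
                    upd.setString(1, senderId);
                    upd.setString(2, icNr);
                    upd.executeUpdate();
                }
            }
        }
    }

    private static void deliver(String senderId, String icNr, List<String> records) {
        // Placeholder: write the combined flat file and push it to the destination server.
    }
}
```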

A third service is scheduled once a day (or at whatever interval) to clean up both tables where the status is 'DELIVERED'. This way our database tables do not grow quickly; or let's say they are almost empty, with only the current day's data in them.
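And the cleanup itself is basically two deletes (again with the illustrative table names; SQL dialect details may vary):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

/** Sketch of the daily cleanup of already-delivered interchanges. */
public class CleanupDelivered {

    public static void run(Connection con) throws SQLException {
        // Remove detail rows whose header is DELIVERED, then the headers themselves.
        try (PreparedStatement details = con.prepareStatement(
                "DELETE FROM EDI_ENV_DETAIL WHERE EXISTS ("
                + "SELECT 1 FROM EDI_ENV_HEADER "
                + "WHERE EDI_ENV_HEADER.SENDER_ID = EDI_ENV_DETAIL.SENDER_ID "
                + "AND EDI_ENV_HEADER.IC_NR = EDI_ENV_DETAIL.IC_NR "
                + "AND EDI_ENV_HEADER.STATUS = 'DELIVERED')");
             PreparedStatement headers = con.prepareStatement(
                "DELETE FROM EDI_ENV_HEADER WHERE STATUS = 'DELIVERED'")) {
            details.executeUpdate();
            headers.executeUpdate();
        }
    }
}
```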

Any remarks are welcome, and I hope this will also be helpful to somebody who meets the same requirements and runs into the same trouble.

CU all, and thanks again for all of your posts!
Balazs