How does the webMethods EDI module compare to, let's say, the Sterling or Inovis translators? webMethods appears to require much more coding and doesn't have as many built-in features as the other two. Any thoughts?
What’s the difference between “true EDI translation” and “EDI translation”?
I would say that the characterization of requiring more coding and effort is fair. Where IS shines is in its mapping and processing flexibility.
It appears to me that a strong knowledge of EDI is not a requirement for webMethods, but rather a strong knowledge of Developer is. Everything is there in webMethods, but it all must be developed. FA acknowledgment must be added to each document; in other software, FA processing is just a check mark.
Take a closer look at WmEDIforTN. It handles de-enveloping, FA generation and more.
Strong knowledge of EDI and Developer/TN are needed to be successful. Are you saying that the Sterling UI doesn’t require knowledge of how to use it?
We currently use Inovis and Gentran. These are EDI tools only, and an EDI skill set works fine.
I worked on webMethods as well as the Gentran/GIS suites. I feel both are good EDI/XML translators.
What I like more in webMethods is the very user-friendly mapping tool (the Developer IDE) and the better pipeline visibility for debugging, compared to GIS (Map Editor/AI).
Also, the wM Trading Networks add-on component gives better management of trading partners, delivery, and a transactional view versus the GIS dashboard, and performance/throughput-wise it is faster than GIS. But I agree both apps are Java-based frameworks.
Both the wM and GIS admin dashboards are very cool.
I am a veteran of Gentran, Mercator, and other X12/EDIFACT tools, now working in WM.
There are multiple aspects to EDI that not every mapper knows or needs to use. There are mapping aspects and data-flow rules that administrators tend not to understand. In my first EDI job, I became something of an expert on which documents contained which segments, how they were constructed, and how they should and should not be used from the point of view of Federal and DoD conventions. I knew nothing at all about data managers, partner setup, or the quirks in different mappers, and there are many.
The first question to ask is: what aspect of EDI am I talking about? Partner setup? Mapping and visibility when troubleshooting (this is where WM is great, but you had better know your rules well)? Or a top-down view of the mapping process as a whole, such as loops and groups (where WM is not the best)?
There are many things that WM does well out of the box, but there are a lot of utilities that you may need to construct. Then ask yourself: what do I really want or need?
WM is not a specific EDI translator; it is a very powerful mapping tool in general. Mapping from X12 to a flat file is an extremely simple task. It almost does itself in a convertToValues call; you just need to order things up in a few mapping steps from there. Every demo you have ever seen most probably involves reading and converting from X12.
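Outside of Developer, the core idea behind that convertToValues step can be sketched in a few lines of Python. This is only a simplified illustration with fixed separators and made-up sample data; the real WmEDI service reads the delimiters from the ISA header and applies the schema:

```python
# Minimal sketch of X12 segment parsing, loosely analogous to what a
# convertToValues call does inside WmEDI. Separators are hard-coded here;
# a real parser derives them from the ISA envelope.
def parse_x12(raw: str, seg_term: str = "~", elem_sep: str = "*") -> list[list[str]]:
    segments = []
    for seg in raw.strip().split(seg_term):
        seg = seg.strip()
        if seg:
            segments.append(seg.split(elem_sep))
    return segments

# Hypothetical 850 fragment for illustration
sample = "ST*850*0001~BEG*00*NE*PO123**20240101~SE*3*0001~"
doc = parse_x12(sample)
# doc[0] is ['ST', '850', '0001']
```

From a structure like `doc`, the remaining work really is just ordering the values into your target record layout.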
The mapping of a flat file to an X12 document is a little more involved. You really need to understand the flat-file schema and your mapping rules before you start. Once you get the flat file into XML, or rather parsed into a document, you want to build each segment one at a time and then assemble your document. You really need to get into the weeds compared to other translators; otherwise, it will let you build non-ANSI-compliant X12 and won't help you figure out what you did wrong.
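That build-one-segment-at-a-time, then-assemble approach can be sketched as follows. The field names and the tiny 850 subset are hypothetical; a real map must follow the partner's implementation guide, and nothing here validates against the standard (which is exactly the pitfall described above):

```python
# Sketch of building an X12 document segment by segment from already-parsed
# flat-file fields. Segment subset and dict keys are made up for illustration.
def build_segment(elements: list[str], elem_sep: str = "*", seg_term: str = "~") -> str:
    return elem_sep.join(elements) + seg_term

def build_850(po: dict) -> str:
    segments = [
        build_segment(["ST", "850", po["control"]]),
        build_segment(["BEG", "00", "NE", po["po_number"], "", po["date"]]),
    ]
    # SE01 counts every segment in the set, including ST and SE themselves
    segments.append(build_segment(["SE", str(len(segments) + 1), po["control"]]))
    return "".join(segments)

doc = build_850({"control": "0001", "po_number": "PO123", "date": "20240101"})
```

Note that nothing stops this sketch from emitting a non-compliant document; that safety net is what the dedicated translators provide and what you must build yourself here.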
WM doesn't give you the same drag-and-drop or mapping-group features that are common to other translators. There are bolt-on third-party translators that can be employed or called outside of WM. You can even call or use Gentran in a WM flow service if you wanted to.
I would say that the dedicated X12/EDIFACT mapping tools are simpler to master, but that's only one part of the puzzle. TN and WmEDI are very good at what they do. TP profiles and agreements allow a great deal of flexibility that others don't. However, they were not designed to do it the way Gentran does it.
I think one of my first complaints was the inability to "manually ack" an outbound interchange/group. There is no interface built in that will do this for you, or that will alert you and mark an FA as late. Is this an oversight? Yes, if you are a Gentran junkie. No, if you are relying on the business system to take an active role. I could go on a rant about the importance of a data lifecycle, but I won't.
The reply I got from every WM support person or contractor was that your partner should always send you an FA. In the real world, we know that doesn't happen. In the real world, where you deal with external partners, you may need to rely on a person to tell you they received an interchange or group. In a SOX-stained reality, that person may not be able to re-send or dummy up an FA to re-send.
In a nutshell, there are things you can do with WM that very few other tool sets will allow. For better or worse, the added flexibility and the ability to receive and recognize non-X12/EDIFACT EDI-related transactions, SOAP, and XML-based data make this a true EDI translator.
Excellent overview Jim.
It took IS/TN a couple of iterations, but it eventually got to the point where things are manageable. The manual-ack point is a great one. FA handling in general has been a relatively weak area. I remember a version where acking to the transaction-set level wasn't supported out of the box: "left as an exercise for the reader, since transaction sets are inherently partner/implementation specific." They were really wrong on that one.
I’m trying to remember if I was one of the folks that said that partners should always send an FA. I’m not sure if I did.
But your point about this reminded me of another rule of thumb: be prepared to accept transaction sets from partners that use codes that are not part of the version/release being used, or that are not even defined for that particular element in any version. This practice seemed more prevalent with EANCOM to me, but perhaps I just hit the jackpot with partners that liked to mix and match! To do this means controlling validation with the TN processing rules.
Regarding the batching we had discussions about, I see that in a later version (6.5, I think?) the enveloping services now allow you to specify "leave my control numbers alone," which was one of the options you wanted to have. Perhaps you were the reason for this update?
Anyway, good post. I’m sure people will find it helpful.
I wasn’t going to call you out specifically, but you were one of them. You did have a point from a purist point of view, but we still had to contend with an existing business need.
We did wind up creating a query of all groups in the EDI tracking table that were less than two weeks old and lacked an "ACCEPTED" code in the FA status. We then loop over that list and subtract each group's date from the current date. Any "NONE" record older than 48 hours, as well as any rejects/errors, makes the final list. Anything older than two weeks just doesn't make the report, as the business should already have resolved it.
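The report logic just described can be sketched like this. The row shape and status codes (`fa_status`, `group_date`, "NONE", "REJECTED") are hypothetical stand-ins for whatever the EDI tracking table actually exposes; the 14-day window and 48-hour cutoff come from the description above:

```python
# Hedged sketch of the sliding-window FA reconciliation report.
# Column names and status codes are assumptions, not the real schema.
from datetime import datetime, timedelta

def overdue_groups(rows, now=None):
    """rows: iterable of dicts with 'group_id', 'fa_status', 'group_date'."""
    now = now or datetime.now()
    window_start = now - timedelta(days=14)   # anything older falls off the report
    late_cutoff = now - timedelta(hours=48)   # a NONE older than 48h counts as late
    report = []
    for r in rows:
        if r["group_date"] < window_start:
            continue  # the business should already have resolved these
        if r["fa_status"] in ("REJECTED", "ERROR"):
            report.append(r["group_id"])
        elif r["fa_status"] == "NONE" and r["group_date"] < late_cutoff:
            report.append(r["group_id"])
    return report

now = datetime.now()
rows = [
    {"group_id": "G1", "fa_status": "ACCEPTED", "group_date": now - timedelta(hours=12)},
    {"group_id": "G2", "fa_status": "NONE",     "group_date": now - timedelta(hours=72)},
    {"group_id": "G3", "fa_status": "NONE",     "group_date": now - timedelta(days=20)},
    {"group_id": "G4", "fa_status": "REJECTED", "group_date": now - timedelta(days=3)},
]
report = overdue_groups(rows, now=now)
# G2 (NONE past 48h) and G4 (rejected) make the list; G3 is past the window
```

In practice the first filter would be pushed into the SQL against the tracking table, with only the date arithmetic done in the loop.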
We also created a simple page that accepts a TP code, group number, and desired status, so we can force the DB record to whatever status we desire. However, there is no way to distinguish a status that was set by receiving a 997 from one that was accepted manually within the TN/WmEDI environment.
My point about a business data lifecycle fits more with what you would agree is best practice, and we are incorporating that idea into an enterprise-level user interface. Our procurement group is theoretically on board, as they would love to see the partner's response to 850/860 data. I just need to sell it to the other groups. That way, any invoice/PO/remit would have a one-to-one relationship with an acceptance or rejection that the business could act upon.
I think that WM has been a little near-sighted in how one could replace an existing EDI solution, but they are right on if you are building one from scratch.
“You did have a point from a purist point of view,”
I do make purist arguments from time to time, eh? Partners should send an FA but the reality/practicality is that not all of them do.
“We did wind up creating a query of all groups in the EDI tracking table…”
Custom FA reconciliation is indeed a standard need when using IS/TN. I’ve had to do the same thing at virtually every client. The “sliding date window” is exactly what I’ve done as well. Anything older than “X days” simply falls off.
The tracking table was a nice addition but it would be cool if they could take things a step further to help us all in terms of FA reconciliation.
Another cool thing might be to somehow tie the receipt of another transaction set to the FA of the corresponding outbound transaction set. For example, when the PO Ack arrives, it is crystal clear that they received the PO. Obviously this would need to be on a case-by-case basis, but it might be a helpful tweak.
Unfortunately, the AK2 loop is optional in a 997. It is completely valid from an X12 point of view to partially accept or reject a group without noting the specific sets that were a problem. That makes it impossible to state clearly that PO number X was accepted if it was batched in the same group as 10 others.
What we are proposing is a metadata repository that accepts little breadcrumbs from each document/transaction that is parsed out. Then, when the 997, 824, 855, or whatever comes back, we can match up the AK2 if available, or make the broad assumption if it isn't. If we get a partial accept, it casts doubt on each transaction included, and metadata would be collected accordingly. This is all handled outside of WM, using the thousands of metadata packets we publish for each interchange as we translate, batch, and deliver.
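The matching rule just described (use AK2 per-set detail when it exists, fall back to a group-level assumption when it doesn't, and flag doubt on a partial accept) could be sketched like this. The breadcrumb structure and status codes are hypothetical, not the actual repository design:

```python
# Sketch of reconciling a 997 against stored metadata breadcrumbs.
# 'breadcrumbs' rows, 'DOUBT'/'PARTIAL' codes, and the dict shapes are
# assumptions made for illustration only.
def reconcile_997(breadcrumbs, ak1_group, ak2_sets, group_status):
    """breadcrumbs: list of dicts with 'group', 'set_control', 'status'.
    ak2_sets: {transaction set control number -> status}, empty if no AK2."""
    for bc in breadcrumbs:
        if bc["group"] != ak1_group:
            continue
        if ak2_sets:
            # AK2 present: we know the fate of each transaction set
            bc["status"] = ak2_sets.get(bc["set_control"], group_status)
        else:
            # No AK2: apply the group-level result to every set, and a
            # partial accept casts doubt on everything in the group
            bc["status"] = "DOUBT" if group_status == "PARTIAL" else group_status
    return breadcrumbs

crumbs = [
    {"group": "42", "set_control": "0001", "status": "NONE"},
    {"group": "42", "set_control": "0002", "status": "NONE"},
]
# Partial accept arrives with no AK2 detail: every set in the group is in doubt
updated = reconcile_997(crumbs, "42", {}, "PARTIAL")
```

With AK2 detail present, the same call would instead stamp each set individually and leave only the unmentioned ones carrying the group-level result.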
In this case, the actual reconciliation is now the domain of the business, and the EDI tracking table is simply our matter of fact. The metadata database table is updated by the AK if we get one, or by the BU if they choose, via a UI that captures the user name and the changes they make. The WM/TN tracking table is untouched, making both SOX-auditable. It is possible and acceptable for the EDI tracking table to maintain a value of "NONE".
This keeps the BU and business logic out of TN, and allows each partner to see their own data and its life cycle. The DB and a central security model would handle user access.
Big talk, as we still need to build it.