Using Trading Networks for internal data exchange

Hi,

If I am to use Trading Networks for data exchange with external partners, should I not also use it for my internal exchanges (between applications that are within my company boundaries)? Meaning no Broker in the exchange architecture, with that role being fulfilled by TN for all flow types. Otherwise it sounds like two fairly different architectures to implement and maintain (I'm trying to minimise costs and complexity here). Would you agree with that?

Thanks

Philippe

Philippe,

Of course you can use TN for internal exchanges (EAI) too, especially if you have a lot of application-to-application traffic, for example DB-to-DB synchronization, daily data feeds to another application, data warehouse feeds, etc. Using TN in the middle gives you more flexibility for transaction management, resubmission of unprocessed flows, and so on.

We have used the same architecture, but in addition we have IS/TN/Broker (pub/sub) for EAI implementations. So the bottom line is that it all depends on how the architects/project owners want to design the EAI flows.

Just some thoughts,

RMG,

I am actually considering using TN for all flows and for all steps of their life cycle:

  1. For exchanges started from a back-end application, an IS adapter gets the raw data and submits it straight to TN, which then acts as the broker, invoking the different flow services that take part in the process (see the sketch after this list)
  2. For exchanges started by external partners, the process would be the same, except that it would start directly in TN
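
For the internal case, here is a minimal Java-service sketch of that single TN entry point, assuming the usual IS built-ins pub.xml:xmlStringToXMLNode and wm.tn.doc.xml:routeXml (verify the exact names and inputs against your release); the class, method and variable names are purely illustrative.

```java
import com.wm.app.b2b.server.Service;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;

/**
 * Illustrative gateway: one TN entry point for internal documents.
 * External partners post to TN directly, so only the internal path
 * needs this shim between the adapter and TN.
 */
public class TnGateway {

    public static void submitToTn(String xmlData) throws Exception {
        // Parse the raw XML string into an IS node (assumed built-in service).
        IData parseIn = IDataFactory.create();
        IDataCursor pc = parseIn.getCursor();
        IDataUtil.put(pc, "xmldata", xmlData);
        pc.destroy();
        IData parseOut = Service.doInvoke("pub.xml", "xmlStringToXMLNode", parseIn);

        IDataCursor oc = parseOut.getCursor();
        Object node = IDataUtil.get(oc, "node");
        oc.destroy();

        // Route the node into Trading Networks; TN recognizes the document
        // type, applies processing rules, and invokes the mapped flow services.
        IData routeIn = IDataFactory.create();
        IDataCursor rc = routeIn.getCursor();
        IDataUtil.put(rc, "node", node);
        rc.destroy();
        Service.doInvoke("wm.tn.doc.xml", "routeXml", routeIn);
    }
}
```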

The benefit of this implementation is that I can follow the complete lifecycle of any document using the TN Console. I could do much the same with Monitor (when using only IS + Broker), although I would not be able to run queries against the functional content of the document - please correct me if I am mistaken here - and that’s a big point.

TN provides duplicate document checking, document persistence/resubmission, synchronous/asynchronous service invocation, a functional monitoring console, content-based routing and even some kind of pub/sub (using an IS flow service to post N instances of the document to TN - see the sketch below).
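
To make that last point concrete, here is a rough sketch of the fan-out trick: the same document is routed into TN once per interested internal application, so each copy gets its own processing-rule match, persistence and resubmission history. Passing SenderID/ReceiverID through TN_parms is how I understand routeXml accepts explicit partners - treat that, and all the helper names, as assumptions to verify.

```java
import com.wm.app.b2b.server.Service;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;

/**
 * Illustrative fan-out: emulate pub/sub by submitting one copy of the
 * document to TN per internal receiver.
 */
public class TnFanOut {

    public static void fanOut(Object node, String senderId, String[] receiverIds)
            throws Exception {
        for (String receiverId : receiverIds) {
            // TN_parms overrides sender/receiver resolution (assumed
            // optional input of wm.tn.doc.xml:routeXml).
            IData tnParms = IDataFactory.create();
            IDataCursor tp = tnParms.getCursor();
            IDataUtil.put(tp, "SenderID", senderId);
            IDataUtil.put(tp, "ReceiverID", receiverId);
            tp.destroy();

            IData routeIn = IDataFactory.create();
            IDataCursor rc = routeIn.getCursor();
            IDataUtil.put(rc, "node", node);
            IDataUtil.put(rc, "TN_parms", tnParms);
            rc.destroy();
            Service.doInvoke("wm.tn.doc.xml", "routeXml", routeIn);
        }
    }
}
```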

When looking at a project where Trading Networks is required anyway to manage your external exchanges, what would be the arguments for adding the Broker component to the picture?

Also, if I am to use TN for my internal exchanges as well (no Broker here), would you then create as many profiles as you have internal applications, or just use a single enterprise profile for all internal apps?

Thanks

Philippe

Philippe,

As this is an architectural question, you’ll get conflicting answers on this.

I recommend using TN to persist the data as soon as it enters the integration environment, but use the Broker (the webMethods component designed for publish/subscribe out of the box), rather than making TN act like a broker, to publish the document.

To exploit the Enterprise Service Bus model, I highly recommend that you almost always use publish/subscribe. Try to avoid request/reply and point-to-point integrations. This will expose your business events for future uses that are not yet anticipated.

Always convert the document to a canonical form before publishing it to the enterprise at large.
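
As a rough illustration of those last two points, the sketch below maps a source-specific document to a canonical document type and publishes it via the IS built-in pub.publish:publish service. The canonical type name, the field mappings and the helper names are invented for the example - only the publish inputs (documentTypeName, document) follow the usual built-in, and you should confirm them against your release.

```java
import com.wm.app.b2b.server.Service;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;

/**
 * Illustrative pattern: map a source-specific document into a canonical
 * publishable document type, then publish it to the Broker so any current
 * or future subscriber can pick it up.
 */
public class CanonicalPublisher {

    public static void publishCanonical(IData sourceDoc) throws Exception {
        // Map source fields into the canonical structure
        // (hypothetical type "MyCompany.canonical:CustomerUpdate").
        IData canonical = IDataFactory.create();
        IDataCursor cc = canonical.getCursor();
        IDataCursor sc = sourceDoc.getCursor();
        IDataUtil.put(cc, "customerId", IDataUtil.getString(sc, "CUST_NO"));
        IDataUtil.put(cc, "name", IDataUtil.getString(sc, "CUST_NAME"));
        sc.destroy();
        cc.destroy();

        // Publish the canonical document to the Broker.
        IData pubIn = IDataFactory.create();
        IDataCursor pc = pubIn.getCursor();
        IDataUtil.put(pc, "documentTypeName", "MyCompany.canonical:CustomerUpdate");
        IDataUtil.put(pc, "document", canonical);
        pc.destroy();
        Service.doInvoke("pub.publish", "publish", pubIn);
    }
}
```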

I can tell you that over the years I have seen valuable returns from these investments.

Regards and respect to those with differing opinions on this.

Philippe,

It’s better to create individual profiles for the internal apps - for example, an HRMS application (sender) to a payroll application (receiver), and vice versa. This way you know where each document is flowing.

With the Broker component in the middle there are several advantages for EAI or external integrations, especially when you use canonical formats (internal docTypes shared enterprise-wide): the same publishable canonical docType can be subscribed to by another application that uses a similar structure, whether it sits inside the same Broker territory or across territory gateways within the enterprise (for example the NA region and the Europe region sharing a webMethods hub).

HTH,
RMG

So as Ram says,

Both TN and the Broker have different strengths that complement each other. You can create a very nice interface by using both to their advantage.

Regards

Philippe:

As Mark alludes to, there are others with different opinions. You might find this thread useful.

wmusers.com

Ever since TN came out, I have had just one project that needed a message broker at all. For that one, which needed messaging not for pub/sub but for what amounted to client-side queueing, we used MQSeries. For the rest, TN was used as a general broker (lower-case “b”) for external and internal integrations.

That said, Broker may indeed be the way to go for your implementation. To guide the decision you’ll want to consider the number of participating applications, the message/document volume and the anticipated change in the coming months and years.

HTH.

Hi all,

pmo, you may also take a look at the architecture thread:
wmusers.com

I am using an IS+TN+BI architecture on version 4.6. One of the main problems I have encountered is that a single small problem with adapters in IS (a DB connection over a firewall) will eventually force a restart of the full IS+TN+BI stack, causing temporary service unavailability.

Of course there are several solutions to this, as reamon described in the architecture thread. They usually require more IS instances running, together with external queuing technology, which raises complexity and maintenance/development costs.

I think a mixed architecture (Broker together with IS+TN) could be the best solution. My advice: choose the most powerful solution your budget allows. In exchange for more complexity at the beginning you’ll get more benefits over time, whereas sticking with IS+TN only will become less cost-effective over time (scalability, maintenance, complexity) and also difficult to change!

However, if you choose a TN-only solution, consider Modeler as a good investment that lets you follow and manage the full business process. In my experience business people really appreciate it, as they can easily understand and follow the integration processes.

It all depends on what your business throughput is!

Regards