Bizdoc built-in services usage

Can the bizdoc built-in services, such as bizdocToRecord, be used on an Integration Server that does not have access to TN (or the TN DB)? I published a canonical document containing a bizdoc through a Broker to another IS that has no access to the TN DB (but has the TN package installed) and tried calling the bizdocToRecord service. The service threw a ClassCastException at:

After some analysis, it looks like these bizdoc built-in services can only be used from an IS that has access to the TN DB. Can someone confirm or comment on this? Do all the built-in services for TN objects (bizdocs, profiles, etc.) use the TN DB internally?
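For what it is worth, the behavior described above is consistent with the bizdoc being a thin envelope whose content parts are resolved against the TN DB on demand, so the envelope travels over the Broker fine but any attempt to use it off-box fails. A minimal plain-Java sketch of that idea (class and method names here are illustrative, not the actual WmTN internals):

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Illustrative sketch: an "envelope" that holds only IDs and fetches its
 * content parts from a backing store on demand. On a host without the
 * backing store (the TN DB in this thread), any service that needs the
 * content fails, even though the envelope itself crossed the wire fine.
 */
public class LazyEnvelopeSketch {

    /** Stand-in for the TN DB: contentPartId -> bytes. Null on the mapping IS. */
    static Map<String, byte[]> tnStore;

    static class Envelope {
        final String contentPartId;   // the envelope only carries IDs
        Envelope(String contentPartId) { this.contentPartId = contentPartId; }

        byte[] getContentPart() {
            if (tnStore == null) {
                // Roughly the situation on the mapping IS: nothing to resolve against.
                throw new IllegalStateException("no TN DB available on this IS");
            }
            return tnStore.get(contentPartId);
        }
    }

    public static void main(String[] args) {
        // On the TN-connected IS the lookup works...
        tnStore = new HashMap<>();
        tnStore.put("part-1", "<order/>".getBytes());
        Envelope env = new Envelope("part-1");
        System.out.println(new String(env.getContentPart()));

        // ...but ship the same envelope to an IS with no TN DB and it fails.
        tnStore = null;
        try {
            env.getContentPart();
        } catch (IllegalStateException e) {
            System.out.println("off-box access failed: " + e.getMessage());
        }
    }
}
```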

Any inputs would be highly appreciated.

This might be another good reason to add to the list formed in other threads for not publishing a bizdoc from one IS to another. The next thing you will likely run into is errors because none of the TN components the bizdoc refers to will exist: the bizdoc holds IDs for profiles, rules, doc types, attributes, etc., none of which will exist on the target system. You may have a good reason, but I’m not sure I understand your reluctance to publish the bizdoc content rather than the bizdoc itself. (And I’m not sure I understand why Broker is being used either.)

The obvious quick solution is to create a TN DB on the target IS and sync up the definitions: export all profiles, rules, doc types, etc. from one and import them into the other.

For Balachandar,
I think you spoke about something similar to this in some of your previous posts. Remember that you are not sending large amounts of content when you publish the bizdoc; it is a reference to various content in the TN DB. Keeping that in mind will help you avoid creating loosely coupled processes with tightly coupled data objects. It is also good to know that if you are not explicitly saving data to a particular location, it will only be local to that instance of the webMethods product. That goes for document types, users, groups, restrictions, services, etc. It is best to extract the content beforehand, and possibly encrypt it if you need some security when passing data between systems.
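On the "extract the content beforehand and possibly encrypt it" point, here is one way that could look in plain Java (standard javax.crypto only; the key handling is deliberately minimal and illustrative, and in practice the key would be shared out of band): extract the content bytes on the TN side, encrypt them, and publish the ciphertext instead of the bizdoc.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.security.SecureRandom;

/** Sketch: encrypt extracted bizdoc content before publishing it between systems. */
public class ContentCrypto {

    private static final int IV_LEN = 12;       // recommended GCM nonce length
    private static final int TAG_BITS = 128;    // GCM authentication tag size

    static SecretKey newKey() throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        return kg.generateKey();
    }

    /** Returns IV || ciphertext so the receiver can decrypt with the shared key. */
    static byte[] encrypt(SecretKey key, byte[] content) throws Exception {
        byte[] iv = new byte[IV_LEN];
        new SecureRandom().nextBytes(iv);
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(TAG_BITS, iv));
        byte[] ct = c.doFinal(content);
        byte[] out = new byte[IV_LEN + ct.length];
        System.arraycopy(iv, 0, out, 0, IV_LEN);
        System.arraycopy(ct, 0, out, IV_LEN, ct.length);
        return out;
    }

    static byte[] decrypt(SecretKey key, byte[] blob) throws Exception {
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.DECRYPT_MODE, key,
               new GCMParameterSpec(TAG_BITS, blob, 0, IV_LEN));
        return c.doFinal(blob, IV_LEN, blob.length - IV_LEN);
    }

    public static void main(String[] args) throws Exception {
        SecretKey key = newKey();                       // shared out of band
        byte[] payload = "<order id=\"42\"/>".getBytes();
        byte[] wire = encrypt(key, payload);            // publish this, not the bizdoc
        System.out.println(new String(decrypt(key, wire)));
    }
}
```

The receiving IS only needs the shared key and the blob; it never touches the TN DB.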

For Rob,
Even with the TN DBs in sync the first time, won’t there still be problems with every new document that is passed in? Balachandar will not get the content of the very next document that freshly populates the first DB, because it will not have carried over to the second, so there will still be nothing for getContentPartData to return. It would be good, though, to have the rest of the information (assuming it is static enough that changes can easily be managed across TNs), as it would give the rest of the document a basis for translation (like knowing which IDs are set up and which protocols to use).
Good day.

Yemi Bedu

First of all, thanks a lot to Rob & Yemi.

I think I need to add a little more explanation to my situation here. There is one TN IS layer for handling and routing partner documents, and another IS (mapping) layer to do the B2B transformation (from partner format to my internal application format). These two layers are connected through a Broker. The mapping layer is expected to do just the mapping, and it is NOT supposed to have access to the TN DB.

The reason for passing the bizdoc to the mapping layer is that the mapping layer can extract the data from the bizdoc and do the transformation. Why not send the bizdoc content to the mapping layer instead? The idea is to have most of the processing (extracting the content from the bizdoc, the transformation, etc.) take place in the mapping layer. The TN layer receives the document (bizdoc), wraps it in a canonical, and publishes it to the mapping layer. We expect this to take a lot of load and processing off the TN layer. The idea of having the mapping layer take the bizdoc and extract the data from it seems to be a failure, as the mapping IS is not able to call the bizdocToRecord service for the reasons above.

Any additional comments ?

I would not call it a failure, as you have to understand what is going on underneath. It is not magic pixie dust. You said that you have TN make a canonical, so this means you have a local IS that does the initial transform and publishes to the Broker. This is where you should fully extract ALL data from the bizdoc. Before I continue, this is what I see:
{route -> translate} -> {map}
{[TN] -> [IS1]} -> {[IS2]}
Well, I would suggest that you do the full translation. The reason things are the way they are now is that there would be such a lag in performance from moving that much data around for each bizdoc instanced by each service running with it. Given that some flow services manipulate parts of the data, you would end up with many copies. This may not be your situation now, but if your project were used in a 10,000-documents-per-hour environment, you can see the hefty weight you would be adding if it were not needed.
If you want security, as I see it, you need the full translation. That way the second IS does not get anything of the bizdoc (something of a security hole in its own right) and only needs to map out from the canonical. See whether that is feasible: you are already using one IS to publish the document, and TN does not really do anything but route documents anyway, so you will have plenty of CPU cycles left over.
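The {route -> translate} -> {map} split suggested above can be sketched as follows, with plain Java standing in for the two IS layers (all names here are illustrative, and the "translation" is a trivial stand-in): IS1, which can reach the TN DB, extracts and resolves everything into a self-contained canonical, and IS2 maps from that canonical alone.

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Sketch of the {[TN] -> [IS1]} -> {[IS2]} pipeline from this thread:
 * IS1 (TN-connected) extracts and translates everything, so only a
 * self-contained canonical crosses the Broker; IS2 never sees a bizdoc
 * and needs no TN DB.
 */
public class CanonicalPipeline {

    /** IS1: extract content while the TN DB is still reachable, emit a canonical. */
    static Map<String, String> toCanonical(String partnerContent, String senderId) {
        Map<String, String> canonical = new HashMap<>();
        canonical.put("sender", senderId);   // resolved from the TN profile here, not on IS2
        canonical.put("payload", partnerContent.toUpperCase()); // stand-in "translation"
        return canonical;
    }

    /** IS2: pure mapping, no TN lookups needed. */
    static String mapToInternal(Map<String, String> canonical) {
        return canonical.get("sender") + ":" + canonical.get("payload");
    }

    public static void main(String[] args) {
        Map<String, String> doc = toCanonical("order-123", "partnerA");
        System.out.println(mapToInternal(doc)); // prints partnerA:ORDER-123
    }
}
```

The design point is that everything IS2 needs is inside the published document itself, which is exactly why the bizdocToRecord problem at the top of this thread disappears.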
Good day.

Yemi Bedu