Moving to 6.0

My webM rep/sc has told me that in moving to 6.0 from 4.6 I will need to “re-write” my process logic, as there is no conversion tool planned.

Also I need to “rewrite” any custom adapters to take advantage of the new 6.0 architecture’s new performance.


Has anyone heard the same?
Is there more to the story I am not hearing?

Sounds like I will need to spend the $$$ and go to the conference to find out what is going on… Maybe you folks can save me the money with some more insight…


Well, that really goes against what has been done in the past. Previously, they provided the tools necessary for customers to migrate from one version to the next.

I just had the same message confirmed today. This is very concerning. Is there any link to verify this?

Frank, I think that is an untrue statement.

From as long ago as February 2002 at the webMethods Users Winter Conference 2002, webMethods has said that it will provide migration tools for its customers. webMethods has also said that they will continue to support customers in production with prior versions of the product.

I was listening to one of the partner briefings and heard that webMethods is going to provide the migration tools along with its GA version some time in Jan/Feb 03. I am not sure if there is any specific product component that they don’t want to migrate.

I’ve sat through several product briefs with webM folks and they have all stated that there will be conversion/migration tools available. It’s a little fuzzy regarding when they’ll be released, but there is no question in my mind that webM plans to support a migration from the 4.6 platform to the 6.0 platform. This is for Integration Server specific components. What will really be interesting is how they plan to port the components from Enterprise into the new application configuration.


Caveat to statements below: Opinions below are my own and may not have any basis in reality.

6.0 is the first real effort to unify the up-to-now very distinct environments of B2B/Integration Server and Broker/Enterprise Server. Migration/update is primarily an Enterprise concern. AFAIK, IS 6.0 is essentially IS 4.6 with some updates.

IS is becoming a full-fledged broker client capable of direct interaction with the broker without using a bridge. Enterprise adapters are being re-hosted in the IS environment. VI, ETE, the old Enterprise Manager, the ATC, and probably other “legacy” Enterprise components are being taken behind the shed and shot (someone please correct me if I have this wrong).

However, existing adapters/agents WILL BE SUPPORTED by the updated broker. This means existing infrastructures will continue to work, but YOU MAY NOT HAVE A MECHANISM TO UPDATE THEM if they need changing (e.g. no ETE to update infosets).

ATC Workunits and Blueprints should work with the new broker. Workunits could be updated, as those have always been external to the tools, but you may not have a mechanism to update infosets that associate the Workunit with an event. There is likely no tool provided to update Blueprints. Migration tools that come out may be able to automate the migration of Blueprints but Workunits will have to be reimplemented (no way to migrate what is essentially free-form Java code).

VI scripts will need to be rewritten (in FLOW) if you need to update them before migration tools are available.

If you have custom Enterprise adapters and you want to take advantage of the new architecture, you’ll need to rewrite them so they can be hosted in the IS environment. You do not need to change them if they are performing as desired. Since the run-time libraries needed to support legacy adapters will be supplied by wM, you should be able to make changes to your own custom adapters as needed.

Can anyone speak to “…the 6.0 architecture’s new performance.” that Frank mentions as pertains to custom adapters? I’d suspect a possible performance hit, depending on how custom adapters are written (C, Java) but measuring is always the right approach rather than assuming.

Of course if anyone has any information to the contrary of any of the points above, please post it!!

This is what we decided…

It may not be in the best interest to migrate all existing code to the wM 6.0 environment. All the adapters hang off the IS, and there exists no clearly defined process to take the adapters under the adapter monitor process and push them under this (IS) JVM.

Architecturally, how do we move centralised adapter environments into the totally distributed structure that wM 6.0 proposes?

There are quite a few issues that we had when wM made the presentation to us. They are due for a follow-up, and I shall keep the group informed.

webMethods will most definitely offer upgrade utilities where needed for migration to the next version, webMethods 6. For some components of the webMethods integration platform, upgrade happens on install with no user intervention. In other cases, easy-to-use tools will be provided. These tools will be accompanied by documentation and are part of the webMethods 6 release. Please talk to your webMethods Account Manager and Sales Engineer for more information about webMethods 6.

Thanks for the info, Susan. I don’t think there has been much concern about IF wM would provide migration tools, but more about WHEN and specifically WHAT. In particular, I think there is some concern about migrating Enterprise/Broker components (ATC, ILA, VI scripts, notifiers, et al.).

Vishal–to argue semantics with you: strictly speaking, the old adapter environment is distributed too. Every adapter could be on its own box. A common practice, however, has been to centralize all the adapters on the same box as the broker. Presumably you could follow this same approach with the new architecture if you wanted.

I agree that it would make no sense to migrate components just to migrate them. The approach of moving things as they change to support functional requirements would seem to be more practical and manageable.

I’ll stick with Susan’s statement. As for when… I would expect at least 5 months (development/testing), but indeed a Sales Rep should be aware of such a time scale (it would be silly if he/she weren’t, because it’s an expected question). What would be more in scope are the areas that need architecture migration (ADK). The older migration tools were very adequate, easy to implement, and had straightforward step-by-step documentation. I believe webMethods will stick to a user-friendly pattern.


This might be an odd question… but have any of you heard why webMethods jumped their numbering from 4.6 to 6? I’m using a lot of other software and this ‘jump’ caught me by surprise. There might be a simple reason for it nevertheless.

Jonathan T. Valdez

I’ll take a stab at this, I guess.

webMethods 6 unifies the flagship webMethods products – Integration Server v4.6 and Enterprise Broker v5.0.1.

“webMethods 6” therefore, indicates a new release to all stakeholders, regardless of prior platform experience.

This is my fourth month now on 6, so I’ll address this based on what I’ve seen and tried…

First, Frank’s “having to rewrite process logic as there is no conversion tool planned”.

Any logic written in IS Flow or Java services does not need to be changed in any way - the 6.0 platform runs it just fine. Logic written into ES adapter operations and integration components is another story, as the ES adapters are not formally supported in the 6.0 architecture. (See below for more detail.)

Second, Frank’s “need to ‘rewrite’ any custom adapters to take advantage of the new 6.0 architecture’s new performance.”

Adapters written for the IS 4.x platform will work just fine in the 6.0 platform. However, the 6.0 platform utilizes the J2EE Connector Architecture for its adapters, so “old-style” adapters will not utilize this architecture. Whether this architecture provides a performance benefit is arguable - the primary benefits of JCA are standardization and abstraction of system-level details for adapter developers.
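To make the JCA idea above concrete, here is a toy sketch in plain Java. The interface and class names are stand-ins invented for illustration, not the real javax.resource.spi types (which are considerably more involved): the point is only the division of labor JCA standardizes, where the adapter author supplies connection creation to the backend system while the container owns system-level concerns like pooling and lifecycle.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Stand-in for a JCA ManagedConnectionFactory: the adapter developer
// supplies only how to open a connection to the backend system.
interface ManagedConnectionFactoryLike {
    BackendConnection createManagedConnection();
}

// A hypothetical connection to some backend/EIS.
class BackendConnection {
    String query(String q) { return "result-for:" + q; }
}

// The container (here, a toy pool) owns the system-level concerns --
// pooling, reuse -- which the adapter writer no longer implements.
class ToyContainerPool {
    private final ManagedConnectionFactoryLike factory;
    private final Deque<BackendConnection> idle = new ArrayDeque<>();

    ToyContainerPool(ManagedConnectionFactoryLike f) { factory = f; }

    BackendConnection acquire() {
        // Reuse an idle connection if one exists; otherwise ask the
        // adapter's factory to create a new one.
        return idle.isEmpty() ? factory.createManagedConnection() : idle.pop();
    }

    void release(BackendConnection c) { idle.push(c); }

    int idleCount() { return idle.size(); }
}

public class JcaSketch {
    public static void main(String[] args) {
        ToyContainerPool pool = new ToyContainerPool(BackendConnection::new);
        BackendConnection c = pool.acquire();
        System.out.println(c.query("SELECT 1"));
        pool.release(c); // returned to the pool, not closed by the adapter
        System.out.println("idle connections: " + pool.idleCount());
    }
}
```

This matches the point about JCA’s benefits being standardization and abstraction rather than raw speed: an “old-style” adapter would have to implement the pooling itself, while a JCA-style adapter just plugs its factory into the container’s contract.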

ES adapters are not directly supported in the 6.0 platform. To use a 4.x/5.0 ES adapter, you will need to retain your 5.0 broker and place it in a territory with the 6.0 Broker.

Third, Rob’s comments regarding old ES support in the new 6.0 environment are for the most part correct, with the exception of his statement that the ES adapters are being rehosted in the 6.0 environment. To support the “old” ES adapters, you’ll need a 6.0 and a 5.0 broker in a territory. However, this model is pretty sweet, as it allows you to perform a phased migration. Connect the two brokers together and add new functionality/systems in the new architecture while maintaining old functionality/systems AND allowing both to talk to each other!

Fourth, Rob’s comments regarding the WHEN of migration tools/utilities. I don’t have any information on this or even a hint from any webM contacts. However, I do know the new adapter ADK for 6.0 is to be released late next month.

There will be an article on the new adapter architecture in Feb’s Ezine. I confess I did not address migration of “old” adapters in the article - I just stuck to explaining the new.

Thanks Dan. That explains it. Are there websites/collateral you can refer me to regarding this topic (I’d like to pass this on to one of our prospects who asked for it).


Hi, Jonathan.

Grab as much collateral as you can from

As for your prospects, send them to :wink:

Are there any plans to make 6.0 available to wmusers on an evaluation basis? If so, when?

I wanted to add a correction to my statement above - I have now successfully implemented 4.11 Broker Adapters against a 6.0 Broker. It works :slight_smile: Unfortunately, I still have to use the old tools to manage those adapters. :frowning:


I am just getting a little more comfortable with wM 6. For a customer site I have a decision to make: whether to use the webMethods Modeler or a third-party BPM product my customer likes. They like (smaller) BPM products like Fuego and BizTalk (can’t seem to avoid Microsoft these days), but also WebLogic BPM (since they have a subsidiary that runs on WebLogic).

I have to convince my customer to stick with webMethods and just use the Modeler tool (seems to me a no-brainer, but it’s the reality I have to deal with). Anyways, I plan to deliver some sort of ‘presentation’ of the following key features of WM Modeler:

  • how to create/maintain business process models (ease of use, UI, supported BPM language standards etc)
  • how Modeler automatically executes processes (such as state persistence)
  • Modeler’s standard options for reporting/monitoring process execution including history data
  • last but not least, how extensively Modeler supports Web Services (‘consuming’ a Web service, and deploying a process model as a Web service, i.e. as ‘producer’)

I really want to keep these other BPM tools out of the door, not least to avoid unnecessary extra integration work with those tools.

Again, it should be a no-brainer. Nevertheless, I would appreciate it if some of you could help me better describe WM Modeler using the bullet points I wanna use (just let me know if you feel I should add another bullet point or drop one). I need “real feature descriptions” instead of marketing ‘benefit’ statements (they already got those from the Website). Perhaps some of you have had positive experiences with one or more features in Modeler (such as seeing a business user easily create a new process without the help of MIS people, or how easy it was to deploy a Web service). Those ‘testimonials’ really work, in my experience. I just lack ‘hands-on’ experience and, more importantly, … time.

Many thanks.

Jonathan Valdez

Hi Jonathan,

I’ve mostly focused on working with folks who needed to use Modeler for gathering requirements. I’ve also spent some time on the darker side taking the results and making it a reality. I’ll give you my observations and you can take away whatever is useful to you.

I’ve seen both business/system analysts and developers adopt the tool in practically no time. For a Business Analyst it’s no more difficult to adjust to than learning to use Visio, but the graphical result that is produced is clear enough for end users to be able to understand without any level-setting. If you’ve ever drawn a picture by connecting the dots, you know everything you need to know to be able to “read” a process model. The model that’s produced also turns out to be a great place to start the technical discussion for what will eventually be the final solution.

Using the tool the BA/SA can focus on the steps that need to be performed, in what sequence, by whom or what (workflow step or service), where to go if a step takes too long, where to go when an exception occurs, etc. The nuances of the process can be ironed out by the BA right there real time while working with the user community and it’s in a format that they (the end users) didn’t all need to go to 4 weeks of specialized training to know what the symbols mean and how to read the picture.

A few BA’s and I just completed several sessions with quite a few groups where by the time I got to the third session they were directing me to make small changes to an earlier process and hook it into the one that was currently being discussed. None of them had been to Modeler training and they didn’t even bother to ask me if the tool could do that (fortunately it can!). They even got ahead of me by stating that it would be nice if they could then use the pretty pictures to see where things were for each instance of the process (thankfully, that’s baked in there too!!). Bottom line, the feedback that I’ve gotten from the BA’s/SA’s is that Modeler seems to be making it easier to convey the message to the decision makers and getting confirmation from the end user community. I think they like it because they can focus on the process (like saying update CRM - set customer status to inactive, or send notice to customer) vs getting hung up on the underlying technology (like update Ora/PSoft/Other, or, send email/letter/Other).

On the technical side is where things got really cool. Not only did the development team have an easier time understanding the “big picture”, but they were able to drive out a couple of things pretty quickly and be even more precise about what needed to be done and how long it would take. I’m talking about things like:

  1. How many different interfaces needed to be built and to what systems (internally and externally)
  2. What the communications infrastructure needed to be, specifically, what was there today vs what needed to be put in place
  3. Which steps would be pretty simple vs those that really needed to have spotlights shined on to assess their complexity and level of risk to the overall process
  4. Which steps required what technology (Simple Services, Web Services, Mainframe Services, WorkFlow components, etc.) and therefore, what the staffing requirements needed to look like. I think that came about because they had confidence that what was represented in the process model was exactly what would be “generated” and they could focus on the individual widgets (steps) and not worry about how they all tied together. They also like the idea that BA’s/SA’s would be able to re-use the individual widgets within/across process models and they didn’t have to code anything special to make that work correctly.

I found it quite humbling to observe that in two weeks