StreamSets and webMethods

About the Article

This article explains different ways to connect StreamSets with on-premises webMethods products such as Universal Messaging and Integration Server.

1.0 Streamsets Landscape

Any development platform has three main components: design-time components, runtime components, and administration/control components. Sometimes the design-time and administration components are merged. StreamSets follows the same approach: its administration component, Control Hub, is also its design-time component.

The runtime component can be hosted on a cloud provider (AWS, Azure, GCP) or locally. Once a deployment is defined in Control Hub, users can download the deployment script, either as a tarball or as a Docker command. A running instance of a deployment is called an engine.

The engine automatically connects to Control Hub, and users can manage the engine from there. Since StreamSets engines can run anywhere, the StreamSets engine and the webMethods instances must be able to reach each other over the network.

A StreamSets data pipeline (aka workflow) consists of mainly three types of stages:
• An origin, which data is read from
• A destination, which data is sent to at the end
• Processors, intermediary stages between the origin and destination that transform the data
Pic1_streamsets_pipeline_basic
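The three stage types can be thought of as read, transform, and write steps wired together. A minimal Java sketch of that flow (names here are purely illustrative, not the StreamSets SDK):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Supplier;

// Illustrative model of the three pipeline stage types.
public class PipelineSketch {
    record Record(String payload) {}

    public static void main(String[] args) {
        // Origin: where records are read from
        Supplier<List<Record>> origin =
                () -> List.of(new Record("a"), new Record("b"));

        // Processor: intermediary stage that transforms each record
        Function<Record, Record> processor =
                r -> new Record(r.payload().toUpperCase());

        // Destination: where records are sent at the end
        List<Record> sink = new ArrayList<>();
        Consumer<Record> destination = sink::add;

        // Running the pipeline: origin -> processor -> destination
        origin.get().stream().map(processor).forEach(destination);
        sink.forEach(r -> System.out.println(r.payload()));
    }
}
```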

2.0 Messaging

For messaging, StreamSets provides connectors for different messaging technologies such as JMS, Kafka, Amazon SQS, Azure Event Hubs, Apache Pulsar, etc.

To connect with Universal Messaging, we can use the JMS Consumer stage as an origin and the JMS Producer stage as a destination.

To use JMS, the JMS stage library must be enabled in the deployment and the Universal Messaging JMS client libraries must be added to the engine.

2.1 Enable JMS Stage Library in the Deployment

Go to Setup > Deployments

Edit the required Deployment

Go to step 2

Click on Stage Libraries
pic2_Stage_libraries

Search for JMS; this will list JMS Consumer and JMS Producer
Click Add All to move them to the Selected Stages pane on the right
Pic3_select_stage_lib

Click OK, then click Save & Next
Go to the last section and restart the engine.

2.2 Add Universal Messaging JMS Client libraries to the Engine

Go to Setup > Engines

Click on the required engine

On the edit page, go to External Resources > External Libraries

Click the '+' icon and browse to the UM JMS client jars nJMS.jar and nClient.jar from a local Software AG installation:

<SoftwareAG>/UniversalMessaging/lib/nClient.jar

<SoftwareAG>/UniversalMessaging/lib/nJMS.jar

After the upload, click Restart Engine to restart the engine

Pic4_Client_libs

2.3 Create a Connection

Go to Setup > Connections

Click the + icon to add a new connection

Give it a name and description

For Authoring Engine, select the engine where the JMS libraries are deployed

For Type, select JMS

Click Save & Next to go to the Configure Connection section

Add the Universal Messaging connection details:

JMS Initial Context Factory: com.pcbsys.nirvana.nSpace.NirvanaContextFactory

JNDI Connection Factory: the connection factory name from UM

JMS Provider URL: the UM URL, nsp://<host>:<port>

Click Test Connection; a green tick should appear at the top

Click Save
Pic5_Connection
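The connection details above are standard JNDI settings, so they can be sanity-checked outside Control Hub with a small standalone Java client. In this sketch, the host, port, and the lookup name "ConnectionFactory" are placeholders for your environment; the commented-out lookup requires nClient.jar and nJMS.jar on the classpath:

```java
import java.util.Hashtable;
import javax.naming.Context;
// import javax.naming.InitialContext;   // lookup needs nClient.jar / nJMS.jar
// import javax.jms.ConnectionFactory;

public class UmJndiCheck {
    // Builds the same JNDI environment the StreamSets JMS connection uses.
    static Hashtable<String, String> jndiEnv(String host, int port) {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY,
                "com.pcbsys.nirvana.nSpace.NirvanaContextFactory");
        env.put(Context.PROVIDER_URL, "nsp://" + host + ":" + port);
        return env;
    }

    public static void main(String[] args) {
        // Placeholder host/port; use your UM realm's values.
        Hashtable<String, String> env = jndiEnv("localhost", 9000);
        // With the UM client jars on the classpath, the lookup would be:
        // Context ctx = new InitialContext(env);
        // ConnectionFactory cf =
        //         (ConnectionFactory) ctx.lookup("ConnectionFactory");
        System.out.println(env.get(Context.PROVIDER_URL));
    }
}
```

If the standalone lookup succeeds, the same values should pass Test Connection in Control Hub.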

2.4 Use JMS stages in Data Pipelines

Go to Build > Pipelines

Create a pipeline

Add an origin to read data from.

In the Destination drop-down, select JMS Producer.

Leave the defaults in the General tab

Go to the JMS tab

For Connection, select the connection created in the previous section

Configure the JMS Destination Name and JMS Destination Type as per the configuration in UM
Pic6_Data_pipeline_with_jms

In the Data Format tab, select a suitable data format, for example JSON.

2.5 Results

Start the pipeline and see the results:
Pic7_Streamsets_jms_result

Pic8_UM_resutls

3.0 REST API

StreamSets provides the HTTP Client destination, which can be used to make REST API calls.
This feature can be used to call REST APIs on Software AG products such as Integration Server, API Gateway, webMethods.io Integration, webMethods.io B2B, etc.

3.1 Use HTTP client in Data pipelines

To call an API on Integration Server, use the HTTP Client destination in the data pipeline
Pic9_rest_data_pipeline

3.2 Configure HTTP Endpoint details

Set up REST Endpoint details
Pic10_RES_HTTP_tab

Set credentials in the Credentials tab and set an appropriate data format in the Data Format tab
Pic11_rest_data_format_tab
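Outside StreamSets, the same call the HTTP Client destination makes can be reproduced with the JDK's built-in HTTP client. The host, path, body, and credentials below are placeholders for your Integration Server setup, and the actual send is commented out so no server is needed:

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class IsRestCall {
    public static void main(String[] args) {
        // Placeholder endpoint; replace with your Integration Server REST resource.
        String url = "http://is-host:5555/restv2/orders";
        String json = "{\"orderId\": 1, \"status\": \"NEW\"}";

        // Basic auth header, mirroring what the Credentials tab configures.
        String auth = Base64.getEncoder().encodeToString(
                "Administrator:manage".getBytes(StandardCharsets.UTF_8));

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .header("Content-Type", "application/json")
                .header("Authorization", "Basic " + auth)
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();

        // With a reachable server, the request would be sent like this:
        // java.net.http.HttpClient.newHttpClient().send(request,
        //         java.net.http.HttpResponse.BodyHandlers.ofString());
        System.out.println(request.method() + " " + request.uri());
    }
}
```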

3.3 Results

Run the pipeline to see the results in StreamSets Control Hub and in the Integration Server logs
Pic12_rest_result_streamsets

Pic13_rest_result_IS

Useful links

https://academy.streamsets.com/courses/dataops-platform-fundamentals/

https://academy.streamsets.com/courses/certification-onboarding/
