Flow service for Amazon S3 bucket to Google PubSub topic data sync in the webMethods.io new flow editor

Summary:

     This article describes a business use case integration in which a bucket in Amazon S3 is created as a topic in Google PubSub.

Prerequisites:

  • Working Amazon S3 and Google PubSub accounts
  • A working webMethods.io tenant

Content:

This simple use case describes the step-by-step process of integrating Amazon S3 with Google PubSub, where a bucket in Amazon S3 is created as a topic in Google PubSub.

This example also shows the mapping steps in the new webMethods.io flow service.

Note: 
Any coding or configuration examples provided in this document are only examples and are not intended for use in a production system without verification. The examples are provided here only to better explain and visualize the possibilities.

Steps:

  1. Log in to webMethods.io and create a new project, then choose Flow services and click the "+" icon to create a new flow service.

  2. Name the flow service, then insert a try block. The user can now choose the predefined action "getBucket" from the drop-down menu, or create a custom operation instead.

  3. Once the action is selected, the user needs to set up the connection as shown in the screenshot below.

  4. To configure an account for Amazon S3, fill in the required details and click the "Add" button.
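
For orientation, the account configuration and the "getBucket" action correspond roughly to the following calls in the AWS SDK for Python (boto3). This is only a sketch of the equivalent behavior, not the connector's actual implementation; the credentials and bucket name are placeholders.

```python
import boto3

# Placeholder credentials; the webMethods.io account form collects the
# same fields (access key ID, secret access key, region).
s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",
    aws_secret_access_key="wJal...",
    region_name="us-east-1",
)

# Rough equivalent of the "getBucket" action: fetch metadata for one bucket.
# head_bucket raises a ClientError if the bucket does not exist or is not
# accessible, so a successful call confirms the bucket is there.
response = s3.head_bucket(Bucket="my-example-bucket")
print(response["ResponseMetadata"]["HTTPStatusCode"])  # 200 on success
```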

  5. Once the account is configured, insert an "if" condition inside the try block. Inside the if condition, validate the output of the "getBucket" response, checking for the null scenario.

  6. Here, the if block has two paths, YES and NO. Execution proceeds based on how the if condition evaluates.
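
In code terms, the YES/NO branch is simply a null check on the getBucket output. A sketch, reusing the boto3 client from above and a hypothetical bucket_exists helper:

```python
from botocore.exceptions import ClientError

def bucket_exists(s3, bucket_name):
    """Return bucket metadata, or None if the bucket is missing or inaccessible."""
    try:
        return s3.head_bucket(Bucket=bucket_name)
    except ClientError:
        return None

get_bucket_output = bucket_exists(s3, "my-example-bucket")
if get_bucket_output is not None:
    pass  # YES path: go on to create the topic in Google PubSub
else:
    pass  # NO path: nothing to sync, fall through to error handling
```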

  7. Inside the YES path, add the "Google PubSub" connector.

  8. Choose the "createTopic" action from the Google PubSub connector and configure the Google PubSub account, just as we did for Amazon S3.

  9. Fill in the credential details and click the "Add" button. The account is now set up, and the user can perform operations on this connector.
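
The "createTopic" action boils down to a single call in the Google Cloud PubSub client library. A minimal sketch, assuming credentials are supplied through the GOOGLE_APPLICATION_CREDENTIALS environment variable and using placeholder project and topic names:

```python
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()

# Topic names are fully qualified: projects/<projectId>/topics/<name>.
topic_path = publisher.topic_path("my-gcp-project", "my-example-bucket")

topic = publisher.create_topic(request={"name": topic_path})
print(f"Created topic: {topic.name}")
```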

  10. Defining input/output makes the flow easier to execute. Move the cursor to "Define i/o" to add inputs and outputs.

  11. Add the inputs "projectId" and "bucketName", which are required to execute this integration.
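
Conceptually, defining the i/o gives the flow a signature like the following hypothetical function, where the two inputs drive the whole integration:

```python
def sync_bucket_to_topic(project_id: str, bucket_name: str) -> str:
    """Flow inputs 'projectId' and 'bucketName'; returns the created topic name."""
    ...
```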


  12. Click on editMapping for the Amazon S3 connector and map the property bucketName as shown below.

  13. Click on editMapping for the Google PubSub connector and map "projectId" and the property "name" from getBucketOutput to topicName of the createTopic action, as shown below.
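
In code terms, this mapping combines the "projectId" flow input with the bucket name returned by getBucket to build the topic name that createTopic expects. A self-contained sketch with placeholder values:

```python
project_id = "my-gcp-project"        # flow input "projectId"
bucket_name = "my-example-bucket"    # "name" from getBucketOutput
# createTopic expects the fully qualified form built from both values:
topic_name = f"projects/{project_id}/topics/{bucket_name}"
```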

  14. This flow service execution shows a simple demo that handles the error scenario as well. The catch block contains the "GetLastError" service, which informs the user of any errors that occurred during flow service execution.
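
A rough code analogue of the overall try/catch structure, with the except branch playing the role of the catch block and its "GetLastError" service. This sketch reuses the placeholder s3 client, publisher, and bucket_exists helper from the earlier sketches:

```python
def run_flow(project_id: str, bucket_name: str) -> str:
    """Create a PubSub topic named after an existing S3 bucket."""
    try:
        if bucket_exists(s3, bucket_name) is None:
            raise ValueError(f"Bucket '{bucket_name}' not found")
        topic_path = publisher.topic_path(project_id, bucket_name)
        return publisher.create_topic(request={"name": topic_path}).name
    except Exception as err:
        # Catch-block analogue: report the last error and exit the flow.
        print(f"Flow execution failed: {err}")
        raise
```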

  15. The user needs to pass the inputs while running the flow and click the "Run" button.

  16. On successful execution, a topic with the same name as the existing Amazon S3 bucket is created in Google PubSub.

  17. If "getBucketOutput" is null, or if any error occurs, execution falls into the catch block, which logs the error message and exits the flow.