Loading data in batches to a table in ARIS Process Mining with webMethods.io

Summary

This article describes loading data in batches into a table in ARIS Process Mining using webMethods.io. The webMethods.io workflow takes JSON data as input in a specific format and:

  • Creates a table in ARIS Process Mining if it is a new table
  • Uploads the data in batches from the input
  • Commits the data when all the sets in the batch are uploaded

When there is a huge amount of data to upload, there is a chance of running into memory issues, especially when data transformation, loading of data, and committing the transaction are all required. An efficient way of moving bulk data between systems is to perform batch uploads.

The JSON structure uses a data key and a last-commit value to identify whether a set belongs to a certain batch. Four workflows and a flow service are used here:

  • The main workflow, which orchestrates the upload and commit logic
  • setArisKey – associates the data ingestion key from the first upload to ARIS with the dataKey of the batch
  • getArisKey – retrieves the ARIS key, based on the dataKey in the input, for all sets in the same batch
  • deleteTheDatakey – removes the key from storage once the data is committed

The workflow also requires a flow service (map), formTableData_batchUpdate, to perform the transformation from the JSON format to the format that ARIS requires.
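
To make the orchestration concrete, here is a minimal Python sketch of the logic described above. It is an illustration only, not the actual flow service: the storage dict stands in for the workflow's key storage, and aris_upload/aris_commit are hypothetical stand-ins for the calls to ARIS Process Mining.

# Illustrative sketch only; "storage" stands in for the workflow's key store,
# aris_upload/aris_commit are hypothetical stand-ins for the ARIS calls.
storage = {}  # dataKey -> ARIS data ingestion key

def handle_set(payload, aris_upload, aris_commit):
    header = payload["RecordHeader"]
    data_key = header["DataKeyID"]
    aris_key = storage.get(data_key)            # getArisKey
    if aris_key is None:
        # First set of a new batch: the first upload to ARIS
        # creates the table (if new) and returns the ingestion key.
        aris_key = aris_upload(None, payload["TableInput"])
        storage[data_key] = aris_key            # setArisKey
    else:
        # Subsequent sets of the same batch reuse the stored key.
        aris_upload(aris_key, payload["TableInput"])
    if header["lastCommit"] == "true":
        aris_commit(aris_key)                   # commit the whole batch
        del storage[data_key]                   # deleteTheDatakey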

Prerequisites

  • A working account in ARIS with a client ID and a secret key
  • A working webMethods.io Integration cloud tenant
  • JSON data in the requested format

Steps

  1. Log in to your webMethods.io Integration tenant.

  2. Create a new project or choose an existing project.

  3. Click the Import icon in the top right corner to import the workflow into your project.

  4. Select the archive file “Batchdata_Arisprocess.Zip” shared in this article.
  5. The workflow name and description are preloaded.


  6. Configure the account information for ARIS Process Mining (click the + button against Connect to ARIS Process Mining). Provide the account information: accountlabel, ariscloudurl, projectroom, datasetname, clientId, and client secret.

  7. Once the details have been entered, you can save and import the workflow.
  8. The workflow is imported successfully, and you can now enable the flow service.
  9. Once you enable the flow service, you can import the formTableData_batchUpdate.zip file.

  10. Review the mappings.

  11. The workflow requires the JSON payload to be sent in the following format:
{
	"RecordHeader": {
		"DataKeyID": "dataKeynumber",
		"lastCommit": "true/false"
	},
	"TableInput": {
		"TableName": "ARISTableName",
		"Namespace": "ARISTableNameSpace",
		"ColumnNames": {
			"Header": [{
					"name": "ColumnName1",
					"type": "ColumnName1Type"
				},
				{
					"name": "ColumnName2",
					"type": "ColumnName1Type"
				},
				{
					"name": "ColumnNameN",
					"type": "ColumnName1Type"
				}
			]
		},
		"ColumnData": {
			"Record": [{
					"data": [-- //Record 1
						"ColumnNameRecord1",
						"ColumnNameRecord2",
						"ColumnNameRecordN"
					]
				},
				{
					"data": [-- //Record 2
						"ColumnNameRecord1",
						"ColumnNameRecord2",
						"ColumnNameRecordN"
					]
				}
			]
		}
	}
}
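
If you generate this payload programmatically, a small Python helper such as the one below (a hypothetical sketch, not part of the workflow) can assemble one set in the expected format:

import json

def build_payload(data_key, last_commit, table, namespace, header, rows):
    # Assemble one set in the format expected by the workflow.
    # "header" is a list of {"name": ..., "type": ...} dicts,
    # "rows" is a list of value lists, one per record.
    return json.dumps({
        "RecordHeader": {
            "DataKeyID": data_key,
            "lastCommit": "true" if last_commit else "false",
        },
        "TableInput": {
            "TableName": table,
            "Namespace": namespace,
            "ColumnNames": {"Header": header},
            "ColumnData": {"Record": [{"data": row} for row in rows]},
        },
    })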

A sample is also available in the webhook:

{
"RecordHeader": {
		"DataKeyID": "dataKey1",
		"lastCommit": "false"
	},
"TableInput": {
		"TableName": "Service4",
		"Namespace": "ServiceDataNew",
		"ColumnNames": {
			"Header": [{
					"name": "CUSTOMER_ID",
					"type": "STRING"
				},
				{
					"name": "CUSTOMER_NAME",
					"type": "STRING"
				},
				{
					"name": "DATASOURCE_NAME",
					"type": "STRING"
				},
				{
					"name": "PRODUCT_NAME",
					"type": "STRING"
				},
				{
					"name": "PROCESS_KEY",
					"type": "STRING"
				},
				{
					"name": "PRODUCT_ID",
					"type": "STRING"
				}
			]
		},
"ColumnData": {
		Record": [{
			"data": [
				"CUST00001",
				"Rolls-Royce Ltd",
				"SMART MAINTENANCE",
				"Certuss Junior 80 - 400 TC",
				"SR-100028",
				"PROD00001"
					]
				},
				{
			"data": [
				"CUST00002",
				"Holiday Inn",
				"SMART MAINTENANCE",
				"Certuss Junior 80 - 400 TC",
				"SR-100029",
				"PROD00002"
					]
				}
		]
	}
	}
}

The above sample will upload the data but will not commit it. All sets that belong to the same data load must carry the same DataKeyID, and each new data load needs a different DataKeyID. On the final set of the load, set lastCommit to true so that the data is committed in ARIS Process Mining.
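
To drive such a load from a client, a minimal Python sketch might look like the following. The webhook URL and batch size are assumptions, and build_payload is the hypothetical helper sketched earlier; the key point is that every set shares one DataKeyID and only the final set carries lastCommit = "true".

import requests

WEBHOOK_URL = "https://<your-tenant>/..."  # hypothetical; use your workflow's webhook URL
BATCH_SIZE = 500                           # assumption: tune to your data volume

def upload_in_batches(data_key, table, namespace, header, rows):
    chunks = [rows[i:i + BATCH_SIZE] for i in range(0, len(rows), BATCH_SIZE)]
    for i, chunk in enumerate(chunks):
        body = build_payload(
            data_key,
            last_commit=(i == len(chunks) - 1),  # commit only with the final set
            table=table, namespace=namespace, header=header, rows=chunk,
        )
        resp = requests.post(WEBHOOK_URL, data=body,
                             headers={"Content-Type": "application/json"})
        resp.raise_for_status()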

  12. You can check whether the data has been loaded by logging in to ARIS.

Next Steps

Would you like to load a JSON file to a table in ARIS Process Mining? Follow the steps in the post:
Loading a Json file to a table in ARIS Process Mining with webMethods.io

Process_Mining_BatchUpdate.zip (260.4 KB)
formTableData_batchUpdate.zip (9.7 KB)
