When CSV is selected for the tns:contentType field in the CreateJob operation, the ResultList resource does not produce a result.
Summary: This article describes how to handle the stream response (CSV content) of the BatchResultInStream resource of the Salesforce Bulk Data Loader connector. BatchResultInStream is a replacement for the ResultList resource, which supports only the XML format.
- The user needs WmSalesforceProvider installed with the latest CloudStreams fix. By default, the latest version of the Salesforce Bulk Data Loader connector is enabled (e.g. v44 at the time of writing). If a lower version of the Salesforce Bulk Data Loader connector is needed, it can be enabled from the Integration Server Administrator page.
- Download the SalesforceCSVHandler package attached to this article.
- A .csv file with the data to be processed by the createBatch operation.
Steps required for a Batch Job:
Step 1. Create a valid connection for desired Salesforce Bulk Data Loader connector.
Step 2. Create a Job, using CreateJob resource.
Step 3. Create a Batch, using CreateBatch resource. You need to pass the .csv file as an input to this operation.
Step 4. Obtain the status of the batch using the batchStatus resource. If the status of the batch is "Completed", proceed with Step 5.
Step 5. Obtain the results of successful and unsuccessful records using the batchResultInStream resource.
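For reference, these connector resources correspond to requests against the Salesforce Bulk API 1.0 endpoints. The sketch below maps each step to its request path; the API version (44.0) and the exact mapping are assumptions based on the public Bulk API documentation, not on the connector's internals:

```python
API_VERSION = "44.0"  # illustrative; use the version matching your connector

def bulk_api_path(resource, job_id=None, batch_id=None):
    """Build the Bulk API 1.0 request path for each step of the article."""
    base = f"/services/async/{API_VERSION}/job"
    if resource == "CreateJob":
        return base                                        # POST jobInfo XML
    if resource == "CreateBatch":
        return f"{base}/{job_id}/batch"                    # POST text/csv body
    if resource == "batchStatus":
        return f"{base}/{job_id}/batch/{batch_id}"         # GET
    if resource == "batchResultInStream":
        return f"{base}/{job_id}/batch/{batch_id}/result"  # GET (CSV stream)
    raise ValueError(f"unknown resource: {resource}")
```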
To create a valid connection,
- In Integration Server Administrator, go to Solutions > CloudStreams > Providers
- Click the Salesforce.com provider name
- In the Connector Name column on the Connectors screen, click the name of the CloudStreams connector for which you want to create a connection
- On the Connections screen, click Configure New Connection
- On the Configure Connection screen, select the view in which you want to create the connection, then fill in the standard parameters such as Package, Folder Name, Connection Name, Username, and Password
- Click Save.
To create a Job,
- Open the Service Development perspective in Designer.
- Create a new cloud connector service (CCS) and choose the CreateJob operation.
- Provide the necessary inputs: content-type, headers, and the type of operation to be processed.
Ex: content-type as 'CSV', headers as 'text/csv', and operation as 'insert'. Execute the cloud connector service and save the response for further reference.
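For context, a CreateJob call ultimately sends a jobInfo XML document like the one built below to the Bulk API (the connector assembles this payload for you). This is a hedged sketch: the object name 'Account' is an illustrative assumption.

```python
from xml.etree import ElementTree as ET

def build_job_info(operation="insert", sobject="Account", content_type="CSV"):
    """Sketch of the Bulk API 1.0 jobInfo payload behind CreateJob.

    The namespace is the one documented for the Bulk API; 'Account' is
    only an example object.
    """
    ns = "http://www.force.com/2009/06/asyncapi/dataload"
    job = ET.Element("jobInfo", xmlns=ns)
    for tag, text in (("operation", operation),
                      ("object", sobject),
                      ("contentType", content_type)):
        ET.SubElement(job, tag).text = text
    return ET.tostring(job, encoding="unicode")

payload = build_job_info()
```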
Create a Batch, using the CreateBatch resource. You need to pass the .csv file as an input to this operation.
- In the Service Development perspective in Designer, create a new cloud connector service and choose the CreateBatch operation
- Provide the necessary inputs.
Header as 'text/csv' and the jobId from the CreateJob cloud connector service response above
To provide stream as input,
- Have the data to insert ready in a .csv or .txt file
- Right-click the folder in which you want the flow service to be created and select New > Flow Service > provide the name of the flow service > Finish
- Open the created flow service, right-click, and invoke the "pub.file:getFile" service
- Double-click the invoked service and, in the pipeline section, provide the full path of the .csv/.txt file as the filename input and set loadAs to stream
- Drag and drop the cloud connector service which creates batch.
- Map the inputs and outputs accordingly.
- Run the flow service and save the response.
Obtain the status of the batch using the operation 'batchStatus'.
Note: The batch must be processed completely. If the batch status is Queued or InProgress, the user needs to wait until the batch reaches a terminal state (Completed or Failed).
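The wait can be automated with a small polling loop; get_batch_status below is a hypothetical stand-in for a call to the batchStatus cloud connector service, and the state names follow the Bulk API batch lifecycle:

```python
import time

TERMINAL_STATES = {"Completed", "Failed", "Not Processed"}

def wait_for_batch(get_batch_status, job_id, batch_id,
                   interval_sec=5, max_polls=60):
    """Poll until the batch leaves Queued/InProgress, or give up."""
    for _ in range(max_polls):
        state = get_batch_status(job_id, batch_id)
        if state in TERMINAL_STATES:
            return state
        time.sleep(interval_sec)
    raise TimeoutError("batch did not finish within the polling budget")
```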
To get the result list of the processed batch, for successful and unsuccessful records,
- Create a cloud connector service, choose the operation 'batchResultInStream', and save the service.
- Create a Flow Service and invoke the following services: 'batchResultInStream', 'pub.io:streamToString', 'delimitedDataStringToDocument'
- Run the Flow Service and save the response.
- The response provides the records that were inserted successfully and those that failed.
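For reference, a Bulk API 1.0 batch result CSV has Id, Success, Created, and Error columns. The sketch below splits it into successful and failed records, similar in spirit to what the delimitedDataStringToDocument service produces; the sample rows are purely illustrative:

```python
import csv
import io

def parse_batch_result(result_csv):
    """Split a Bulk API batch result CSV into successes and failures."""
    rows = list(csv.DictReader(io.StringIO(result_csv)))
    ok = [r for r in rows if r["Success"].lower() == "true"]
    failed = [r for r in rows if r["Success"].lower() != "true"]
    return ok, failed

# Illustrative sample result; the Id value is fake.
sample = ('"Id","Success","Created","Error"\n'
          '"001xx0000000001AAA","true","true",""\n'
          '"","false","false","REQUIRED_FIELD_MISSING:Name missing"\n')
ok, failed = parse_batch_result(sample)
```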
Note: For more information, see https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/asynch_api_intro.htm