I need to run a loop around 50 times to return approximately 50K records and currently I am limited to 500 records per call. The workflow keeps terminating because of insufficient memory on the second loop. Can you please tell me if there is a way to optimize this workflow?
The container size to run a flow is currently set to 256 MB. Hence, we suggest processing the records in batches (e.g., 500 records per batch) at regular intervals, rather than in one long loop. You can use the ‘Clock’ trigger to set the time interval for batch processing, and the ‘Flow Store’ action to save the position of the most recently processed batch as a reference point for starting the next batch.
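As a rough illustration of this checkpoint-per-batch pattern, here is a minimal Python sketch. It is not platform code: the in-memory `flow_store` dict stands in for the ‘Flow Store’ action, the `while` loop stands in for repeated ‘Clock’ trigger firings, and `fetch_records` is a hypothetical stub for the 500-records-per-call API. The numbers (50,000 records, 500 per batch) are taken from the question.

```python
TOTAL_RECORDS = 50_000   # approximate total, per the question
BATCH_SIZE = 500         # per-call record limit from the question

# Stand-in for the 'Flow Store' action: persists progress between runs.
flow_store = {"last_offset": 0}

def fetch_records(offset, limit):
    """Hypothetical stub for the 500-record-per-call API."""
    end = min(offset + limit, TOTAL_RECORDS)
    return list(range(offset, end))  # record IDs as placeholders

def run_one_batch():
    """One scheduled run: process a single batch, then checkpoint.

    Because each invocation holds only BATCH_SIZE records in memory,
    the flow stays well under the 256 MB container limit instead of
    accumulating all ~50K records in one execution.
    """
    offset = flow_store["last_offset"]
    if offset >= TOTAL_RECORDS:
        return None  # nothing left; all batches processed
    batch = fetch_records(offset, BATCH_SIZE)
    # ... process `batch` here ...
    flow_store["last_offset"] = offset + len(batch)  # save checkpoint
    return len(batch)

# Simulate the 'Clock' trigger firing on an interval until done:
runs = 0
while run_one_batch() is not None:
    runs += 1
```

Each run is independent: if the flow terminates mid-way, the saved offset in the store lets the next scheduled run resume from the last completed batch rather than restarting from zero.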