Power Automate, a cloud-based automation tool, empowers users to streamline their workflows and processes. Among its array of controls, the “Apply to each” control stands out as a versatile tool for looping through sets of records.
By default, “Apply to each” loops execute sequentially, which can lead to long run times on large datasets. To address this, we can enable Concurrency Control, which lets us set the degree of parallelism so that multiple iterations run at the same time.
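Power Automate doesn’t expose this behavior as code, but the idea can be sketched in Python, where `ThreadPoolExecutor(max_workers=25)` plays the role of the degree-of-parallelism setting. The item count and per-item latency below are made up for illustration:

```python
import time
from concurrent.futures import ThreadPoolExecutor

items = list(range(150))  # stand-in for the ~150 items in 'List 1'

def update_item(item):
    """Stand-in for one 'Update item' call (~10 ms of simulated I/O)."""
    time.sleep(0.01)
    return item

# Sequential pass: how "Apply to each" behaves with concurrency off.
start = time.perf_counter()
sequential = [update_item(i) for i in items]
seq_time = time.perf_counter() - start

# Concurrent pass: 25 workers, like a degree of parallelism of 25.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=25) as pool:
    concurrent = list(pool.map(update_item, items))
par_time = time.perf_counter() - start

print(f"sequential: {seq_time:.2f}s, concurrent: {par_time:.2f}s")
```

Because the work is I/O-bound (waiting on a service, not computing), running 25 iterations at once cuts the total wait dramatically, which is the same reason the setting helps in Power Automate.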
Prerequisites:
- Basic knowledge of Power Automate
- Familiarity with list management in SharePoint
- A Microsoft Power Automate license-enabled account
Consider a scenario where we need to update items in a SharePoint list called ‘List 2’ by iterating through another SharePoint list named ‘List 1,’ containing approximately 150 items. We’ll create a straightforward Power Automate scheduled flow to retrieve items from ‘List 1’ and update the corresponding entries in ‘List 2.’
Running the Flow without Concurrency:
Running the flow without enabling concurrency control takes approximately 1 minute and 28 seconds to sequentially process all items in ‘List 1.’
Enabling Concurrency Control for a Loop:
To enable concurrency, follow these steps:
- Click the three dots in the “Apply to each” control.
- Select “Settings.”
- By default, the Concurrency Control will be turned off. Toggle it on.
- You’ll see a slider to adjust the degree of parallelism, with a default limit of 1 and a range of 1 to 50.
- Set the degree of parallelism to 25, allowing multiple loops to be processed simultaneously in batches, effectively reducing the execution time of the flow.
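Under the hood, a Power Automate flow is stored as a JSON workflow definition (visible via “Peek code” on an action). With concurrency enabled, the “Apply to each” action carries a `runtimeConfiguration` block along these lines; the action name, source action, and empty `actions` object here are illustrative:

```json
"Apply_to_each": {
    "type": "Foreach",
    "foreach": "@outputs('Get_items')?['body/value']",
    "actions": {},
    "runtimeConfiguration": {
        "concurrency": {
            "repetitions": 25
        }
    }
}
```

The `repetitions` value corresponds directly to the slider position in the settings dialog.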
Running the Flow with Concurrency:
After enabling concurrency, the execution time drops from roughly 1.5 minutes to just 8 seconds.
Note: Concurrency control may not be suitable if the loop must process items in a specific order, since parallel iterations can complete in an unpredictable sequence.
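The ordering caveat is easy to demonstrate with the same Python analogy (again a sketch, not Power Automate itself): with parallel workers, items finish in whatever order their individual latencies allow, not in list order.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

def update_item(i):
    # Simulated variable per-item latency; real service calls vary similarly.
    time.sleep(random.uniform(0, 0.05))
    return i

completion_order = []
with ThreadPoolExecutor(max_workers=25) as pool:
    futures = [pool.submit(update_item, i) for i in range(25)]
    for future in as_completed(futures):
        completion_order.append(future.result())

# All 25 items complete, but rarely in the original 0..24 order.
print(completion_order)
```

If a later item depends on the result of an earlier one, keep the loop sequential.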
Enabling concurrency control in Power Automate significantly improves the execution time of loops, making it a valuable feature for processing large datasets efficiently. This optimization can save time and enhance the overall performance of your automated workflows.