Changing the number of retries for SAVE operations in batch and real-time data flow runs

Control how many times batch and real-time data flow runs retry SAVE operations on records. With automatic retries, a run whose SAVE operation fails can still complete successfully if the initially unavailable resources become operational again. The run fails only when all the retries are unsuccessful.

You can control the global number of retries for SAVE operations through a dedicated dynamic system setting. To change that setting for an individual batch or real-time data flow run, update a property through the integrated API.

For Merge and Compose shapes, if a single record fails, the entire batch run fails.

Retries trigger lifecycle events. For more information, see Event details in data flow runs on Pega Community.
  1. In the navigation pane of Dev Studio, click Records > SysAdmin > Dynamic System Settings.

  2. In the list of instances, search for and open the dataflow/shape/maxRetries dynamic system setting.

  3. On the dynamic system setting editing tab, in the Value field, enter the number of retries to perform when a SAVE operation on a record fails during a data flow run.

    The default value is 5.
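The retry semantics described above can be sketched as a simple loop. This is an illustrative sketch only, not Pega's implementation; the `save_record` callable and `max_retries` parameter are assumed names for illustration:

```python
def save_with_retries(save_record, record, max_retries=5):
    """Attempt a SAVE operation, retrying up to max_retries times.

    Mirrors the documented behavior: the run fails only when the
    initial attempt and all retries are unsuccessful.
    (Hypothetical sketch; not Pega's actual implementation.)
    """
    attempts = 1 + max_retries  # one initial try plus the configured retries
    last_error = None
    for _ in range(attempts):
        try:
            return save_record(record)
        except IOError as err:  # e.g. the destination is temporarily unavailable
            last_error = err
    # All retries exhausted: the SAVE operation, and thus the run, fails.
    raise RuntimeError("SAVE failed after all retries") from last_error
```

For example, with the default of 5 retries, a destination that recovers on the third attempt still lets the run complete; a destination that never recovers causes the run to fail after six attempts in total.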
If you want to change that setting for a single batch data flow run, update the pyResilience.pyShapeMaxRetries property on the RunOptions page for the run through the integrated API. For more information, see Integrating with Pega APIs and services.
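As a rough illustration, the per-run override could be expressed as part of a run-options payload. Only the pyResilience.pyShapeMaxRetries property name comes from this article; the surrounding payload structure below is an assumption for illustration, not the documented RunOptions schema:

```python
import json

# Hypothetical run-options fragment for a single data flow run.
# Only the "pyResilience" / "pyShapeMaxRetries" names are taken from this
# article; the nesting shown here is an illustrative assumption.
run_options = {
    "pyResilience": {
        "pyShapeMaxRetries": 10,  # override the global DSS value for this run only
    },
}

payload = json.dumps(run_options)
```

Serializing the page as JSON here is only for illustration; consult the integrated API documentation for the actual request shape.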