Specializing activities

Use the Call instruction with the Data-Decision-DDF-RunOptions.pyPreActivity and Data-Decision-DDF-RunOptions.pyPostActivity activities to define which activities run before and after batch or real-time data flow runs that are not single-case runs. Use these activities to prepare your data flow run and to perform actions when the run ends. Pre-activities run before assignments are created. Post-activities start at the end of the data flow, regardless of whether the run finishes, fails, or stops. Both pre- and post-activities run only once and are associated with the data flow run.
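
As a conceptual sketch only (derived from the description above, not from a rule form or run log), the two activities bracket a run in this order:

    pyPreActivity     runs once, before assignments are created; can read and adjust
                      the run configuration on the RunOptions page
          |
    data flow run     batch or real-time; finishes, fails, or stops
          |
    pyPostActivity    runs once at the end of the data flow, regardless of whether the
                      run finished, failed, or stopped; receives the RunOptions page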

  1. In the Dev Studio navigation panel, create an instance of the Activity rule by clicking Records > Technical > Activity.

  2. In the activity steps, enter one of the following methods:

    • Call Data-Decision-DDF-RunOptions.pyPreActivity - Runs an activity before the data flow run. The activity must be defined in the Applies To class of the data flow, and it can use other methods to manipulate the run, for example, to retrieve progress information or to stop the data flow run.
    • Call Data-Decision-DDF-RunOptions.pyPostActivity - Runs an activity after the data flow run. The activity must be defined in the Applies To class of the data flow.

      The status of the data flow run does not constrain how the Data-Decision-DDF-RunOptions.pyPostActivity activity runs; the activity runs even if the data flow run failed or stopped. The data flow engine passes the RunOptions page parameter, which contains the current run configuration page, to the activity. The activity cannot change this configuration.

  3. Click the arrow to the left of the Method field to expand the method and specify its parameters.

    • In the Data-Decision-DDF-RunOptions.pyPreActivity activity, set Param.SkipRun="true" to skip the rest of the run. You can also achieve the same result with Call Data-Decision-DDF-RunOptions.pxStopRunById. The data flow engine passes the RunOptions page parameter, which contains the current run configuration page, to the activity, and the activity can change this configuration. If the activity fails, the data flow engine does not run the data flow and the run is marked as failed. For an illustrative layout of such a pre-activity, see the sketch after this procedure.
  4. Click Save.
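
As an illustration, the following minimal sketch shows how a specialized Data-Decision-DDF-RunOptions.pyPreActivity could be laid out on the Activity rule form. Param.SkipRun and pxStopRunById come from the procedure above; the use of the Property-Set method and the SkipThisRun precondition are illustrative assumptions, not part of a shipped rule.

    Activity: pyPreActivity    (defined in the Applies To class of the data flow)

    Step  Method                                            Purpose
    1     Property-Set                                      Set Param.SkipRun = "true" so that the rest of
                                                            the run is skipped; guard the step with a
                                                            precondition, for example a hypothetical
                                                            when rule named SkipThisRun.
    2     Call Data-Decision-DDF-RunOptions.pxStopRunById   Alternative to step 1: stop the run by its ID.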

  • Types of data flows

    Data flows are scalable data pipelines that you can build to sequence and combine data based on various data sources. Each data flow consists of components that transform data and enrich data processing with business rules.

  • Data flow methods

    Data flows can be run, monitored, and managed through a rule-based API. Data-Decision-DDF-RunOptions is the container class for the API rules and provides the properties required to programmatically configure data flow runs. Additionally, the DataFlow-Execute method allows you to perform a number of operations that depend on the design of the data flow that you invoke.

  • Calling another activity
  • Keystores

    A keystore is a file that contains keys and certificates that you use for encryption, authentication, and serving content over HTTPS. In Pega Platform, you create a keystore data instance that points to a keystore file.

  • Decision Management methods
