
Monitoring the predictive performance of a model

Updated on September 13, 2021

Capture customer interactions and analyze the performance of a predictive model by using a robust set of metrics to improve your customer experience.

Use case

uPlusTelco wants to improve its customer support experience by predicting the reason for each customer call. To achieve that goal, the data analytics team built a predictive model and uploaded it to Prediction Studio. A system architect created a decision strategy with that model, deployed that strategy in a decision data flow, and then created a response data flow with a strategy that references the Predict Call Context model.

As a data scientist, your responsibility is to gather responses by running the decision and response data flows for customer support interactions, and then analyze the resulting monitoring data.

Before you begin

Create a response strategy that defines the .pyPrediction and .pyOutcome properties of the model that you want to monitor. See Creating a response strategy.
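Conceptually, each monitored response pairs the value that the model predicted with the outcome that was observed afterwards. The following minimal Python sketch illustrates that pairing only; it is not Pega API code, and apart from the pyPrediction and pyOutcome property names, the record structure and values are assumed for illustration.

    # Minimal sketch (not Pega API code): a monitored response pairs the value that
    # the model predicted (pyPrediction) with the outcome that was observed later
    # (pyOutcome). The call-reason values shown here are illustrative only.
    response_record = {
        "pyPrediction": "BillingQuestion",   # what the Predict Call Context model predicted
        "pyOutcome": "TechnicalIssue",       # the call reason that was actually recorded
    }

    # Monitoring compares these two fields across many responses to measure how
    # often the prediction matched the observed outcome.
    is_correct = response_record["pyPrediction"] == response_record["pyOutcome"]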

Gathering customer interactions

To run predictive analytics monitoring on a live system, run the decision and response data flows. Running these flows collects customer interaction data for monitoring and applies the decision and response strategies to analyze that data; a conceptual sketch of what the two flows accomplish follows the steps below.

  1. Run the decision data flow:
    1. In Dev Studio, click Records > Data Model > Data Flow.
    2. On the list of the Data Flow rule instances, locate and click MonitorMyPredictiveModel.
    3. On the data flow tab, click Actions > Run.
    4. In the data flow test run dialog box, click Start.
    5. Wait until the process completes and close the dialog box.
  2. Run the response data flow:
    1. On the list of the Data Flow rule instances, locate and click SetResponsesToMonitorMyModels.
    2. On the data flow tab, click Actions > Run.
    3. In the data flow test run dialog box, click Start.
    4. Wait until the process completes and close the dialog box.
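The following Python sketch shows, conceptually, what the two runs accomplish: the decision data flow records a prediction for each customer interaction, and the response data flow later attaches the observed outcome so that each prediction/outcome pair can be monitored. The function and variable names are hypothetical; this is not the implementation of MonitorMyPredictiveModel or SetResponsesToMonitorMyModels.

    # Conceptual sketch only; in Pega this work is performed by the
    # MonitorMyPredictiveModel and SetResponsesToMonitorMyModels data flows.

    def run_decision_flow(interactions, predict_call_context):
        # Record what the model predicted for each customer interaction.
        return [
            {"interaction_id": i["id"], "pyPrediction": predict_call_context(i)}
            for i in interactions
        ]

    def run_response_flow(decisions, observed_outcomes):
        # Attach the call reason that was actually observed to each decision,
        # producing the prediction/outcome pairs that the Monitor tab reports on.
        return [
            {**d, "pyOutcome": observed_outcomes[d["interaction_id"]]}
            for d in decisions
            if d["interaction_id"] in observed_outcomes
        ]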

Analyzing the predictive performance of a model

After gathering customer interactions, use different charts and reports to verify the predictive performance of your model.

  1. In the navigation panel of Prediction Studio, click Predictions.
  2. In the Predictions work area, click the My Predict Call Context model.
  3. On the predictive model page, open the Monitor tab and click Refresh data.
  4. Specify the time span in which you want to analyze the model by selecting the Time range and Timeframe options, for example:

    Time range: All time

    Timeframe: Week

  5. Analyze the predictive metrics:
    1. In the Performance area, verify how accurately your model predicted the outcomes in the specified time span, compared with the expected performance value.
      Sample F-score performance chart

    2. In the Total responses area, analyze the number of responses that were gathered in the specified time.
      Sample chart with the total number of responses

    3. In the Performance area, click Show confusion matrix and analyze a contingency table of the actual outcomes versus the predicted outcomes (in percentages or in the number of responses). For a worked example of how these counts translate into performance metrics, see the sketch after this procedure.
      Sample confusion matrix

    4. Optional: To store the confusion matrix data offline for further analysis, click Export data and save the .csv file.

      For more information on how to interpret the data and on predictive performance metrics for other model types, see Metrics for measuring predictive performance.
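To make those metrics concrete, the following Python sketch computes a confusion matrix and per-class precision, recall, and F-score from a set of prediction/outcome pairs such as the ones gathered above. It applies the standard textbook formulas and is not necessarily the exact calculation that Prediction Studio performs; the call-reason labels and counts are invented for the example.

    from collections import Counter

    # Hypothetical (predicted, actual) call-reason pairs gathered from responses.
    pairs = [
        ("Billing", "Billing"), ("Billing", "Outage"), ("Outage", "Outage"),
        ("Upgrade", "Upgrade"), ("Outage", "Billing"), ("Billing", "Billing"),
    ]

    # Confusion matrix: counts of (actual, predicted) combinations.
    matrix = Counter((actual, predicted) for predicted, actual in pairs)
    labels = sorted({label for pair in pairs for label in pair})

    for label in labels:
        tp = matrix[(label, label)]                                   # correctly predicted as label
        fp = sum(matrix[(a, label)] for a in labels if a != label)    # predicted as label, but was not
        fn = sum(matrix[(label, p)] for p in labels if p != label)    # was label, predicted as something else
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f_score = (2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
        print(f"{label}: precision={precision:.2f} recall={recall:.2f} F-score={f_score:.2f}")

Averaging the per-class F-scores gives a single figure that is comparable in spirit to the F-score shown in the Performance chart, and the matrix counts correspond to the cells of the confusion matrix that you can export as a .csv file.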

Conclusion

You have imported a PMML model, created a strategy that uses that model to make predictions, captured responses, and analyzed the predictive performance of your model. With these activities complete, you now have a baseline for further analysis of the model's accuracy.

What to do next

If you are not satisfied with the performance of the model, create or import a new one to verify whether the predictions it makes help you drive your business results and fulfill your commitments to customers. For more information, see Predictive models monitoring.

To view the main process outline for this tutorial, see Monitoring predictive models.

