Reviewing adaptive model performance
Evaluate adaptive models regularly to verify that they work correctly and to continuously improve the value they bring.
Pega Marketing uses self-learning adaptive models to predict the behavior of customers and select the most relevant action for each customer at each point in time. You can monitor the performance of adaptive models by using the out-of-the-box capabilities of Prediction Studio.
Understanding adaptive models
Strategies, such as the Next-Best-Action strategy that you generate with Next-Best-Action Designer, use adaptive models to identify the propositions that your customers are most likely to accept, improve customer acceptance rates, and predict other customer behavior. For example, the Next-Best-Action strategy ranks the available actions by using the Action Propensity model to calculate the likelihood that a customer will accept a specific action.
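As a conceptual sketch of the ranking step described above: given a propensity score per action, the strategy orders actions from most to least likely to be accepted. The action names and scores below are hypothetical, and the real propensities come from the Action Propensity adaptive model, not from a hard-coded dictionary.

```python
# Hypothetical propensities per action; in Pega Marketing these values
# come from the Action Propensity adaptive model.
propensities = {
    "premium_card": 0.12,
    "travel_insurance": 0.31,
    "phone_upgrade": 0.07,
}

# Rank available actions from most to least likely to be accepted.
ranked = sorted(propensities, key=propensities.get, reverse=True)
print(ranked)  # ['travel_insurance', 'premium_card', 'phone_upgrade']
```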
Adaptive models work by recording all customer responses (both positive and negative) and correlating them with different customer details (for example, age, gender, or region). For example, if ten people under 35 years of age accept a particular phone offer, the predicted likelihood that more people under 35 years of age will buy the same phone increases. The likelihood can also decrease if a negative response is recorded from this group. Over time, reliable correlations emerge.
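The response-recording idea can be illustrated with a toy tally of positive and negative responses per action and customer segment. This is a simplified sketch only; Pega's actual adaptive models use more sophisticated statistics than a raw acceptance ratio, and the class and names below are hypothetical.

```python
from collections import defaultdict

class ResponseTally:
    """Toy illustration: correlate responses with a customer segment."""

    def __init__(self):
        # (action, segment) -> [positive responses, total responses]
        self.counts = defaultdict(lambda: [0, 0])

    def record(self, action, segment, accepted):
        entry = self.counts[(action, segment)]
        entry[1] += 1          # count every response
        if accepted:
            entry[0] += 1      # count positive responses separately

    def likelihood(self, action, segment):
        positives, total = self.counts[(action, segment)]
        return positives / total if total else 0.0

tally = ResponseTally()
for _ in range(10):                               # ten under-35 customers accept
    tally.record("phone_offer", "under_35", True)
print(round(tally.likelihood("phone_offer", "under_35"), 2))  # 1.0

tally.record("phone_offer", "under_35", False)    # one negative response
print(round(tally.likelihood("phone_offer", "under_35"), 2))  # 0.91
```

A single negative response from the group lowers the predicted likelihood, matching the behavior described above.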
The Next-Best-Action strategy framework references a number of out-of-the-box adaptive models to select the most relevant actions and treatments. For more information, see the following topics:
Monitoring adaptive model performance
Use Prediction Studio to monitor how successful an adaptive model is at predicting customer behavior. This is especially useful for new adaptive models that you created, but you can also use it to increase your understanding of the built-in Next-Best-Action models.
- Log in to Pega Marketing as an operator with access to Prediction Studio.
- In Prediction Studio, click Models.
- Find and open an adaptive model, for example, Web_Click_Through_Rate.
- On the Monitoring tab, review the information available in the chart. Each bubble in the chart represents the model performance for a specific action. Hover your mouse over a bubble to view the results for this action:
- - How good the model is at predicting the outcome. Model performance is expressed as Area Under the Curve (AUC), a measurement with a range between 50 and 100. The higher the AUC, the better the model is at predicting outcomes.
- - The percentage of positive responses to the action, calculated by dividing the number of positive responses by the total number of responses.
- - The total number of positive and negative responses.
- For more detailed information about the performance of each action, refer to the table under the bubble chart.
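The success-rate figure in the chart is simply the number of positive responses divided by the total number of responses. A minimal sketch, using hypothetical response counts (the real values are reported by Prediction Studio):

```python
# Hypothetical response counts for one action.
positive_responses = 120
negative_responses = 2880
total_responses = positive_responses + negative_responses

# Percentage of positive responses, as shown per bubble in the chart.
success_rate = positive_responses / total_responses * 100
print(f"Responses: {total_responses}, success rate: {success_rate:.1f}%")
# Responses: 3000, success rate: 4.0%
```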