Updating active models in predictions
As a data scientist, you can approve changes to models that are used in predictions for deployment to the production environment. You can update a model on your own initiative, or in response to a Prediction Studio notification that a prediction does not generate enough lift.
To improve the performance of a prediction, you can replace a low-performing model with a high-accuracy external model that you upload to a Pega repository or directly to Prediction Studio. Uploading a candidate model starts a standard approval and validation process that deploys the model update to production. Before you approve any changes, you can compare the candidate model with the existing model based on data science metrics, such as score distribution or lift. For more information, see Active and candidate model comparison charts.
Managing model updates
In your Business Operations Environment (BOE), you can start and manage the model update process from Prediction Studio or remotely by using the Prediction Studio API. For more information about using the API endpoints, see Updating active models in predictions through API.
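For example, a remote update might be triggered with a single REST call from a script. The following sketch builds such a request; the base URL, endpoint path, payload fields, and identifiers shown here are assumptions for illustration only, so check the Prediction Studio API reference for the exact contract in your Pega version.

```python
# Sketch of starting a model update through the Prediction Studio API.
# The endpoint path, payload shape, and identifiers are hypothetical;
# consult the API documentation for your Pega version before use.
import json
from urllib.parse import urljoin

BASE_URL = "https://boe.example.com/prweb/api/PredictionStudio/v1/"  # hypothetical

def build_model_update_request(prediction_id: str, model_id: str,
                               candidate_source: str) -> dict:
    """Assemble the HTTP request that would replace a model in a prediction."""
    return {
        "method": "PATCH",
        "url": urljoin(BASE_URL, f"predictions/{prediction_id}/models/{model_id}"),
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"candidateModel": candidate_source}),
    }

# Example: propose an uploaded PMML file as the candidate model.
request = build_model_update_request("CDH-Churn", "RiskModel",
                                     "file://models/churn_v2.pmml")
print(request["url"])
```

A scheduler or CI job could send this request with any HTTP client; the approval and validation flow in Prediction Studio then proceeds as described above.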
In Pega Customer Decision Hub environments, changes to models that you approve in Prediction Studio are deployed to production through Pega 1:1 Operations Manager and the Business Change pipeline.
If you use decisioning models for other purposes, such as customer service or sales automation, components such as Pega 1:1 Operations Manager and Revision Manager are not present. In that case, a system architect merges the branch that contains the model update into an application ruleset that can be deployed to production.
For more information, see Understanding the model update process.
To replace a model in a prediction, and then deploy the model to production, perform the following procedures:
- Model update prerequisites
To start using the model update feature in your system, configure the appropriate access rights for data scientist operators, update Prediction Studio settings, and ensure that your Business Change pipeline meets the requirements.
- Replacing models in predictions
Improve a prediction by replacing a low-performing model with a high-accuracy model. You can also replace a model with a scorecard or a field in the data model that contains a score.
- Evaluating candidate models
After you add a candidate model to a prediction, Prediction Studio configures and validates the new model, and provides comparison data to help you evaluate the new model. Decide whether you want to approve the new model for deployment to production or reject it.
- Promoting shadow models
A shadow model runs alongside an active model. Both models receive production data and generate outcomes, but the outcomes of the shadow model are not used to make business decisions. Check how the shadow model performs over time, and if the model proves better suited to your business needs, promote it to become the active model.
- Rejecting shadow models
A shadow model runs in your production environment alongside an active model. The shadow model receives production data and generates outcomes, but does not impact your business decisions. The system tracks the outcomes to help you evaluate how the shadow model performs in production. If the model is not suitable for your needs, you can reject the model, and then replace it with a different one.