Connecting to topic models through an API
Broaden your selection of topic models in Pega Platform by connecting to custom models through an API. Train and deploy your topic models, and expose an API endpoint to allow Pega Platform to interact with the models.
To help you serve your models through an API, the Pega GitHub repository provides sample Docker containers that you can use to train and deploy your models.
In Dev Studio, configure the OAuth 2.0 authentication profile. For more information, see Creating an authentication profile.
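With an OAuth 2.0 client-credentials setup, Pega Platform exchanges the client ID and secret from the authentication profile for an access token before calling your model endpoint. The following is a minimal sketch of such a token request using only the Python standard library; the authorization server URL and credentials are hypothetical placeholders, not values from the Pega sample containers:

```python
import base64
from urllib import parse, request

# Hypothetical authorization server; replace with the token endpoint
# configured in your OAuth 2.0 authentication profile.
TOKEN_URL = "https://auth.example.com/oauth2/token"

def build_token_request(client_id, client_secret):
    """Build an OAuth 2.0 client-credentials token request.

    The client ID and secret are sent as an HTTP Basic Authorization
    header, and the grant type goes in the form-encoded body.
    """
    credentials = base64.b64encode(
        f"{client_id}:{client_secret}".encode()
    ).decode()
    return request.Request(
        TOKEN_URL,
        data=parse.urlencode({"grant_type": "client_credentials"}).encode(),
        headers={
            "Content-Type": "application/x-www-form-urlencoded",
            "Authorization": f"Basic {credentials}",
        },
        method="POST",
    )

# To actually fetch a token, send the request and parse the JSON response:
#   with request.urlopen(build_token_request("my-id", "my-secret")) as resp:
#       token = json.load(resp)["access_token"]
```

The access token returned by the server is then attached as a Bearer header on calls to the model API.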
Deploy your topic model and expose an API endpoint to allow Pega Platform to interact with the model.
You can deploy a Python model by using sample Docker containers. For more information, see Configuring sample containers to use Python models for topic detection.
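The deployed container only needs to expose an HTTP endpoint that accepts text and returns detected topics. As a minimal sketch of such a service, not the Pega sample container itself, the following Python server uses only the standard library and a trivial keyword lookup as a stand-in for a trained topic model:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical stand-in for a trained topic model; a real container would
# load a serialized model (for example, with pickle or joblib) at startup.
TOPIC_KEYWORDS = {
    "billing": ["invoice", "payment", "charge"],
    "shipping": ["delivery", "package", "tracking"],
}

def predict_topics(text):
    """Return the topics whose keywords appear in the input text."""
    lowered = text.lower()
    return [
        topic
        for topic, keywords in TOPIC_KEYWORDS.items()
        if any(word in lowered for word in keywords)
    ]

class TopicHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Expect a JSON body such as {"text": "Where is my package?"}.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        topics = predict_topics(payload.get("text", ""))
        body = json.dumps({"topics": topics}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve the model, listen on all interfaces so the endpoint is
# reachable from outside the container, and map the port in your
# Docker configuration:
#   HTTPServer(("0.0.0.0", 8080), TopicHandler).serve_forever()
```

The request and response shapes here are illustrative; match them to whatever contract you define in the machine learning service connection.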
In Prediction Studio, define a machine learning service to connect to topic models through an API. For more information, see Configuring a machine learning service connection for topic models using REST API.
In Prediction Studio, create a text categorization model using the new service connection. For more information, see Creating a text categorization model to run topic models through an API.
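Once the service connection and text categorization model are in place, Pega Platform scores text by sending an authenticated POST request to your endpoint. The exact contract depends on your model service; the following generic sketch, with a hypothetical endpoint URL and payload shape, shows the kind of request a client sends to the deployed model:

```python
import json
from urllib import request

# Hypothetical model endpoint; Pega Platform makes the equivalent call
# through the machine learning service connection.
ENDPOINT = "https://models.example.com/topics/predict"

def build_scoring_request(text, access_token):
    """Build an authenticated JSON scoring request for the model endpoint.

    The OAuth 2.0 access token obtained via the authentication profile
    is attached as a Bearer header.
    """
    return request.Request(
        ENDPOINT,
        data=json.dumps({"text": text}).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {access_token}",
        },
        method="POST",
    )

# To send the request and read the predicted topics:
#   with request.urlopen(build_scoring_request("Where is my package?", token)) as resp:
#       topics = json.load(resp)["topics"]
```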