Expert Circle Webinar - Pega Deployment Manager Business Change & Data Migration Pipelines: What We Learned in 45 Minutes
On March 18th, 2026, Pega's as-a-Service Expert Circle hosted a packed, 45-minute deep dive into one of the most operationally critical areas of any Customer Decision Hub (CDH) deployment: Business Change Pipelines and Data Migration Pipelines. If you missed it live, or simply want to revisit the highlights, the recording will be available shortly on the Pega as-a-Service Expert Circle site.
Speakers:
The session was led by Tihomir Petrovic (Principal System Architect), joined by Madhuri Vasa (Product Manager, Deployment Manager), Girish Kamath (Product Manager, 1:1 Business Value & Insights), and Alex Burt (Director of Product Management, 1:1 Customer Engagement).
Why Business Operations Matters
CDH is an AI-powered, real-time decision engine capable of delivering personalized, omnichannel customer interactions in under 200 milliseconds. To keep that engine running smoothly, teams need a structured "Business Operations" cycle, covering planning, building, testing, optimizing, deploying, and monitoring. The two pipeline types discussed in this session are at the heart of that cycle.
Business Change Pipelines: BAU Without the Bottlenecks
Business Change Pipelines (BCPs) are designed for everyday, business-as-usual updates to CDH applications (think action and treatment updates) without waiting on full enterprise release cycles. They operate on an overlay application, which acts as an abstraction layer on top of the main enterprise application, keeping changes scoped and controlled.
The typical workflow moves changes from the Business Operations Environment (BOE) to the Development environment for merging, then packages and promotes them all the way to Production. Key best practices shared during the session:
- Use only one pipeline per overlay application
- Ensure version alignment of Pega Infinity, CDH, and Deployment Manager across all environments
- Avoid using BCPs for fundamental changes such as modifying the Context Dictionary
Data Migration Pipelines: Testing with Real-World Data
Data Migration Pipelines (DMPs) solve a persistent challenge: how do you give marketing and operations teams meaningful test data without exposing full production datasets? The answer is automated, scheduled data migration: Deployment Manager instructs Production to export a sample (up to 20% of customer data), stores it in a shared repository, and then instructs the BOE to import it.
Migrated data types include customer data samples, interaction history summaries, adaptive model factory data, and Scenario Planner Actuals used by discovery tools. Best practices highlighted:
- Always use sample data, never the full production set
- Manage migration artifacts in a dedicated ruleset
- Automate the pipeline on a regular cadence (daily or weekly)
Pega Cloud Considerations
For teams running Deployment Manager in Pega Cloud, there are specific requirements to be aware of when migrating sample customer data. These considerations were covered during the session and are critical for architects and Cloud Ops administrators managing multi-environment deployments.
Lively Panel Discussion
The session closed with an engaging panel Q&A. Topics ranged from branch management and deployment ID handling to access group behavior after live deployments and the future role of AI agents in managing release lifecycles. The full Q&A, with written summaries of all answers, will be published separately on the Expert Circle site.
Join the Conversation
Have more questions about PDM pipelines in your CDH environment? Head over to the Q&A discussion post on the Expert Circle and drop your question in the comments; the panelists are keeping an eye on it!
And if you're not yet part of the Pega as-a-Service Expert Circle, now is the perfect time to join. The community runs regular webinars like this one, covering everything from CI/CD best practices to Pega Cloud operations. Join the Expert Circle here and stay ahead of the curve.