Support Article
Database-Saver-UpdateInsertfail error when processing large data volumes
SA-42752
Summary
A Database-Saver-UpdateInsertfail exception in the logs prevents agents from processing large volumes of records.
Error Messages
com.pega.pegarules.pub.database.DatabaseException: Database-Saver-UpdateInsertfail
pyCommitError: Database save failed: Tried an update and then tried an insert
Steps to Reproduce
- Create an agent
- Run the agent against a large number of records
Root Cause
PEGA0040 alerts are logged when the size of the pzPVStream BLOB exceeds the size defined for the database column. The logs show BLOBs of up to 12 MB being inserted, while the column accepts less than 12 MB.
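To confirm the mismatch, compare the largest BLOB actually written against the declared column size. The queries below are a hypothetical diagnostic, assuming IBM Db2 (where BLOB columns carry an explicit maximum size); PR_SYS_QUEUES is a stand-in for whichever table the agent writes to, so substitute your own table name:

    -- Largest pzPVStream BLOB currently stored, in bytes
    SELECT MAX(LENGTH(pzPVStream)) AS max_blob_bytes
      FROM PR_SYS_QUEUES;

    -- Declared maximum size of the pzPVStream column, in bytes
    SELECT LENGTH AS declared_max_bytes
      FROM SYSCAT.COLUMNS
     WHERE TABNAME = 'PR_SYS_QUEUES'
       AND COLNAME = 'PZPVSTREAM';

If the first value exceeds the second, the insert fails and the exception above is logged.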
Resolution
Perform the following local-change: add the setting below to prconfig.xml to compress the BLOB before it is saved to the database:
<env name="compatibility/deflatestreams" value="true" />
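For context, the setting sits inside the top-level pegarules element of prconfig.xml. The following is a minimal sketch of the file; keep any entries your installation already defines alongside it:

    <?xml version="1.0" encoding="UTF-8"?>
    <pegarules>
        <!-- Compress the pzPVStream BLOB before it is written to the database -->
        <env name="compatibility/deflatestreams" value="true" />
    </pegarules>

Restart the server after editing prconfig.xml, as changes to this file take effect only on startup.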
Published September 7, 2017 - Updated October 8, 2020