This content has been archived and is no longer being updated. Links may not function; however, this content may be relevant to outdated versions of the product.

Support Article

Database-Saver-UpdateInsertfail error when processing large volumes of data

SA-42752

Summary



A Database-Saver-UpdateInsertfail exception in the logs prevents agents from processing large volumes of data.



Error Messages



com.pega.pegarules.pub.database.DatabaseException: Database-Saver-UpdateInsertfail
pyCommitError: Database save failed: Tried an update and then tried an insert




Steps to Reproduce

  1. Create an agent
  2. Run the agent for a large number of records



Root Cause



PEGA0040 alerts are logged when the size of the pzPVStream BLOB exceeds the size defined for the database column. The logs show that up to 12 MB is being inserted into the database, while the column accepts less than 12 MB.



Resolution

Perform the following local change:

Add the following setting to prconfig.xml to compress the BLOB before it is saved to the database:

<env name="compatibility/deflatestreams" value="true" />
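For reference, a minimal sketch of where this entry might sit inside prconfig.xml is shown below. The surrounding elements are illustrative only; any other settings already present in your file should be kept as they are.

<?xml version="1.0" encoding="UTF-8"?>
<pegarules>
    <!-- Compress the pzPVStream BLOB before it is written to the database -->
    <env name="compatibility/deflatestreams" value="true" />
</pegarules>

Changes to prconfig.xml normally take effect only after the application server node is restarted.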

Published September 7, 2017 - Updated October 8, 2020
