PEGA0016 alert: Cache reduced to target size
The PEGA0016 alert is triggered when the number of entries in a cache exceeds its target size (calculated as a count of entries against a configured limit value). The alert indicates that entries in the named cache are being invalidated (and perhaps reloaded later) by the system.
This alert relates to caches on the server, each of which has a limit value set in the prconfig.xml file. When the number of entries stored in a cache exceeds the target count (approximately 75% of the limit set in the prconfig.xml file), the oldest entries are invalidated (dropped from the cache) until the count falls back to the target. The alert message identifies which of the following caches was adjusted:
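The trimming behavior described above can be sketched as a least-recently-used cache that prunes back to a target count once the limit is exceeded. This is an illustrative sketch only, not Pega's internal implementation; the class name, the use of LinkedHashMap, and the exact 75% factor are assumptions for demonstration.

```java
import java.util.Iterator;
import java.util.LinkedHashMap;

// Sketch of the eviction-to-target behavior: when the entry count exceeds
// the configured limit, the oldest entries are dropped until the target
// count (assumed here to be 75% of the limit) is reached.
public class MruCacheSketch<K, V> {
    private final int limit;
    private final int target;
    // Access-order LinkedHashMap: iteration order is least-recently-used first.
    private final LinkedHashMap<K, V> entries = new LinkedHashMap<>(16, 0.75f, true);

    public MruCacheSketch(int limit) {
        this.limit = limit;
        this.target = (int) (limit * 0.75); // target is ~75% of the limit
    }

    public void put(K key, V value) {
        entries.put(key, value);
        if (entries.size() > limit) {
            reduceToTarget();
        }
    }

    private void reduceToTarget() {
        Iterator<K> oldest = entries.keySet().iterator();
        while (entries.size() > target && oldest.hasNext()) {
            oldest.next();
            oldest.remove(); // invalidate the oldest (least-recently-used) entry
        }
        // This is where an alert like PEGA0016 would be logged.
        System.out.println("Successfully reduced MRU to: " + entries.size());
    }

    public int size() {
        return entries.size();
    }

    public static void main(String[] args) {
        MruCacheSketch<Integer, String> cache = new MruCacheSketch<>(100);
        for (int i = 0; i <= 100; i++) { // 101 entries: one past the limit
            cache.put(i, "value" + i);
        }
        System.out.println("final size: " + cache.size());
    }
}
```

With a limit of 100, the 101st entry triggers a reduction down to the target of 75 entries, mirroring the "Successfully reduced MRU to" message text shown below.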
Example message text
collections/mru/UpperCase: Successfully reduced MRU to: 12000
collections/mru/PropertyReference: Successfully reduced MRU to: 12000
Default prconfig.xml settings
<env name="fua/global/instancecountlimit" value="20000" />
<env name="fua/personal/instancecountlimit" value="20000" />
<env name="collections/mru/PropertyReference/instancecountlimit" value="10000" />
<env name="collections/mru/LowerCase/instancecountlimit" value="10000" />
<env name="collections/mru/UpperCase/instancecountlimit" value="15000" />
Reasons for the alert
Some invalidation of cache entries is expected. However, if the alert appears frequently (for example, if each work object invalidates 1,000 entries), the prconfig.xml limit for the cache identified in the message may be set too low: entries are invalidated too often and must be fetched again at the next request.
A cache that is too small hurts performance by requiring more objects to be fetched from the database (or fetched from the file system or loaded by a class loader) than from the cache. However, a cache that is too large can also hurt performance because it reduces the memory that is available for other processing within the JVM.
Calculate the optimal setting for the caches and adjust the prconfig.xml file settings accordingly.
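For example, if PEGA0016 alerts consistently name the UpperCase cache, you might raise its limit in prconfig.xml. The value below is illustrative only; derive the right number from your own alert frequency, entry counts, and available JVM heap.

```xml
<!-- Illustrative only: raise the UpperCase MRU limit from the default 15000.
     Validate any increase against available JVM heap before deploying. -->
<env name="collections/mru/UpperCase/instancecountlimit" value="25000" />
```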