Resolved Issues

View the resolved issues for a specific Platform release.

Please note: beginning with the Pega Platform 8.7.4 Patch, the Resolved Issues have moved to the Support Center.

INC-150395 · Issue 625069

Tokenizer updated to handle commas

Resolved in Pega Version 8.6.1

The Text Analyzer was not working as expected when a number was immediately followed by a comma (,), but worked when a space separated the number and the comma. This was traced to the tokenizer not correctly processing and splitting the input text when a special character appeared before or after the token, and has been resolved by updating the tokenizer logic.

INC-154746 · Issue 613406

ADM performance improvements and duplicate inputs corrected for delayed learning records

Resolved in Pega Version 8.6.1

Additional work has been done to improve the performance of Adaptive Models used in multi-level decisioning, and an issue with duplicate pxCommonInputs has been resolved.

INC-157357 · Issue 636712

Hazelcast remote execution not called from synchronized context

Resolved in Pega Version 8.6.1

After navigating to the Admin Studio portal to view the nodes, users found that the portal temporarily froze. Investigation of the thread dump revealed this was caused by a DDS pulse sending a remote execution call to all nodes to update logger settings even though the site was not using DDS. This has been resolved by updating the system to avoid calling Hazelcast remote execution from a synchronized context.

INC-157629 · Issue 626634

Duplicate key exception resolved for adaptive model

Resolved in Pega Version 8.6.1

During the model snapshot update, a DuplicateKeyException was generated while trying to insert a record into the predictor table. This did not affect the model's learning, but did appear in the model monitoring report. This was traced to a scenario of having the same outcome values defined on the model with different letter cases (Accept and accept). All predictors used in an Adaptive Model are inserted into the model monitoring tables as part of the monitoring job: because the monitoring tables are not case sensitive, this led to a unique constraint exception since there were multiple IH predictors with the same name. To resolve this, validation has been added which will skip adding duplicates from new responses.

INC-161829 · Issue 645206

Corrected merged rule checkout error

Resolved in Pega Version 8.6.1

A decision data rule that was created or updated in a branch and then merged to a ruleset version was failing a subsequent checkout from the same merged ruleset version. This was traced to recent work done to address an IndexOutOfBoundsException in pyEditElement related to java compilation exceeding the 65,000-byte limit, and has been resolved by updating the pzGetCircumstanceValue activity to exclude the ruleset version from the circumstance value generated for the pyEditElement section. The following activities in the Rule-Decision-DecisionParameters class have also been updated to replace the Obj-Open method with Obj-Open-By-Handle so that the correct version of pyEditElement is referenced: pzDeleteExistingLayout, OnBeforeDisplay, and pzCheckIfRuleSyncWithLayout.

INC-164243 · Issue 656500

DateTime validation works correctly after importing invalid data

Resolved in Pega Version 8.6.1

After creating a Decision Data rule in Dev Studio and adding a DateTime property to the form, importing records with invalid DateTime values failed with a validation error on the screen, and the message "Error while converting format for data type DateTime property name Test_date_format with value scvf" was logged. Correcting the DateTime property and uploading again worked, but any subsequent imports in the same session silently accepted invalid inputs without any validation errors and then showed blank date fields. This has been corrected.

INC-165704 · Issue 639505

VBD data flow timeout increased and made configurable

Resolved in Pega Version 8.6.1

Intermittent VBD timeouts were seen when writing records to MSK even though no errors were reported on the MSK side. Analysis showed that while batch data flows retry when a timeout occurs, real-time data flows do not, and the default of waiting up to 10 seconds for an acknowledgement may not be sufficient depending on system conditions. This has been resolved by increasing the default timeout to 20 seconds and adding a configurable setting, "vbd/streamPublishTimeoutMillis", to allow a customized value.
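
The setting name above comes from the release note; how it is applied is an assumption based on the usual Pega configuration mechanisms rather than something stated here. As a minimal sketch, such a setting would typically be supplied in prconfig.xml (or as an equivalent dynamic system setting), for example:

    <!-- assumed prconfig.xml entry; raises the VBD stream publish timeout to 30 seconds -->
    <env name="vbd/streamPublishTimeoutMillis" value="30000" />

If managed as a dynamic system setting instead, the conventional prconfig/<setting path>/default key pattern would apply; the value 30000 is purely illustrative.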

INC-166561 · Issue 645650

ADM Models correctly updated

Resolved in Pega Version 8.6.1

ADM models were not being updated when responses were processed via the CaptureResponse API, or when the time elapsed that should have resulted in an update reflecting a non-response. This was traced to incomplete handling of a response intended for another model that was converted to EMPTY, and has been resolved by modifying the logic so that the default responses and other responses are processed properly.

INC-167334 · Issue 639318

GRS support added for Kafka Key password

Resolved in Pega Version 8.6.1

An enhancement has been added to support using GRS to set values for the Kafka Key password dynamically.
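
As a hedged sketch only: Global Resource Settings references in Pega generally use the =PageName.PropertyName form, so with this enhancement the Kafka Key password field could reference such a setting instead of a hard-coded value, for example:

    =D_KafkaSecuritySettings.KeyPassword    (hypothetical data page and property names)

The data page and property shown are placeholders for illustration; the actual source of the value depends on how environment-specific settings are modeled in the application.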

INC-169125 · Issue 642401

Nodes resume correctly after DDS restart

Resolved in Pega Version 8.6.1

A corner case issue in VBD's handling of a DDS session was preventing the nodes from recovering correctly after a system shutdown. When an event fires because all DDS nodes are taken down, or as part of a switch from embedded to external Cassandra, VBD's cache is invalidated and then re-initialized once new VBD API calls are received or on the VBD service pulse. In this case, the cache invalidation did not complete because logic in the VBD code could execute a Cassandra query that cannot succeed while all DDS nodes are down. This has been resolved by modifying the handling of the session change event to eliminate the inadvertent Cassandra queries so that the invalidation can complete correctly and the re-initialization process can continue.
