
This content has been archived and is no longer being updated. Links may not function; however, this content may be relevant to outdated versions of the product.

Support Article

Email Listener concurrent thread/requestor pools are always 1

SA-64393

Summary



The Email Listener configuration settings for the number of concurrent threads and requestor pools are not reflected at runtime: the number of active requestors is always 1. As a result, a delay occurs when processing incoming emails.




Error Messages



Not Applicable


Steps to Reproduce



Unknown


Root Cause



This behavior is by design in the Pega product.

Each Email Listener is connected to a single email inbox. Even when the Email Listener is configured to run on multiple nodes, only one instance of that Email Listener is active at any given time. This is because neither the POP3 nor the IMAP protocol (used to access mailboxes) provides a locking mechanism, so there is no way to prevent concurrent threads or processes from processing a single email multiple times.
Hence, the pooling settings in the ServicePackage have no effect on the Email Listener. Different Email Listeners (each accessing a different mailbox) do run concurrently.
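The race that the single-active-instance design prevents can be illustrated with a minimal sketch. This is not Pega code; the in-memory "mailbox" and the `poll` function below are illustrative stand-ins for a POP3/IMAP inbox and a listener instance, not product APIs.

```python
# Hypothetical in-memory mailbox standing in for a POP3/IMAP inbox.
mailbox = ["msg-1", "msg-2"]   # unread messages on the mail server
processed = []                 # everything any listener instance handled

def poll(snapshot):
    """Process every message seen in a fetched snapshot of the inbox."""
    for msg in snapshot:
        processed.append(msg)

# Two listener instances poll the same inbox concurrently. Because POP3/IMAP
# offer no lock or "claimed" flag, both fetch the same unread list before
# either can mark anything as read or deleted.
snapshot_a = list(mailbox)
snapshot_b = list(mailbox)
poll(snapshot_a)
poll(snapshot_b)

# Each message now appears twice in `processed` -- the duplicate
# processing that running only one active listener instance avoids.
```

Because the protocol gives no way to make "fetch unread, then mark read" atomic across clients, every additional concurrent poller multiplies the duplicates.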



Resolution



Perform either of the following local changes:
  • Distribute emails into different mailboxes, each with its own Email Listener.
  • Keep Email Listener processing minimal: create a queue item and delegate any further processing to a Queue Processor, which can run in many parallel instances.
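The second local change can be sketched as follows. This is not Pega code; `email_listener` and `queue_processor` are illustrative stand-ins, assuming a single fast listener that only enqueues and several parallel workers that do the heavy processing.

```python
# Minimal sketch of the delegation pattern: the single active listener
# only creates queue items; parallel workers (the Queue Processor analog)
# perform the actual processing.
import queue
import threading

work = queue.Queue()
results = []
results_lock = threading.Lock()

def email_listener(emails):
    # The listener stays fast: one queue item per email, nothing else.
    for email in emails:
        work.put(email)

def queue_processor():
    # Many of these can run in parallel, unlike the listener itself.
    while True:
        try:
            item = work.get(timeout=0.2)
        except queue.Empty:
            return
        with results_lock:
            results.append(item.upper())  # stand-in for real processing
        work.task_done()

email_listener([f"email-{i}" for i in range(10)])
workers = [threading.Thread(target=queue_processor) for _ in range(4)]
for t in workers:
    t.start()
for t in workers:
    t.join()
```

The throughput bottleneck moves from the single listener instance to the worker pool, which can be scaled without risking duplicate mailbox reads.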

 

Published January 27, 2019 - Updated October 8, 2020
