
This content has been archived and is no longer being updated. Links may not function; however, this content may be relevant to outdated versions of the product.

Support Article

UDF functions are not compiling

SA-12746

Summary



The user encounters UDF corruption in a PRPC 6.3 SP1 environment.
After dropping the existing functions, exceptions occur when the user creates and validates new functions.


Error Messages



ORA-29541: class PEGADSSTMP.com/pega/pegarules/data/udf/directstreamreader/DirectStreamReader could not be resolved
29541. 00000 - "class %s.%s could not be resolved"
*Cause: An attempt was made to execute a method in a Java class
that had not been previously and cannot now be compiled
or resolved successfully.
*Action: Adjust the call or make the class resolvable.


Steps to Reproduce



1. Drop the existing functions.
2. Execute the scripts for creating new UDF functions.
3. Run the validation script for the UDF functions. At this point, a few of the Java classes fail to compile.
4. Execute the script to compile the Java classes. The script throws exceptions, and the script that lists invalid objects shows that most of the classes are invalid (a dictionary query such as the sketch below can be used to list them).
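The exact validation and compile scripts vary by installation; as a minimal sketch (assuming an SQL*Plus session connected as the schema owner that holds the UDFs), a dictionary query like the following lists the Java classes and functions that Oracle currently marks INVALID:

SQL> SELECT object_type, object_name, status
       FROM user_objects
      WHERE object_type IN ('FUNCTION', 'JAVA CLASS')
        AND status = 'INVALID'
      ORDER BY object_type, object_name;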

 


Root Cause



The root cause of this problem is a defect or misconfiguration in the operating environment: the UDFs were corrupted, and the prreadstream.jar file must be reloaded.

Resolution



This issue is resolved by making the following change to the operating environment:
Note: Take a database backup before performing the following steps.
 
In order to resolve it, you must create a properties file for the database connection, drop the corrupted functions, reload the UDF entries, recreate the functions, and then recompile the Java classes. This is done through a series of command-line and SQL*Plus steps that restore your functions and classes to a usable state. Once the steps below are completed, the issue should be resolved.
 
1. Create a file called user-deployudf.properties as a text file.
 
2. Add the following lines, replacing the parameters (ServerName_or_IP, port, OracleSID, PegaUser, and DATAWORK) with your values:
               oracledate.jdbc.class=oracle.jdbc.OracleDriver
               dev.oracledate.url=jdbc:oracle:thin:@ServerName_or_IP:port:OracleSID
               dev.oracledate.username=PegaUser
               dev.oracledate.password=PegaUser
               dev.oracledate.schema=DATAWORK
 
3. Save the file in the D:\PRPC716_GA\archives\udf folder, alongside the prreadstream.jar file being loaded. Note: this path can vary from system to system, so confirm the location of the archives folder with your database administrator.
 
4. Drop the functions that are causing the udf directstreamreader errors.  Connect to your database via sqlplus (Oracle) and execute the drop commands.  Replace the schema qualified name "DATAWORK" with the same name/value specified in step 2 for "dev.oracledate.schema=".
               SQL> drop function DATAWORK.pr_read_from_stream;
               SQL> drop function DATAWORK.pr_read_decimal_from_stream;
               SQL> drop function DATAWORK.pr_read_int_from_stream;
 
5. Load the classes. This is done by running Java from a command prompt with the parameters described below.


java - The Java executable used to run the loader.

-cp - The class search path of directories and zip/jar files.

D:\PRPC716_GA\scripts\prdeploy.jar - The prdeploy.jar file, which performs the load; it is supplied as part of a quoted classpath string.

D:\lib\ojdbc6.jar - The location of the JDBC driver for the database, appended to the classpath and separated from prdeploy.jar by a semicolon (;).

com.pega.pegarules.install.DatabaseLibraryLoader - The name of the class in prdeploy.jar being called.

--config="D:\PRPC716_GA\archives\udf\user-deployudf.properties" - The location and name of the properties file created in steps 1 and 2.

--profile=dev.oracledate - The name of the profile in the properties file; it identifies the database type.

"D:\PRPC716_GA\archives\udf\prreadstream.jar" - The name of the jar file being processed and imported into the system.

To put all the parameters together, open a command prompt and execute the command below. Note that D:\PRPC716_GA is the folder containing the distribution image and \scripts is the folder containing the prdeploy.jar file.

D:\PRPC716_GA\scripts> java -cp "D:\PRPC716_GA\scripts\prdeploy.jar;D:\lib\ojdbc6.jar" com.pega.pegarules.install.DatabaseLibraryLoader --config="D:\PRPC716_GA\archives\udf\user-deployudf.properties" --profile=dev.oracledate "D:\PRPC716_GA\archives\udf\prreadstream.jar"
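As a quick sanity check (a sketch, assuming an SQL*Plus session connected as the schema owner named in dev.oracledate.schema), the following dictionary query confirms that the DirectStreamReader Java classes from prreadstream.jar are now present in the schema:

SQL> SELECT dbms_java.longname(object_name) AS class_name, status
       FROM user_objects
      WHERE object_type = 'JAVA CLASS'
        AND dbms_java.longname(object_name) LIKE 'com/pega/pegarules/data/udf/directstreamreader/%';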
 

6. Rebuild / Create the three functions removed in step 4.  Replace the schema qualified name "DATAWORK" with the same name/value specified in step 2 for "dev.oracledate.schema=".
          SQL> CREATE FUNCTION DATAWORK.pr_read_from_stream
                 (property in varchar2, insKey in varchar2, storage_stream in blob)
                 return varchar2 DETERMINISTIC
               as language java
               name 'com.pega.pegarules.data.udf.directstreamreader.DirectStreamReader.get(java.lang.String, java.lang.String, java.sql.Blob) return java.lang.String';
               /
          SQL> CREATE FUNCTION DATAWORK.pr_read_decimal_from_stream
                 (property in varchar2, insKey in varchar2, storage_stream in blob)
                 return number DETERMINISTIC
               as language java
               name 'com.pega.pegarules.data.udf.directstreamreader.DirectStreamReader.getDecimal(java.lang.String, java.lang.String, java.sql.Blob) return java.math.BigDecimal';
               /
          SQL> CREATE FUNCTION DATAWORK.pr_read_int_from_stream
                 (property in varchar2, insKey in varchar2, storage_stream in blob)
                 return number DETERMINISTIC
               as language java
               name 'com.pega.pegarules.data.udf.directstreamreader.DirectStreamReader.getInt(java.lang.String, java.lang.String, java.sql.Blob) return int';
               /
 
7. Recompile all the Java classes for the schema in question (DATAWORK).  Replace the schema qualified name "DATAWORK" with the same name/value specified in step 2 for "dev.oracledate.schema=".  Recompiling "INVALID" objects can be done in a couple of ways.
 
    a) A clean-sweep approach is to recompile all Java classes so that everything that can compile is VALID. The query below generates the necessary ALTER statements as its output; spool or copy the generated statements and execute them against the schema.
            SQL> set pagesize 25000
            SQL> select 'alter java class '''||object_name||''' resolve;' from user_objects where object_type like '%JAVA%';
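Alternatively, as a minimal sketch under the same assumptions (SQL*Plus, connected as the schema owner), an anonymous PL/SQL block can generate and execute the resolve statements in a single pass, silently skipping any class that cannot be resolved in Oracle (such as the DB2 and PostgreSQL variants noted in option b):

            SQL> BEGIN
                   -- Loop over every invalid Java class owned by the current schema
                   FOR c IN (SELECT object_name
                               FROM user_objects
                              WHERE object_type = 'JAVA CLASS'
                                AND status = 'INVALID') LOOP
                     BEGIN
                       EXECUTE IMMEDIATE 'ALTER JAVA CLASS "' || c.object_name || '" RESOLVE';
                     EXCEPTION
                       WHEN OTHERS THEN NULL;  -- skip classes that cannot be resolved in Oracle
                     END;
                   END LOOP;
                 END;
                 /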
 
    b) One class at a time. Note that the DB2LUW/DB2ZOS and PostgreSQL classes may not compile in Oracle; these failures can be ignored.
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/udf/directstreamreader/DirectStreamReaderDB2LUW" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/internal/clipboard/directstream/BadReferenceException" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/internal/clipboard/directstream/BadStreamDataException" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/internal/clipboard/directstream/ByteArrayUtil" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/internal/clipboard/directstream/DirectStream" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/internal/clipboard/directstream/DirectStreamEnvironmentAdapter" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/internal/clipboard/directstream/InflaterV7" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/internal/clipboard/directstream/PropertyReferenceUtilities" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/udf/directstreamreader/BasicEnvironmentAdapter" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/udf/directstreamreader/CachedDirectStream" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/udf/directstreamreader/DirectStreamReader" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/udf/directstreamreader/DirectStreamReaderBase" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/udf/directstreamreader/DirectStreamReaderDB2LUW" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/udf/directstreamreader/DirectStreamReaderDB2ZOS" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/udf/directstreamreader/DirectStreamReaderPostgreSQL" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/udf/directstreamreader/FixedCacheMap" resolve;
            SQL> alter java class "DATAWORK"."com/pega/pegarules/data/udf/directstreamreader/UnsupportedFeatureException" resolve;
 
8. If the issue still persists after performing the above steps, go to the respective functions in the database and check their status. If any function is flagged with a red X mark, it must be compiled manually (see the status query sketched after step 9).
9. Open the definition of each flagged function in SQL Developer or an equivalent tool and look for a missing semicolon; if one is missing, insert it and compile the function again. Once the function compiles successfully, the red X marks disappear.
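As an alternative to looking for red X marks in SQL Developer, a simple dictionary query (a sketch, assuming a session connected as the same schema owner) shows whether the three UDFs are VALID:

SQL> SELECT object_name, status
       FROM user_objects
      WHERE object_type = 'FUNCTION'
        AND object_name IN ('PR_READ_FROM_STREAM',
                            'PR_READ_DECIMAL_FROM_STREAM',
                            'PR_READ_INT_FROM_STREAM');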
10. Run the following query to check the function validity.
select  pr_read_from_stream('pzInsKey',pzInsKey, pzPVStream) as "StringValue",
pr_read_int_from_stream('pzRuleSetVersionMajor', pzInsKey, pzPVStream) as "IntegerValue",
pr_read_decimal_from_stream('pzRuleSetVersionMinor', pzInsKey, pzPVStream) as "DecimalValue"
from pr4_rule
where pxInsID like 'CANNOT%';
11. If the functions are valid, the query returns values without raising errors.
12. Restart the PRPC nodes and delete the PegaRULES_Extract_Marker.txt file from their respective explicitTempDir folders.
13. Re-test the scenario that originally produced the corrupt query results.



 

Published August 13, 2015 - Updated October 8, 2020
