The Test Cases tab displays either automated test cases or unit test cases, depending on your configuration. This topic discusses how to view and create unit test cases when you have the AutomatedTesting privilege enabled and you are viewing rules in the Automated Unit Testing framework.
When you switch to using AUT (by clicking Designer Studio > Automated Testing > Test Cases and then clicking Switch to old version), the Test Cases tab for activities, decision tables, and decision trees, which are the rule types supported by both automated unit testing and automated rule testing, displays options for creating unit test cases.
On the Automated Unit Testing landing page, you can click Switch to new version to return to the Automated Testing landing page. After you switch back, when you click the Test Cases tab in an activity, decision table, or decision tree, the tab displays options for creating PegaUnit test cases.
If you are using the Automated Unit Testing landing page when you log out of the system, Designer Studio displays the Designer Studio > Application > Automated Unit Testing menu option instead of the Designer Studio > Application > Automated Testing menu option the next time you log in. You can open the Automated Unit Testing landing page and click Switch to new version to restore the Automated Testing landing page.
Using the buttons on the Test Cases tab, you can record new test cases or request a report that alerts you to test cases that might need to be updated or re-recorded:
Field | Description
Record New Test Case | Click to record a new test case for unit testing this activity. To set a white list of pages or properties for an activity test case, first record and save the test case, and then locate the test case and open its rule form. Use the fields on the Results tab of the Test Case rule form to set the white list for the test case.
Clear Clipboard and Record | Click to clear the clipboard and then record a new test case for unit testing this activity. Follow the procedure described for the Record New Test Case button in the preceding row. Note: The best practice is to use the Clear Clipboard and Record button, rather than the Record New Test Case button, to record an activity test case. When you run an activity in order to save that run as a test case, the clipboard should be in a known, clear state before the run starts. Otherwise, the clipboard can contain pages from an earlier run that you do not want saved as part of the test case, or that present an inaccurate picture of what the activity's steps do. A conceptual sketch of this concern follows this table.
Invalid Test Cases | Click to get a report of any test cases saved for this activity that might need to be updated or re-recorded. If rules have changed, test cases recorded before the changes might not return the expected results; examine the Invalid Test Cases report to see which test cases are affected. When you click this button, the Invalid Test Cases window opens, and you can choose to run the report immediately or to run it as a background process and have the results emailed to you. If you run the report immediately, the Test Case window opens and displays the list of test cases.
Refresh | Click to update the list of saved test cases displayed on this tab.
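The reasoning behind the Clear Clipboard and Record recommendation can be illustrated with a small, hypothetical model. The following Python sketch is not Pega code and does not use the Pega API; the clipboard is modeled as a plain dictionary of page names to property values, and record_test_case, sample_activity, and the page names are invented for the example. It only shows how a page left over from an earlier run would be captured in the recorded baseline.

    # Hypothetical model of recording an activity run as a test case.
    # The clipboard is modeled as a dict of page name -> property values.

    def record_test_case(clipboard, run_activity):
        """Run the activity, then snapshot every clipboard page as the saved baseline."""
        run_activity(clipboard)
        # Everything on the clipboard at this point, including stale pages
        # from earlier runs, becomes part of the recorded test case.
        return {name: dict(page) for name, page in clipboard.items()}

    def sample_activity(clipboard):
        # Stand-in for the activity under test: it builds one page.
        clipboard["OrderPage"] = {"Total": 100, "Status": "Open"}

    # Recording without clearing the clipboard bakes a leftover page into the baseline.
    dirty_clipboard = {"LeftoverPage": {"Debug": True}}
    dirty_baseline = record_test_case(dirty_clipboard, sample_activity)

    # Recording against a cleared clipboard captures only what the activity created.
    clean_baseline = record_test_case({}, sample_activity)

    print(sorted(dirty_baseline))   # ['LeftoverPage', 'OrderPage']
    print(sorted(clean_baseline))   # ['OrderPage']

In the first recording, every later playback would either report spurious differences for the leftover page or silently treat it as expected state, which is what clearing the clipboard before recording avoids.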
If any test cases are saved for this activity, they are listed in the table in the center of this tab. You can run an individual test case by selecting its name.
Field | Description
Name | Name of the previously recorded and saved test case. To play back a test case, click its name in this list. The Run Rule window opens and runs that test case.
Created By | Operator who created the saved test case.
Created On | When the saved test case was created.
Last Run On | When the saved test case was last run.
Status of Last Run | The test case's status from the last time it was run.
Saved Results | Not used.
To play back a saved test case, click its name in the Name field.
The Run Rule window opens and the system runs the test case. If differences are found between the results of running the current state of the rule and the saved test case, they are displayed in the Run Rule window with the following options.
If a white list exists for the test case, only differences for those white-listed pages and properties are displayed in the Run Rule window after running the test case. If differences are reported for white-listed pages or properties, you can indicate whether to remove the property or page from the white list by selecting the corresponding radio button (for a property) or check box (for a page) in the Remove White List Entry For column, and clicking the Save Ignores button. The test case is saved and re-run, applying the updated selections.
Note: Activities that result in the display of an XML stream do not display that stream during test case playback.
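Conceptually, playback compares the saved results with the results of the current run and, when a white list exists, reports only the differences that fall inside it. The sketch below is a hypothetical Python illustration of that filtering, not Pega's implementation; diff_results and the page and property names are invented for the example.

    # Hypothetical illustration of white-list filtering during test case playback.

    def diff_results(baseline, current, white_list=None):
        """Return (page, property, old, new) differences, limited to the white list if one exists."""
        differences = []
        for page in set(baseline) | set(current):
            old_page, new_page = baseline.get(page, {}), current.get(page, {})
            for prop in set(old_page) | set(new_page):
                old, new = old_page.get(prop), new_page.get(prop)
                if old != new:
                    differences.append((page, prop, old, new))
        if white_list:
            differences = [d for d in differences
                           if d[0] in white_list.get("pages", ())
                           or (d[0], d[1]) in white_list.get("properties", ())]
        return differences

    baseline = {"OrderPage": {"Total": 100, "Status": "Open"},
                "AuditPage": {"RunTime": "10:00"}}
    current  = {"OrderPage": {"Total": 120, "Status": "Open"},
                "AuditPage": {"RunTime": "10:05"}}

    print(diff_results(baseline, current))                            # both differences reported
    print(diff_results(baseline, current, {"pages": {"OrderPage"}}))  # only the OrderPage difference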
Field | Description
Save Ignores | The differences encountered are displayed in the lower part of the window. To indicate that a difference should be ignored in future runs, select whether the system ignores it for this test case only or for all test cases, and then click Save Ignores to save your choices. You can also indicate that differences involving an entire page should be ignored in future runs of this specific test case by selecting the check box corresponding to that page name. If you specify to ignore a page for this test case, all differences found for properties on that page are ignored each time this test case runs. A conceptual sketch of this filtering follows this table.
Overwrite Test Case | Click to overwrite this test case with the current run state. |
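The ignore options can be modeled in the same hypothetical style. In the Python sketch below (again not Pega code; apply_ignores, the test case name, and the page and property names are invented), a property difference can be ignored for one test case or for every test case, and ignoring a page suppresses all property differences on that page for the specific test case.

    # Hypothetical model of "Save Ignores": drop differences the tester chose to ignore.

    def apply_ignores(differences, test_case,
                      ignored_props_this_case=frozenset(),
                      ignored_props_all_cases=frozenset(),
                      ignored_pages_this_case=frozenset()):
        """Filter out differences covered by a saved ignore before they are reported."""
        kept = []
        for page, prop, old, new in differences:
            if (test_case, page) in ignored_pages_this_case:          # whole page ignored for this test case
                continue
            if (test_case, page, prop) in ignored_props_this_case:    # property ignored for this test case only
                continue
            if (page, prop) in ignored_props_all_cases:               # property ignored for all test cases
                continue
            kept.append((page, prop, old, new))
        return kept

    diffs = [("OrderPage", "Total", 100, 120),
             ("AuditPage", "RunTime", "10:00", "10:05")]

    # Ignoring AuditPage for this test case leaves only the OrderPage difference to report.
    print(apply_ignores(diffs, "MyTestCase", ignored_pages_this_case={("MyTestCase", "AuditPage")}))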