Activity form
This tab is visible only to operators who have the AutomatedTesting privilege through an access role.
Use the Test Cases tab to work with test cases for this activity. You can record new test cases, play back saved test cases, and request a report that alerts you to test cases that might need to be updated or re-recorded.
Recording test cases and viewing invalid test cases
Using the buttons on the Test Cases tab, you can record new test cases or request a report that alerts you to test cases that might need to be updated or re-recorded:
Field | Description |
Record New Test Case | Click to record a new test case for unit testing this activity. After the Run Rule window opens, unit test the activity and then save that run as a test case. See How to Unit Test an Activity Rule for more information. To set a white list of pages or properties for an activity test case, first record and save the test case. Then locate the test case and open its rule form. Use the fields on the Results tab of the Test Case rule form to set the white list for the test case. |
Clear Clipboard and Record | Click to clear the clipboard and then record a new test case for unit testing this activity. After the Run Rule window opens, unit test the activity and then save that run as a test case. See How to Unit Test an Activity Rule for more information. As a best practice, use this button rather than Record New Test Case. When running an activity with the purpose of saving that run as a test case, it is important to start from a known, clear clipboard; otherwise the clipboard could contain pages from an earlier run that you do not want saved as part of the test case, or that present an inaccurate picture of what the activity's steps do. |
Invalid Test Cases | Click to get a report of any test cases saved for this activity that might need to be updated or re-recorded. If rules have changed, test cases recorded before the changes might not return the expected results; examine the Invalid Test Cases report to see which test cases are affected. When you click this button, the Invalid Test Cases window opens, and you can choose to run the report immediately or to run it as a background process and have the results emailed to you. If you run the report immediately, the Test Case window opens and displays the list of test cases. |
Refresh | Click to update the list of saved test cases displayed on this tab. |
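The idea behind the Invalid Test Cases report can be sketched as a staleness check: a test case is suspect if any rule it exercised has changed since the case was recorded. The data model below is a hypothetical illustration, not the product's implementation.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class RecordedTestCase:
    # Hypothetical model of a saved test case; field names are illustrative.
    name: str
    recorded_on: datetime
    rules_used: list  # names of rules the activity executed when recorded

def find_suspect_cases(test_cases, rule_last_updated):
    """Return (case name, changed rules) for each test case that was
    recorded before one of its rules was last updated."""
    suspect = []
    for tc in test_cases:
        changed = [r for r in tc.rules_used
                   if rule_last_updated.get(r, tc.recorded_on) > tc.recorded_on]
        if changed:
            suspect.append((tc.name, changed))
    return suspect
```

A rule missing from the update map is treated as unchanged, so only rules known to have been modified after recording flag a test case.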
If any test cases are saved for this activity, they are listed in the table in the center of this tab. You can run an individual test case by selecting its name.
Field | Description |
Name | Name of the previously recorded and saved test case. To play back that test case, click its name in this list. The Run Rule window opens and runs that test case. |
Created By | Operator who created the saved test case. |
Created On | When the saved test case was created. |
Last Run On | When the saved test case was last run. |
Status of Last Run | The test case's status from the last time it was run. |
Saved Results | Not used. |
To play back a saved test case, click its name in the Name field.
The Run Rule window opens and the system runs the test case. If differences are found between the results of running the current state of the rule and the saved test case, they are displayed in the Run Rule window with the following options.
If a white list exists for the test case, the Run Rule window displays only differences for the white-listed pages and properties after running the test case. If differences are reported for white-listed pages or properties, you can remove a property or page from the white list: select the corresponding radio button (for a property) or checkbox (for a page) in the Remove White List Entry For column, and then click the Save Ignores button. The test case is saved and re-run, applying the updated selections.
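The white-list behavior described above amounts to a filter over the differences found during playback. A minimal sketch, assuming differences are (page, property) pairs and the white list maps a page name to a set of property names (or None to white-list the whole page); these shapes are assumptions for illustration, not the product's internals:

```python
def filter_by_white_list(differences, white_list):
    """Keep only differences on white-listed pages and properties.

    differences: iterable of (page, property) pairs found during playback.
    white_list:  dict mapping page name -> set of property names, or None
                 to white-list every property on that page.
    """
    shown = []
    for page, prop in differences:
        if page not in white_list:
            continue  # page not white-listed: difference is not reported
        props = white_list[page]
        if props is None or prop in props:
            shown.append((page, prop))
    return shown
```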
Note: Activities that result in the display of an XML stream do not display that stream during test case playback.
Field | Description |
Save Ignores | The differences encountered are displayed in the lower part of the window. To ignore a difference in future runs, select whether to ignore it for this test case only or for all test cases, and then click Save Ignores to save your choices. You can also ignore differences involving an entire page in future runs of this specific test case by selecting the checkbox corresponding to that page name; all property differences found on that page are then ignored each time this test case runs. |
Overwrite Test Case | Click to overwrite this test case with the current run state. |
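The ignore scopes in the table above (per-property ignores saved for this test case or for all test cases, plus page-level ignores for this test case) can be sketched as a filter applied before differences are reported. The data structures here are hypothetical, chosen only to illustrate the precedence of the three scopes:

```python
def apply_ignores(differences, ignored_pages, ignore_this_case, ignore_all_cases):
    """Drop differences covered by saved ignore choices.

    differences:      iterable of (page, property) pairs.
    ignored_pages:    pages ignored for this test case (page-level checkbox).
    ignore_this_case: set of (page, property) pairs ignored for this case only.
    ignore_all_cases: set of (page, property) pairs ignored for every case.
    """
    remaining = []
    for page, prop in differences:
        if page in ignored_pages:
            continue  # page-level ignore suppresses every property on the page
        if (page, prop) in ignore_this_case or (page, prop) in ignore_all_cases:
            continue
        remaining.append((page, prop))
    return remaining
```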
Related topics:
About Automated Unit Testing
About Test Cases
Atlas — Standard privileges