Image recognition in a test

In Web UI tests, HCL OneTest UI must identify objects accurately for a successful test playback. While most objects can be identified using text properties, there are cases where text-based identification might not be feasible or effective. To address this challenge, HCL OneTest UI provides image recognition capabilities so that you can identify objects based on their visual appearance.

HCL OneTest UI uses image correlation techniques to recognize and manage objects during playback. During test recording, HCL OneTest UI captures an image, the reference image, that represents the action that is performed. During playback, this reference image is compared with the actual image of the AUT, the candidate image. A recognition threshold accepts an adjustable rate of difference between the reference image and the candidate image and determines whether the images match. The default recognition threshold is 80, and the default tolerance ratio is 20.
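
The following Python sketch is a minimal illustration of this idea, not HCL OneTest UI's actual algorithm: it assumes a simplified pixel-difference score in which the tolerance bounds how much an individual pixel may differ and the threshold is the percentage of pixels that must fall within that tolerance. The scoring formula and helper names are assumptions; only the default values of 80 and 20 come from the product.

```python
import numpy as np

def match_score(reference: np.ndarray, candidate: np.ndarray, tolerance: int = 20) -> float:
    """Return a 0-100 similarity score between two equally sized grayscale images.

    A pixel counts as matching when its intensity differs from the reference
    by no more than `tolerance` (on a 0-255 scale).
    """
    if reference.shape != candidate.shape:
        raise ValueError("reference and candidate must have the same dimensions")
    diff = np.abs(reference.astype(int) - candidate.astype(int))
    matching = np.count_nonzero(diff <= tolerance)
    return 100.0 * matching / reference.size

def images_match(reference: np.ndarray, candidate: np.ndarray,
                 threshold: int = 80, tolerance: int = 20) -> bool:
    """Accept the candidate when its score meets the recognition threshold."""
    return match_score(reference, candidate, tolerance) >= threshold

# Example: a candidate that differs only slightly from the reference still matches.
reference = np.full((50, 50), 128, dtype=np.uint8)
candidate = reference + np.random.randint(0, 15, reference.shape, dtype=np.uint8)
print(images_match(reference, candidate))   # True with the default threshold of 80
```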

Image correlation can be used in various test scenarios. Following are some examples:

  • Suppose you record a test on a mobile phone and play it back on a desktop. Because the image width and height differ from one device to another, the test playback might fail on devices that do not have the same screen ratio. You can use image correlation to address this issue.
  • In certain cases, the target objects within a test recording might undergo changes when the test is played back. For instance, when a virtual keyboard is used in a secure application, the position of digit buttons can vary from one server session to another. Using image correlation, HCL OneTest UI can adapt to these dynamic changes.
Note: From HCL OneTest UI v9.1.1, custom graphic objects are recognized. In the edited test, a custom object is identified as a Custom Element graphic object, with the name1-name2 description in the test script.

During a test playback, if HCL OneTest UI fails to locate a certain image within the AUT, or if you have selected an inappropriate image for image identification, the test can fail. To address recognition issues during playback, you can modify the image that is used for target object identification in the test step. Also, you can adjust the threshold score and tolerance ratio in the edited test to enhance image recognition. If you set the threshold to 0, the candidate image that is most similar to the reference image is selected, even if they are not identical. Conversely, if you set the threshold to 100, even a slight difference in images might prevent image recognition.
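
As a rough illustration of how the threshold changes candidate selection, the hypothetical sketch below picks the best-scoring candidate only when its score meets the threshold. The candidate names and scores are invented for the example.

```python
def select_candidate(scores, threshold):
    """Return the name of the best-scoring candidate if it meets the threshold, else None."""
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

# Hypothetical similarity scores for three candidate images in the AUT.
scores = {"button_ok": 72.5, "button_cancel": 64.0, "logo": 31.0}

print(select_candidate(scores, threshold=0))    # 'button_ok': the closest match always wins
print(select_candidate(scores, threshold=80))   # None: no candidate reaches the default of 80
print(select_candidate(scores, threshold=100))  # None: only an identical image would pass
```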

When you set the recognition threshold in the test editor, HCL OneTest UI displays an image matching preview to help you find accurate candidate images for test playback. The best matching candidate images are highlighted in green. Images with a score above the threshold are highlighted in yellow and might not be the most suitable. Candidate images with a score below the threshold are highlighted in red, which indicates that they do not match the reference image. You can find the image correlation details in the test report that is displayed when a test execution is complete.
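
To make the color coding concrete, the following hypothetical sketch groups candidates the way the preview does under one plausible reading: the best match in green, other candidates at or above the threshold in yellow, and everything below the threshold in red. The names and scores are invented.

```python
def classify_candidates(scores, threshold=80):
    """Group candidate images the way the image matching preview colors them."""
    best = max(scores, key=scores.get) if scores else None
    groups = {"green": [], "yellow": [], "red": []}
    for name, score in scores.items():
        if score < threshold:
            groups["red"].append(name)        # does not match the reference image
        elif name == best:
            groups["green"].append(name)      # best matching candidate
        else:
            groups["yellow"].append(name)     # above the threshold, but not the best match
    return groups

scores = {"submit_button": 95.0, "login_button": 84.5, "banner": 42.0}
print(classify_candidates(scores))
# {'green': ['submit_button'], 'yellow': ['login_button'], 'red': ['banner']}
```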

You can add images as the main property in a verification point on the target objects in the tests. For example, you can verify the position of a dropdown list on a screen. For details, see Creating verification points from the SmartShot View.

By using the image recognition feature, HCL OneTest UI can identify objects based on their visual appearance when text-based identification is a challenge.