Launch an App #
Practice three ways to launch the YouTube app.
- Click an icon on the home screen to launch
- Search for the app name to launch
- Use the app’s unique ID (‘Package Name’ or ‘Bundle ID’) to launch directly
Click Home Screen Icon to Launch #

Step 1. Create a new step.

Step 2. After screen analysis, choose “OD” as the analysis tool.

Step 3. Choose “YouTube” on the screen, then drag & drop it to add as the step’s UIObject.

Step 4. Run the step you created.
Search for App Name to Launch #
To search for an app name, you first need to go to the search area.

Step 1. Create a new step and choose “Scroll” action.

Step 2. After screen analysis, choose “Full Screen” as the analysis tool.

Step 3. Drag & drop the chosen “Full Screen” to add it as the step’s UIObject.

Step 4. Create a new step with “Touch” action, choose OCR as the device screen analysis tool, then drag & drop “Search” from the screen to add as UIObject.

Step 5. Create a new step with “Input” action, choose Full Screen as the device screen analysis tool, then drag & drop the full screen to set as UIObject. Then enter “YouTube” in the value field.

Step 6. Create a new step with “Touch” action, choose Crop image from the device screen analysis tools, drag with your mouse to select the app icon area, then drag & drop it to add as UIObject.
Use App Unique ID to Launch Directly #

Step 1. Create a new step and choose “Launch” action.

Step 2. Enter the App ID of the app you want to launch (YouTube). If testing on Android, enter it in the Package Name field under Attributes at the bottom of the screen; if testing on iOS, enter it in the Bundle ID field.

Step 3. Run the step you created.
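Outside Stego, the same idea can be illustrated with adb: an Android app can be launched directly by its Package Name. The sketch below is illustrative only and assumes adb is installed and a device is connected; com.google.android.youtube is YouTube’s package name.

```python
# Illustrative only: launching an Android app by Package Name with adb,
# the same identifier entered in the Launch step above.
import subprocess

subprocess.run(
    [
        "adb", "shell", "monkey",
        "-p", "com.google.android.youtube",             # Package Name (YouTube)
        "-c", "android.intent.category.LAUNCHER", "1",  # fire one launcher intent
    ],
    check=True,  # raise if adb reports a failure
)
```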
Scroll to Find Items #
When the video you want to play is located somewhere in the middle of a long list, let’s create a scenario that scrolls until the desired video appears.

Step 1. Launch YouTube, search for Apptest.ai, then find the video you want.

Step 2. Create a new step and choose “Loop UIObject” action.

Step 3. After screen analysis, choose “OCR” as the analysis tool.


Step 4. Choose “Fully Automated” from the screen, then drag & drop to add as UIObject.

Step 5. Right-click the step above and click “Insert Child” to create a child step. Set the action to “Scroll”, choose “Full Screen” after screen analysis, then drag & drop the full screen to add as the child step’s UIObject.

Step 6. Click the Loop UIObject step to activate the Attributes panel. Since you need to scroll when the target video isn’t on screen, change the comparator to NOT EXISTS.

Step 7. Return the Stego-connected device to its initial screen and run the steps you created.
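Conceptually, a Loop UIObject step with a NOT EXISTS comparator works like the loop below. This is only a sketch, not Stego’s API: find_text_on_screen and scroll_full_screen are assumed helpers standing in for the OCR lookup and the child Scroll step, and the target text comes from this example.

```python
# Sketch of "scroll until the target text appears" (helper names are assumed).
def scroll_until_visible(find_text_on_screen, scroll_full_screen,
                         target="Fully Automated", limit=10):
    """Repeat the child Scroll step while the target is NOT on screen."""
    for _ in range(limit):
        if find_text_on_screen(target):  # OCR-style lookup of the target text
            return True                  # target appeared: stop looping
        scroll_full_screen()             # child step: scroll the full screen
    return False                         # loop limit reached without finding it
```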
Check Words at Specific Locations #
Let’s create a scenario that identifies specific words at designated locations, then enters them into a search engine.

Step 1. Create a new step and choose “Store Content” action.

Step 2. After screen analysis, choose “OCR” as the analysis tool.


Step 3. Choose “Model” on the screen, then choose “Relative” from the screen analysis tools and drag & drop the model name area next to “Model” to add it as the step’s UIObject.

Step 4. Create a step to click the search area on the Stego-connected device home screen, then create an Input step and enter “${device_model}” (the key saved by the Store Content step) in the Input action’s Attributes value.

Final Result: Return the device to the initial screen and run the scenario to confirm that the value set in the Store Content step is entered in the search input field.
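The Store Content / ${key} pattern can be pictured as a small key-value store plus placeholder substitution. This is a rough sketch, not Stego internals; it assumes device_model is the key used by the Store Content step, and the model name is an example value.

```python
# Rough sketch of Store Content and ${key} substitution (not Stego internals).
import re

variables = {}

def store_content(key, read_text):
    variables[key] = read_text()  # e.g. OCR of the area next to "Model"

def resolve(value):
    """Replace ${key} placeholders with stored values."""
    return re.sub(r"\$\{(\w+)\}", lambda m: variables.get(m.group(1), ""), value)

store_content("device_model", lambda: "SM-S911N")  # assumed example model name
print(resolve("${device_model}"))                  # -> SM-S911N
```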
Zoom in Google Maps #
Let’s create a scenario that finds a desired location in Google Maps, then zooms in on the map.
First, search for Los Angeles in Google Maps.

Step 1. Launch Google Maps, then create a new step and choose “Touch” action.

Step 2. After screen analysis, choose “OCR” as the analysis tool.

Step 3. Choose “Search” from the screen, then drag & drop to add as the step’s UIObject.

Step 4. Create a new step with “Input” action, choose Full Screen as the device screen analysis tool, then drag & drop the full screen to add as UIObject to the step. Then enter Los Angeles in the Attributes value field.

Step 5. Create a new step with “Touch” action, choose OCR from device screen analysis, then drag & drop “Angeles” from the screen to add as UIObject to the step.


Step 6. Since “Angeles” appears multiple times on one screen, you need to change the Selector value in the UIObject attributes. For this example, change the Selector value to 2 so the second “Angeles” is touched. To open the UIObject attribute panel, click the step’s UIObject Field.


Step 7. Create a new step with “Pinch” action, then change direction to “OUT” in the Attributes below. Then choose Custom Box as the device screen analysis tool and drag to choose the center area of the screen. Drag & drop the chosen center area to add as UIObject to the step.

Step 8. Clear the Google Maps search field so no text is entered, then run the scenario.
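The Selector attribute used in Step 6 can be thought of as picking the N-th match for the same text. A minimal sketch under an assumed OCR result format (a list of dicts with text and box), not Stego’s actual data model:

```python
# Sketch only: the Selector attribute as "pick the N-th occurrence" (1-based).
def pick_match(ocr_results, text, selector=1):
    matches = [r for r in ocr_results if r["text"] == text]
    return matches[selector - 1]["box"] if len(matches) >= selector else None

results = [{"text": "Angeles", "box": (40, 180, 220, 48)},
           {"text": "Angeles", "box": (40, 320, 220, 48)}]
print(pick_match(results, "Angeles", selector=2))  # Selector = 2 -> second match
```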
Use Control Actions #
Let’s create a scenario that uses the ‘Store Content’ and ‘If Value’ actions to check the category of the current #1 game in the Play Store and, if it is not Role Playing, change the category to Role Playing.
Through this example, you can learn how to use the ‘Relative’ screen analysis tool, save and reuse text from a specific point on the screen, and create scenarios that branch on conditions.

Step 1. Open the Play Store and go to the Games menu at the bottom. Then read and save the Category of the current #1 app from the Top charts screen. You can read the #1 app’s Category using Relative from the screen analysis tools.

Step 2. Create a new step, choose Store Content action, then add the #1 app’s category area as UIObject.

Step 3. Click the step you created above to activate the Attributes panel, then enter first_category in the Attributes key input field.


Step 4. Create a new step and choose the If Value action, then click the step to activate the Attributes panel and enter first_category in the Attributes key input field. Choose “!=” for the comparator and enter Role Playing in value.
With these settings, if the value saved under the first_category key does not match the text Role Playing (first_category != Role Playing), the child steps will run.

Step 5. Create child steps for the If Value step: one to touch Categories and one to touch Role Playing in the Categories selection window.

Step 6. Reset the Stego-connected device to the initial scenario creation state, then run the created steps from the beginning.
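The branch configured in Steps 4–5 amounts to the condition below. This is only a sketch with assumed helpers (touch stands in for the child Touch steps), not Stego’s API:

```python
# Sketch of the If Value branch: child steps run only when the condition holds.
def if_value_branch(variables, touch):
    first_category = variables.get("first_category", "")  # saved by Store Content
    if first_category != "Role Playing":  # comparator "!=", value "Role Playing"
        touch("Categories")               # child step: open Categories
        touch("Role Playing")             # child step: choose Role Playing
```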
Stop Playing Video #
This scenario detects screen changes while a video is playing in the YouTube player and stops playback with a double touch.
Through it, you can learn how to specify areas with ‘Custom Box’ from the screen analysis tools and how to control actions with the ‘If Changed’ action.

① Choose ‘If Changed’ action in a new step.
② Choose ‘Custom Box’ from screen analysis tools, then perform screen analysis.
③ Drag to choose the video playback area on the device screen, then ‘Drag and Drop’ this as the step’s UIObject.

④ Right-click the ‘If Changed’ step and choose ‘Insert child’ to create a child step.

⑤ Create a ‘Touch’ action in the child step.
⑥ Set the type to ‘Double’ to set up a double-touch action on the video playback area.
⑦ Since you can reuse the UIObject created in the first step, choose that image and bring it with Drag and Drop.
Note: You can also add the video playback area as the child step’s UIObject by choosing it with Custom Box from the screen analysis tools, but since you already created the same UIObject in the first step, you can reuse it. To reuse it, choose the UIObject registered in an existing step and apply it to the new step by Drag & Drop.
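‘If Changed’ can be pictured as capturing the Custom Box region twice and comparing the two captures. A rough sketch, assuming capture_region is a helper that returns the raw pixel bytes of that area:

```python
# Sketch only: detect whether a screen region changed between two captures.
import time

def region_changed(capture_region, interval=1.0, threshold=0.01):
    before = capture_region()        # first capture of the Custom Box area
    time.sleep(interval)
    after = capture_region()         # second capture a moment later
    if not before:
        return False
    diff = sum(b1 != b2 for b1, b2 in zip(before, after)) / len(before)
    return diff > threshold          # changed if enough pixels differ
```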
Set Alarm Time #
This scenario is an example of setting a desired alarm time in the Android Clock app. In this example, you can learn how to adjust the alarm time using ‘Loop UIObject’ and ‘If UIObject’.

① Add a new step.
② Choose ‘Touch’ action.
③ Choose ‘OD’ from device screen analysis tools.
④-⑤ ‘Drag and Drop’ the ‘+’ button to add as UIObject.

① Choose ‘Loop UIObject’ action in a new step.
② Choose ‘OCR’ from screen analysis tools.
③ ‘Drag and Drop’ the number ‘9’ to add as the step’s UIObject.

④ Change comparator to NOT EXISTS in the Attributes tab.
⑤ Since the hours run from 1 to 12, set the limit value to 12 so the child steps repeat at most 12 times.

① Add a child step.
② Choose ‘Scroll’ action.
③ Manipulate the device screen so numbers smaller than ‘9’ are visible, then choose ‘Custom Box’ from screen analysis tools.
④ Drag to specify the area around number ‘6’, then ‘Drag and Drop’ to add as the step’s UIObject.
Under certain conditions, when the set time is 8 o’clock or 10 o’clock, the number 9 may already be visible on screen even though a different hour is selected.
To handle this, the following actions are needed:
– If the currently set time is 8 o’clock, to adjust the alarm to 9 o’clock, you need to move the ‘hour’ area value down 1 position.
– If the set time is 10 o’clock, you need to move the ‘hour’ area value up 1 position.
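The two conditions above can be sketched as a simple branch. This is illustrative only; move_hour_value is an assumed helper standing in for the child Scroll steps:

```python
# Sketch of the hour-adjustment conditions described above (helper is assumed).
def adjust_hour_to_nine(current_hour, move_hour_value):
    if current_hour == 8:
        move_hour_value("down", 1)  # 8 o'clock: move the hour value down one position
    elif current_hour == 10:
        move_hour_value("up", 1)    # 10 o'clock: move the hour value up one position
```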

① Add ‘If UIObject’ action in a new step.
② Choose ‘OCR’ from screen analysis tools.
③ ‘Drag and Drop’ the number ‘7’ to add as the step’s UIObject.

① Add ‘If UIObject’ action in a new step.
② Choose ‘OCR’ from screen analysis tools.
③ ‘Drag and Drop’ the number ‘11’ to add as the step’s UIObject.

① Add a child step.
② Choose ‘Scroll’ action.
③ Choose ‘Custom Box’ from screen analysis tools.
④ Drag to specify the area around number ‘10’, then ‘Drag and Drop’ to add as the step’s UIObject.
⑤ Choose DOWN for direction in the Attributes tab.
Note: Among the Scroll action attributes, 'direction' represents the direction your finger moves when you touch the device to scroll. For example, to change the time from 10 o'clock to 9 o'clock, you touch the number 10 and move your finger down so that 9 is positioned in the center of the time area; therefore, set 'direction' to 'DOWN'.

① Add ‘Touch’ action in a new step.
② Choose ‘OCR’ from screen analysis tools.
③ ‘Drag and Drop’ the ‘Save’ text from the screen to add as UIObject.
Check Disappearing UI #
This is an example of testing UI that appears briefly and then disappears automatically (e.g. a toast popup).
We’ll test whether the option menu disappears properly when clicking “Save to Watch later” in the YouTube app’s video options.
Through this example, you can learn how to use the Assert Message action.

① Connect the device to Stego and launch YouTube. Search for “apptest ai” and scroll down so that Apptest.ai’s first video is visible at the top.

① Create a new step.
② Choose the AI screen analysis button from the device panel toolbar.
③ From screen analysis results, ‘Drag & Drop’ the video option button to the newly created step.

Since the same option icon appears multiple times on one screen, you need to change the Selector value among the UIObject attributes.
For this example, since you need to activate the video options, change the Selector value to 2 so the second icon is touched.
To open the UIObject attribute panel, click the step’s UIObject Field.

① Create a new step.
② Choose ‘Touch’ action.
③ Choose OCR from device screen analysis tools.
④ ‘Drag & Drop’ ‘Save to Watch later’ to add it as the step’s UIObject.
Note: When analyzing with OCR, words may be recognized individually. If you drag to select multiple words in the OCR results, they are recognized as a single phrase.

① Create a new step.
② Choose ‘Assert Message’ action.
③ Check if the toast popup message appears as ‘Saved to Watch later’.
④ Choose ‘=’ for Attributes comparator.
⑤ Check the toast popup message content and enter it in value.
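The Assert Message check can be pictured as polling the on-screen text for the expected toast within a time limit. A sketch with an assumed read_screen_text helper; the expected text and timeout are illustrative:

```python
# Sketch only: wait briefly for a toast message and fail if it never appears.
import time

def assert_message(read_screen_text, expected="Saved to Watch later", timeout=10):
    deadline = time.time() + timeout
    while time.time() < deadline:
        if expected in read_screen_text():  # comparator "=" against the toast text
            return
        time.sleep(0.5)                     # poll until the deadline
    raise AssertionError(f"Toast message not found: {expected!r}")
```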

① Run the scenario.
② After completion, click the Output panel results to check.

① Check test results in the results window.
Check Home Screen Icon #
This is an example of checking if the Play Store app icon exists on the home screen.
If there’s no Play Store app icon on the home screen, it will output “Play Store app icon does not exist on the home screen.” as an error message in the test results.

Step 1. Create a new step and choose “Assert UIObject” action.

Step 2. Click the “AI” button in the device panel.

Step 3. Choose ‘OD’ from screen analysis tools (default) and ‘Drag and Drop’ the Play Store app icon screen element to the step’s UIObject Field.

Step 4. In Assert UIObject’s Attributes, set comparator value to ‘EXISTS’ and enter custom message as ‘Play Store app icon does not exist on the home screen.’

Step 5. If you run the created scenario with the Play Store app icon deleted from the device, the custom message and label will be output in the Output panel.
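The EXISTS check plus custom message behaves roughly like the assertion below. This is a sketch only; find_icon is an assumed helper standing in for the OD lookup:

```python
# Sketch only: assert that a home-screen icon exists, reporting the custom message.
def assert_uiobject_exists(find_icon, name, custom_message):
    if not find_icon(name):                   # OD-style lookup of the icon
        raise AssertionError(custom_message)  # surfaced in the Output panel

try:
    # Simulate Step 5: the icon has been deleted, so the lookup fails.
    assert_uiobject_exists(lambda name: False, "Play Store",
                           "Play Store app icon does not exist on the home screen.")
except AssertionError as err:
    print(err)  # -> Play Store app icon does not exist on the home screen.
```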
Use Common Scenario #
You can create an app launch scenario as a Common Scenario to reuse in other scenarios.

① Create a new step.
② Choose ‘Launch’ action.
③ Enter the exact ‘Package Name’ (Android) or ‘Bundle ID’ (iOS) of the app you want to run.

④ Right-click the created scenario in the scenario list panel, then click ‘Move to Common Scenario’.

⑤ You can see the scenario disappears from the scenario list panel and is added to the Common Scenario list panel below.

① Add a new scenario.
② Create a new step in the added scenario.

③ Choose ‘Common Scenario’ action in the new step.

④ Choose Common Scenario in the Attributes tab.
⑤-⑥ Choose the previously created scenario from the Common Scenario list.

When the chosen Common Scenario has been added correctly, it appears disabled, since that Common Scenario cannot be changed.
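As a loose analogy (not Stego’s API), a Common Scenario is like a shared function that other scenarios call instead of repeating the same Launch step; the helpers and the Package Name below are assumptions for illustration:

```python
# Loose analogy: a Common Scenario as a reusable launch routine.
def common_launch(run_launch_action, app_id):
    """Shared 'Launch' step, reused by any scenario that needs the app open."""
    run_launch_action(app_id)

def search_scenario(run_launch_action, remaining_steps):
    common_launch(run_launch_action, "com.google.android.youtube")  # example Package Name
    for step in remaining_steps:  # the calling scenario's own steps follow
        step()
```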
Send and Receive Email #
This is a D2D (Device-to-Device) scenario that verifies email send/receive functionality between two devices.
If the recipient doesn’t receive the email properly, “The recipient did not receive the email.” error message will be output as the test result.
After creating the scenario, click the settings icon at the top of the Scenario Editor panel to access Scenario Settings.
Set the recipient email as User Variables for use in the scenario:

- key: Email
- value: john.doe@example.com (recipient email)
Activate Scenario Settings > D2D Test and define device roles:

- receiver: Device responsible for receiving and checking email
- sender: Device responsible for composing and sending email
Connect 1 Android device and 1 iOS device in Device Farm.

Phase 1: Compose Email (Steps 1-7)
Run email composition process on sender role device.

- Step 1: Specify sender role with If Role action
- Steps 2-7 are child steps of Step 1 (If Role), running only on sender device.

- Step 2: Launch mail app with Launch action
- sender device is set to Random / iOS, using Bundle ID.
- Step 3: Touch “Compose” button with Touch action

- Step 4: Enter recipient email address with Input action
- Use the set User Variable in ${Email} format.
- When the scenario runs, the value of Email, john.doe@example.com (the recipient email), is entered automatically.

- Step 5: Choose email address with Touch action
- Step 6: Touch email body composition area with Touch action
- Step 7: Enter email body content with Input action
Phase 2: Launch Recipient App (Steps 8-9)
Launch mail app on receiver role device.

- Step 8: Specify receiver role with If Role action
- Step 9 is a child step of Step 8 (If Role), running only on receiver device.

- Step 9: Launch mail app with Launch action
- receiver device is set to Random / Android, using Package Name.
Phase 3: Sync Point (Step 10)
The Sync action synchronizes the progress of both devices while the scenario runs.
Phase 4: Send Email (Steps 11-12)
Run email sending on sender device.

- Step 11: Specify sender role with If Role action
- Step 12 is a child step of Step 11 (If Role), running only on sender device.
- Step 12: Touch send button with Touch action
Phase 5: Check Email Receipt (Steps 13-14)
Check email receipt on receiver device.

- Step 13: Specify receiver role with If Role action
- Step 14 is a child step of Step 13 (If Role), running only on receiver device.

- Step 14: Verify email receipt and received email content with Assert Content action
- Check for up to 60 seconds whether the email content “Hi, thank you for using Apptest.ai.” appears on screen.
- If the message doesn’t appear within this time, the test shows as FAILED and “The recipient did not receive the email.” error message appears.
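Putting the five phases together, the role/sync structure looks roughly like the sketch below. Role names come from Scenario Settings; all step helpers are assumed, and this is not Stego’s API:

```python
# Conceptual sketch of the D2D email scenario: If Role gates the steps,
# Sync makes both devices meet before the email is sent.
def email_d2d(my_role, compose_email, launch_mail_app, send_email,
              assert_received, sync):
    if my_role == "sender":      # Phase 1 (Steps 1-7): compose on the sender
        compose_email()
    if my_role == "receiver":    # Phase 2 (Steps 8-9): open the inbox on the receiver
        launch_mail_app()
    sync()                       # Phase 3 (Step 10): both devices wait here
    if my_role == "sender":      # Phase 4 (Steps 11-12): send the email
        send_email()
    if my_role == "receiver":    # Phase 5 (Steps 13-14): verify receipt
        assert_received("Hi, thank you for using Apptest.ai.", timeout=60)
```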