Create an advanced scenario to run simulations

Create an advanced scenario to test multiple API methods in design time to make sure they return the expected results.

About this task

In advanced scenarios, you can test startSession, getOffers, postEvent, getProfile, and endSession APIs.
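The five APIs run as a single batch against one shared session ID. The flow can be pictured with the minimal sketch below; the `SimulatedClient` class and its method signatures are hypothetical stand-ins for illustration, not the actual Interact client API.

```python
# Illustrative sketch of the API batch that an advanced scenario executes.
# SimulatedClient and its signatures are hypothetical, not the real
# Interact client API.

class SimulatedClient:
    def __init__(self, session_id):
        self.session_id = session_id  # one session ID is shared by every call
        self.calls = []               # record of the executed batch

    def start_session(self, audience_level, audience_id,
                      rely_on_existing=False, debug=False):
        self.calls.append(("startSession", audience_level, audience_id))

    def get_offers(self, zone, number_of_offers=None):
        # number_of_offers=None mirrors "all offers assigned in strategy"
        self.calls.append(("getOffers", zone, number_of_offers))

    def post_event(self, event_name, parameters=None):
        self.calls.append(("postEvent", event_name, parameters or {}))

    def get_profile(self):
        self.calls.append(("getProfile",))

    def end_session(self):
        self.calls.append(("endSession",))

# A typical batch: startSession first, endSession last; getOffers and
# postEvent may appear more than once in between.
client = SimulatedClient(session_id="sim-001")
client.start_session("Customer", 1001)
client.get_offers("HomePageZone", number_of_offers=3)
client.post_event("contact", {"channel": "web"})
client.get_profile()
client.end_session()
```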

Procedure

  1. From the Simulator tab in your interactive channel, click Create an advanced scenario in the Scenario palette.
  2. On the General tab, add a name and description for the scenario.
  3. On the Scenario definition tab, enter a session ID for the advanced scenario. The session ID is shared by all APIs. If you do not define a session ID, one is created at runtime and used in the API batch execution.
  4. Click Add API to add a startSession, getOffers, postEvent, getProfile, or endSession method to your scenario.
  5. Add a startSession method.
    1. Select an audience level from the drop-down list.
    2. Select an audience ID.

      You can either enter an audience ID manually or search for an audience ID.

      To search your profile table for the audience ID, click the search icon. Click Add condition to specify search criteria. If you add more than one condition, the conditions are combined with AND in your search results. Click Find to populate the results for your search criteria. From the Select display column tab, you can choose which columns appear in your search results. When you select or deselect an attribute, the search results table updates immediately. Highlight the record that matches your criteria and click Select. That audience ID is added to your scenario.

      The simulator uses the server and profile database defined in the Campaign | partitions | partition[n] | Interact | serverGroups | [serverGroup set for Simulator] | prodUserDataSource property to load profile records and perform audience ID searches. This prodUserDataSource should be configured the same as Interact | general | serverGroups | prodUserDataSource, so that the audience ID set during scenario definition exists when the simulation runs. The profile table mapped on the Interactive Channel Summary tab in the design time database should be a subset of the profile table in the runtime database.

    3. Select the check box to rely on an existing session or to turn the debug flag on if you want to enable either of these options for the startSession method.
    4. Add parameters for the API. Enter a name and value for the parameter. You can also select predefined parameters from the Name drop-down list. Choose a type from the drop-down list. Click the check mark to add this parameter.
  6. Add a getOffers method. Select a Zone from the drop-down list and add the Number of offers. If you do not add Number of offers, all offers assigned in the strategy are returned by default.
  7. Add a postEvent method. Add an event name and parameters for this method. You can also select predefined parameters from the Name drop-down list.
  8. Add a getProfile method. You do not need to configure this API.
  9. Add an endSession method. You do not need to configure this API.
  10. You can arrange the order of the APIs using the up and down arrows. You can add getOffers and postEvent more than once in a scenario and arrange them accordingly. You can also delete an API if you decide you do not want to include it in the scenario simulation.
  11. Click Save to add this scenario to your saved scenarios. From Saved scenarios you can also edit, copy, or delete previously saved scenarios.
  12. Click Run to test this scenario.

    The simulation runs on a different thread. You can still browse or edit simulation scenarios, but cannot start another run until your previous run is complete.

    Note: You cannot run an unsaved scenario. If you create or edit a scenario, you must save it before you can run the scenario with your updates. If you try to run an unsaved scenario, your last saved scenario runs instead.
  13. Click Abort to abort the simulation run, if needed.
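The procedure notes that the design time profile table should be a subset of the runtime profile table, so that an audience ID chosen during scenario definition still exists when the simulation runs. A minimal sanity-check sketch of that relationship, with hypothetical ID lists:

```python
# Hypothetical check: every audience ID available at design time should
# also exist in the runtime profile table.

def missing_at_runtime(design_time_ids, runtime_ids):
    """Return design-time audience IDs absent from the runtime table."""
    return sorted(set(design_time_ids) - set(runtime_ids))

design_time = [1001, 1002, 1003]
runtime = [1001, 1002, 1003, 1004, 1005]

missing = missing_at_runtime(design_time, runtime)  # empty when subset holds
```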

Results

When the run completes you can view the results on the Simulation results tab. The results are shown in the same order as your APIs on the Scenario definition tab.

Under the Log tab, you can also view the log results for your scenario run. Only the log entries for the last scenario run are displayed. The logs displayed on this tab are the same as those logged in interact.log, but this tab filters the logs by the session ID and the start timestamp of your simulation run. The level of logging is determined by interact_log4j.properties.
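The filtering on this tab can be pictured as follows. This is a sketch only: the log line layout (ISO timestamp, level, message) is an assumption, since the real format is controlled by interact_log4j.properties.

```python
from datetime import datetime

# Illustrative filter over interact.log-style lines: keep entries that
# mention the simulation's session ID and were logged at or after the
# run's start timestamp. The line layout here is an assumption.

def filter_run_logs(lines, session_id, run_start):
    """Keep entries mentioning session_id logged at or after run_start."""
    kept = []
    for line in lines:
        stamp, _, message = line.partition(" ")  # "2024-01-15T10:00:05 ..."
        if session_id in message and datetime.fromisoformat(stamp) >= run_start:
            kept.append(line)
    return kept

logs = [
    "2024-01-15T09:59:00 INFO sim-001 earlier run entry",
    "2024-01-15T10:00:05 INFO sim-001 startSession ok",
    "2024-01-15T10:00:06 INFO other-session getOffers ok",
]
run = filter_run_logs(logs, "sim-001", datetime(2024, 1, 15, 10, 0, 0))
```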

Create a coverage scenario to run simulations

Create coverage scenarios to run a simulation that shows whether there is enough offer coverage for your entire customer base or for specific groups or segments.

About this task

Procedure

  1. From the Simulator tab in your interactive channel, click Create a coverage scenario in the Scenario palette.
  2. Enter a name and description for the coverage scenario.
  3. Select an Audience level from the drop-down list.
  4. Click Select on the Zones field. All the zones defined in the interactive channel are displayed. You can select a single zone, multiple zones, or all the zones. Click Save.
  5. Click Filter on the Audience ID filters field. You can specify a filtering condition to select audience IDs by clicking the Add condition to filter audiences link. After the filtering condition is defined, click Find. The audience IDs that satisfy the condition are listed. Click Select audiences to select the audiences.

    If no filter is defined, the simulator selects all the audience IDs from the profile data.
  6. In the DB table name for storing results field, define the table name to store the simulation results. This table is created in the design time database. By default, this field is populated with the value "[DEFAULT]". If you leave it as is, the application forms the result table name as UACI_SimResults_<scenarioID>. You can change the default value, but this field is required.
  7. Select or clear the Warn to export previous run results and Using existing event pattern states and offer suppression rules check boxes.
    1. Warn to export previous run results: If this option is selected and you rerun the scenario, a reminder message prompts you to export the previous run results. If it is cleared, no reminder message is shown.
    2. Using existing event pattern states and offer suppression rules: If this option is selected, Interact considers the event pattern and offer suppression states for the selected audience IDs, which means that existing event pattern states and offer suppression rules are applied to the simulation results. If it is cleared, Interact ignores the existing event pattern states and offer suppression rules when displaying the simulation results.
  8. Click Save to save the scenario.

    Note: You must save unsaved changes before running the scenario. If you try to run the scenario without saving, the last saved scenario runs instead.
  9. Click Run to run the scenario.
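The result-table naming rule described in the procedure can be sketched as follows; the helper function is illustrative, not part of the product.

```python
# Sketch of the documented naming rule for the result table: leaving the
# field at "[DEFAULT]" yields UACI_SimResults_<scenarioID>; any other
# value is used as-is.

def result_table_name(field_value, scenario_id):
    if field_value == "[DEFAULT]":
        return f"UACI_SimResults_{scenario_id}"
    return field_value

default_name = result_table_name("[DEFAULT]", 42)
custom_name = result_table_name("MY_SIM_RESULTS", 42)
```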

Results

After you enter the required fields (name, audience level, and zones) and click Save, the coverage scenario is created and two more tabs appear in the coverage scenario: Simulation results (second tab) and Run history (third tab). By default, both tabs are blank.

Simulation results tab:

This tab is populated only after the coverage scenario is run successfully by clicking Run. For a failed run, this tab is empty.

This tab is always displayed for a coverage scenario. If there is no run result and you click the tab, the message "No simulation result available" is shown. Once a run completes and there are results in the specified result tables, clicking the tab displays the simulation results in two sections:

Section 1: Run Summary, which displays the overall status of the run in the Run Status field, the total time for the run in seconds in the Duration (sec) field, the number of audiences processed in the Audiences processed field, and the total number of offers presented in the Offer count field.

Section 2: A table with the following columns:

  • Segment: The segment to which the audience ID belongs.
  • Zone: The zone to which the offer is assigned.
  • Offer: The name of the offer that is presented to the audience.
  • Audiences served offer: The number of audiences that were served the offer.

This table helps you understand the offer coverage for your customer base.
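Conceptually, the Audiences served offer column is a distinct count of audience IDs grouped by segment, zone, and offer. A small sketch of that aggregation, using hypothetical serving records (the record shape is an assumption for illustration):

```python
# Illustrative aggregation behind the Section 2 table: a distinct count of
# audience IDs per (segment, zone, offer). The record shape is assumed.

def coverage_table(servings):
    audiences = {}
    for segment, zone, offer, audience_id in servings:
        audiences.setdefault((segment, zone, offer), set()).add(audience_id)
    return {key: len(ids) for key, ids in audiences.items()}

servings = [
    ("HighValue", "HomePageZone", "Offer10PctOff", 1001),
    ("HighValue", "HomePageZone", "Offer10PctOff", 1002),
    ("HighValue", "HomePageZone", "Offer10PctOff", 1001),  # repeat serving
    ("HighValue", "BannerZone", "OfferFreeShip", 1001),
]
table = coverage_table(servings)
```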

If a coverage scenario takes a long time to run, you may need to check the results after a while. You can check the run status on the Run history tab.

“Run history" Tab:

This tab is populated with following columns only after ‘Simulator Coverage Scenario’ is RUN.

It has following columns:

Run ID: The application generates a unique run ID for every run of the coverage scenario. It is displayed on screen after the run is completed.

Start time: The start time of the coverage scenario run. By default, it is displayed in the 'MMM DD, YYYY hh:mm:ss' format.

Run duration (sec): The time the coverage scenario took to run.

Status: The status of the coverage scenario run. There are four possible statuses:

  • Succeeded: The coverage scenario run completed successfully.
  • Failed: The coverage scenario run failed to complete successfully.
  • Running: The coverage scenario run is still in progress.
  • Exported to CSV: The run result of the coverage scenario was exported to a CSV file. By default, you can export only the latest run result of a scenario.

Number of audiences processed: The number of audience IDs processed in each run of the coverage scenario. This column is populated only when the status is Succeeded or Exported to CSV; it is blank if the run failed.

Summary: A summary of the status of each coverage scenario run.

  • If the run is in progress, the summary displays 'Running'.
  • If the run succeeded, the summary displays 'Running simulation succeeded'.
  • If the run failed, the summary displays 'Server error running simulation'.
  • If the run result was exported to a CSV file, the summary gives the location of the exported file.

Actions: This column displays icons for the following actions:

  • Delete history records: Available for every coverage scenario run. Click this icon to delete the corresponding history record.
  • Abort: Available only for a coverage scenario run in the Running state. Click this icon to stop the running coverage scenario.
  • Download CSV: Appears only when the run result of the coverage scenario has been exported to CSV. Click this icon to retrieve the CSV file saved on the design time server and download it to your browser.

Export to CSV: When you click this link, the latest run of the coverage scenario is exported to CSV and the status of that run changes to Exported to CSV.

Only the latest results are saved in the result tables; Interact picks up records from the result tables and exports them to a CSV file. The file name is in the format <ResultTableName>_<RunID>.csv.

While the design time server is processing, the status for that run is Exporting to CSV (a message like "Exporting CSV is in processing" is displayed). You can also see the status on the Run history tab. Once the CSV file is generated, the run history status changes to Exported to CSV. The CSV file is saved in the system temp folder (/tmp/Interact_ExportedCSV/) of the design time web server. The file name with its full path is saved in the Summary field of the run history record.

A coverage analysis can result in millions of records, so exporting can take a long time; check the run status to see when it is done. For a small set of data, if your user session is still alive, the CSV file is also downloaded to your browser.
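The exported file's name and location follow the pattern documented above, which can be sketched as follows; the helper function itself is illustrative.

```python
# Sketch of the documented export location and file name:
# /tmp/Interact_ExportedCSV/<ResultTableName>_<RunID>.csv

EXPORT_DIR = "/tmp/Interact_ExportedCSV"

def export_path(result_table, run_id):
    return f"{EXPORT_DIR}/{result_table}_{run_id}.csv"

path = export_path("UACI_SimResults_42", 7)
```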

All of the above columns except the Actions column can be sorted in ascending or descending order by clicking the column.

The coverage scenario run history is paginated. By default, 10 run results are displayed per page. If you click All, the complete run history is displayed on one page.

Delete all history: When you click this link, the run history for the coverage scenario is deleted from all pages.