Creating a VU Schedule

You can create a VU Schedule to accurately emulate the actions of individual users.

Procedure

  1. Right-click the project in the Test Navigator view, and then click New > VU Schedule.
  2. Enter a name for the VU Schedule, and then click Finish.
    A new VU Schedule that contains one user group is displayed.
  3. Perform the following steps to add a user group and to specify the locations on which the user group must run:
    1. Right-click the VU Schedule, and then click Add > User Group.
    2. Click a user group, and then select Run this group on the following locations on the Locations tab.
  4. Right-click the user group, and then click Add > Loop to add a loop for the tests.
    Loops are used to run many iterations of a test, to run tests at a set rate, and to run tests in stages.
  5. Right-click the Schedule Contents, and then click Add > Random Selector to add a random selector that contains the tests and their weights.
    Selectors are used to run a series of tests in random order, thus emulating the varied actions of real users, instead of running each test within a user group sequentially. The weight that you assign to each child element of the selector determines the statistical probability that the element is selected during an iteration (see the weighted-selection sketch after this procedure).
  6. Right-click the Schedule Contents, and then click Add > Test to add the tests to the schedule.
  7. Perform the following steps to set the stages for the VU Schedule:
    1. Select User Load from the Category field.
    2. Click Add.
    3. Enter a value in the Number of users field in the stage.
    4. Select the appropriate option to set the duration of the stage.
    5. Click OK.
    Each stage lasts for a specific amount of time and contains a specific number of users. By setting stages, you can model workloads that reflect real-world usage over time (see the staged user load sketch after this procedure). Putting the tests in an infinite loop prevents the virtual users in a stage from finishing before the stage ends.
  8. Add other schedule elements to refine the schedule structure: Right-click a schedule element, and click Insert (adds the new element before the selection) or Add (adds the new element after the selection).
    • Synchronization point: Used to coordinate the activities in a schedule, such as forcing virtual users to wait at a specific point. For more information, see Synchronizing users.
    • Delay: Used to emulate user actions accurately; for example, a user might delay before placing an order. For more information, see Delaying virtual users or actions.
    • Comment: Used for your notes and comments about the schedule element.
  9. Set the VU Schedule options:
    • User Load: Select this option to model workloads over time and to change the number of users that perform certain tasks to reflect real-world usage. For more information, see Setting user loads.
    • Think time: Use the options on this page to increase, decrease, or randomize the think time. The default setting is to use the recorded think time. For more information, see Think time overview.
    • Resource Monitoring: Select Enable resource monitoring to enable resource monitoring. You can capture resource monitoring data from these sources:
      • Apache HTTP Server Managed Beans
      • Apache Tomcat Managed Beans
      • IBM® DB2® monitoring
      • IBM® Tivoli® monitoring
      • IBM® WebSphere® Performance Monitoring Infrastructure
      • JBoss Application Server Managed Beans
      • Java Virtual Machine Managed Beans
      • Oracle Database monitoring
      • Oracle WebLogic Server Managed Beans
      • SAP NetWeaver Managed Beans
      • The rstatd daemon (UNIX)
      • Simple Network Management Protocol (SNMP) agents
      • Windows Performance Monitor
      For more information, see Enabling Resource Monitoring from the workbench.
    • Resource Monitoring from Service: Select this option to continually observe the health of the system's resources. To monitor a remote system under test, you can install an agent on that system. For more information, see Resource Monitoring Service.
    • Statistics log level: Typically, keep the default settings. If you are running a long test, change the sampling rate from the default of 5 seconds to a larger interval. For more information, see Setting the statistics displayed during a run.
    • Variable Initialization: A variable is declared in the test variables section of a test and can be used throughout the test as a reference for any field that can be substituted. For more information, see Test variables.
    • Requirements: You can define performance requirements to specify the acceptable thresholds for the performance parameters in a schedule. For more information, see Defining requirements in schedules.
    • Test log level: Typically, keep the default setting of Primary test actions. You must have at least this level of logging to create a Page Percentile report and to see the page title verification points that you have set. For more information, see Setting the data that the test log collects.
    • Response time breakdown: Select Enable collection of response time data to enable response time breakdown. You can collect response time breakdown data from HTTP or SOA tests. For more information, see Enabling response time breakdown collection and Enabling response time breakdown collection on Windows Vista, Windows 7, and Windows Server 2008.
    • Problem determination log level: Change the default settings only when requested to do so by Support. For more information, see Setting the problem determination level for schedules.
    • Application Performance Management: You can use Application Performance Management (APM) in a schedule to enable AppDynamics or Dynatrace applications and to enhance data collection during load testing by adding HTTP headers to the requests in your HTTP tests. For more information, see Using Application Performance Management in a schedule.
    • Advanced tab (at the bottom of the VU Schedule Details area): Click Edit Options to set protocol-specific options that apply to all tests in the schedule. Setting protocol-specific options for a schedule is similar to setting protocol-specific options for a user group. For more information, see Emulating slower network traffic and Running long duration Citrix tests.
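
The weights in step 5 work like lottery tickets: a child element's chance of being selected in an iteration is its weight divided by the sum of all weights under the random selector. The following Java sketch only illustrates that idea; it is not the product's selection code, and the test names and weights are hypothetical examples.

    import java.util.Random;

    // Minimal sketch of how a random selector's weights map to selection
    // probabilities; the child tests and weights here are hypothetical.
    public class WeightedSelectorSketch {

        private static final String[] TESTS = { "BrowseCatalog", "SearchItem", "PlaceOrder" };
        private static final int[] WEIGHTS = { 7, 2, 1 };   // roughly 70%, 20%, 10% of iterations

        // Picks one test index: probability of TESTS[i] is WEIGHTS[i] / totalWeight.
        static int pick(Random random) {
            int totalWeight = 0;
            for (int w : WEIGHTS) {
                totalWeight += w;                    // totalWeight = 10 in this example
            }
            int roll = random.nextInt(totalWeight);  // uniform value in [0, totalWeight)
            for (int i = 0; i < WEIGHTS.length; i++) {
                roll -= WEIGHTS[i];
                if (roll < 0) {
                    return i;
                }
            }
            throw new IllegalStateException("weights must be positive");
        }

        public static void main(String[] args) {
            Random random = new Random();
            int iterations = 10_000;
            int[] counts = new int[TESTS.length];
            for (int i = 0; i < iterations; i++) {
                counts[pick(random)]++;
            }
            for (int i = 0; i < TESTS.length; i++) {
                System.out.printf("%s selected %.1f%% of the time%n",
                        TESTS[i], 100.0 * counts[i] / iterations);
            }
        }
    }

Running the sketch prints selection percentages close to 70/20/10, which is the statistical behavior that the assigned weights produce over many iterations.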

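The stages in step 7 each hold a target number of users for a fixed duration; together they describe the user load over time. The following Java sketch models a hypothetical three-stage ramp (10, 50, then 100 users) and prints the target user count at five-minute marks. The stage values are illustrative, not defaults, and the code models the behavior only; it is not the scheduler itself.

    // Minimal sketch of how stages model a user load over time;
    // the stage sizes and durations are hypothetical.
    public class StagedUserLoadSketch {

        // Each stage: { number of users, duration in minutes }
        private static final int[][] STAGES = {
            { 10, 5 },    // warm-up: 10 users for 5 minutes
            { 50, 10 },   // ramp: 50 users for 10 minutes
            { 100, 15 },  // peak: 100 users for 15 minutes
        };

        // Returns the target user count at a given minute into the run.
        static int usersAt(int minute) {
            int elapsed = 0;
            for (int[] stage : STAGES) {
                elapsed += stage[1];
                if (minute < elapsed) {
                    return stage[0];
                }
            }
            return 0; // all stages have finished
        }

        public static void main(String[] args) {
            for (int minute = 0; minute <= 30; minute += 5) {
                System.out.printf("minute %2d -> %3d users%n", minute, usersAt(minute));
            }
        }
    }

Because tests placed in an infinite loop never finish on their own, the stage duration alone determines when the workload changes, which matches the note in step 7.
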
What to do next

After you create a VU Schedule that describes the behavior of your software system, run it against successive builds of the application under test or with an increasing number of virtual users. Then analyze the reported results.