PUBLIC - Liferay Portal Community Edition
LPS-96783

As a marketer I can run/stop an AB test from DXP

    Details

    • Sprint:
      SP | Sprint 17 | Aug07 - Aug21, TANGO | SP_18 | Aug21 - Sep04, TANGO | SP_19 | Sep04 - Sep18

      Description

      Motivation

      See Epic (LPS-87072) and Spike (LPS-95354) for general AB testing motivation.

      The purpose of an AB test is to find the page experience that maximizes a goal for a given audience by comparing the performance of different experience variants over a period of time.

      When the AB test is running, some of the visitors who would usually view the original page experience (because they belong to the audience associated with that page experience) will view one of the test variants instead, according to the traffic split per variant configured in the test (see LPS-97196). Analytics Cloud collects the visitors’ interactions with the variants in order to determine which one maximizes the test goal.
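
      As a rough sketch of that serving logic (all names hypothetical, not DXP’s actual implementation), picking a variant according to the configured traffic split weights could look like this:

```typescript
// Hypothetical sketch of a weighted variant pick for a running AB test.
// Shapes and names are illustrative, not Liferay DXP's actual API.

interface Variant {
  id: string;
  weight: number; // traffic split share, e.g. 0.5 for 50% (see LPS-97196)
}

// Picks a variant at random, respecting the traffic split weights.
// Weights are assumed to sum to 1.
function pickVariant(variants: Variant[]): Variant {
  const roll = Math.random();
  let cumulative = 0;
  for (const variant of variants) {
    cumulative += variant.weight;
    if (roll < cumulative) {
      return variant;
    }
  }
  // Guard against floating-point rounding: fall back to the last variant.
  return variants[variants.length - 1];
}

// Example: control keeps 50% of the audience traffic, variant B gets 50%.
const served = pickVariant([
  {id: 'control', weight: 0.5},
  {id: 'variant-b', weight: 0.5},
]);
console.log(served.id);
```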

      Scope

      This story covers:

      • Traffic split when the test is running
      • UI behaviour while the test is running, including locking editing of the associated resources and showing the test status in the AB test sidebar panel.
      • Sending test data as part of the analytics events to Analytics Cloud

      This story does not cover:

      • Test completion (see LPS-96787)
      • Variant statistics (% data on the variant entry)
      • Link to view data in Analytics Cloud from the actions menu

      Design

      • See Figma (Test is running section) for reference

      Acceptance Criteria

      1. The status is shown

      • Given a test is running
      • When the test is displayed in the AB test sidebar
      • Then the status is running

      2. The test cannot be edited when it's running

      • Given a test is running
      • When the test is displayed in the AB test sidebar
      • Then:
        • the test cannot be edited
        • the test variants cannot be added/edited/deleted

      3. The Stop Test button is shown when it's running

      • Given a test is running
      • When the test is displayed in the AB test sidebar
      • Then the Stop Test button is shown instead of the Run Test button

      4. The test can be stopped when it's running

      • Given a test is running
      • When the Stop Test button is clicked
      • Then:
        • the status changes to paused
        • the test cannot be edited
        • the test variants cannot be added/edited/deleted
        • the Stop Test button changes to Restart
        • the status change is reflected in Analytics Cloud (AC)

      5. The test can be restarted when it's paused

      • Given a test is paused
      • When the restart button is clicked
      • Then the status changes to running (see the previous acceptance criteria for running tests) and the status change is reflected in Analytics Cloud (see the transition sketch below)
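
      Criteria 4 and 5 together describe a small status state machine. A minimal sketch of the allowed transitions, assuming only the two statuses named in this ticket (function names hypothetical):

```typescript
// Hypothetical sketch of the stop/restart transitions from criteria 4 and 5.
// Status names follow the ticket wording; everything else is illustrative.

type TestStatus = 'running' | 'paused';

// Stop Test: only valid while the test is running (criterion 4).
function stopTest(status: TestStatus): TestStatus {
  if (status !== 'running') {
    throw new Error('Only a running test can be stopped');
  }
  return 'paused'; // the change must also be reflected in Analytics Cloud
}

// Restart: only valid while the test is paused (criterion 5).
// Editing of the test and its variants stays locked in both statuses.
function restartTest(status: TestStatus): TestStatus {
  if (status !== 'paused') {
    throw new Error('Only a paused test can be restarted');
  }
  return 'running';
}
```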

      6. The test does not affect non-targeted visitors

      • Given a test is running
      • When a user not belonging to the test audience* visits the page
      • Then they observe the corresponding experience, as if no test were running (see the gating sketch below)

      * The test audience is the segment of the control experience of the test
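
      A minimal sketch of that gating check, assuming the visitor’s segment memberships are available at serving time (all names hypothetical):

```typescript
// Hypothetical gating for criterion 6: visitors outside the test audience
// see the experience they would get if no test were running.

interface Visitor {
  segmentIds: string[]; // segments this visitor belongs to
}

// The test audience is the segment of the test's control experience.
function isInTestAudience(visitor: Visitor, controlSegmentId: string): boolean {
  return visitor.segmentIds.includes(controlSegmentId);
}

// Serving decision: only audience members enter the traffic split.
function resolveExperience(
  visitor: Visitor,
  controlSegmentId: string,
  serveAsIfNoTest: () => string,
  serveTestVariant: () => string
): string {
  return isInTestAudience(visitor, controlSegmentId)
    ? serveTestVariant() // criterion 7: assigned per the traffic split
    : serveAsIfNoTest(); // criterion 6: unaffected by the test
}
```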

      7. The test affects targeted visitors

      • Given a test is running
      • When a user belonging to the test audience visits the page
      • Then they observe one of the test variants; the assignment is random, but consistent with the traffic split weights

      8. The same variant is shown to a visitor affected by the test on subsequent visits

      • Given a test is running
      • When a user belonging to the test audience visits the page again
      • Then they observe the same test variant as on their first visit (see the sticky-assignment sketch below)
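
      One common way to satisfy criteria 7 and 8 together is to derive the assignment deterministically from a stable visitor identifier (for example, a persisted cookie) instead of rolling a fresh random number on every visit. A minimal sketch; the hashing scheme is illustrative, not DXP’s actual mechanism:

```typescript
// Hypothetical sticky assignment: the same visitor always lands in the same
// bucket, and bucket sizes follow the traffic split weights.

// Maps a stable visitor id to a deterministic number in [0, 1).
function bucketFor(visitorId: string, experimentId: string): number {
  const key = `${experimentId}:${visitorId}`;
  let hash = 0;
  for (let i = 0; i < key.length; i++) {
    hash = (hash * 31 + key.charCodeAt(i)) >>> 0; // keep within 32 bits
  }
  return hash / 0x100000000;
}

// Same visitor, same variant on every visit: the random roll from the
// earlier sketch is replaced by the visitor's deterministic bucket.
function assignVariant(
  visitorId: string,
  experimentId: string,
  variants: Array<{id: string; weight: number}>
): {id: string; weight: number} {
  const roll = bucketFor(visitorId, experimentId);
  let cumulative = 0;
  for (const variant of variants) {
    cumulative += variant.weight;
    if (roll < cumulative) {
      return variant;
    }
  }
  return variants[variants.length - 1];
}
```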

      9. The analytics client sends information about the current experience

      • Given a content page (regardless of whether a test is running for it)
      • When a user visits the page
      • Then a request is sent by the analytics JS client to the analytics endpoint, including in the context the field “experienceId” with the shared identifier of the experience (inspect network traffic to verify)

      10. The analytics client sends information about the experiment

      • Given a test is running
      • When a user visits the page
      • Then a request is sent by the analytics JS client to the analytics endpoint, including in the context:
        • the “experienceId” with the shared identifier of the test control experience
        • the “variantId” with the shared identifier of the test variant served to the user
        • the “experimentId” with the shared identifier of the test (a sketch of this payload follows)
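
      A minimal sketch of the context described in criteria 9 and 10. The field names experienceId, variantId, and experimentId come from this ticket; the endpoint, payload envelope, and helper name are illustrative, not Analytics Cloud’s actual API:

```typescript
// Hypothetical sketch of the context the analytics JS client attaches to a
// page view event. Only the three field names are taken from this ticket.

interface AnalyticsContext {
  experienceId: string;  // shared id of the experience (the control one while a test runs)
  variantId?: string;    // shared id of the served variant, only while a test runs
  experimentId?: string; // shared id of the test, only while a test runs
}

async function sendPageView(endpoint: string, context: AnalyticsContext): Promise<void> {
  await fetch(endpoint, {
    body: JSON.stringify({context, eventType: 'pageViewed'}),
    headers: {'Content-Type': 'application/json'},
    method: 'POST',
  });
}

// Criterion 9: a plain visit, no test running.
void sendPageView('https://analytics.example.com/events', {
  experienceId: 'experience-1',
});

// Criterion 10: a visit while a test is running.
void sendPageView('https://analytics.example.com/events', {
  experienceId: 'experience-control-1',
  experimentId: 'experiment-42',
  variantId: 'variant-b',
});
```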

      Packages

      Version Package: Master