A/B testing helps you determine whether a change to one or more elements on a page brings in more conversions by resonating better with the audience your campaign drives to the post-click experience.
If you have already been using our A/B testing feature with the method that existed before December 2019, you will need to transfer your existing variations into the new experiments workflow in order to access them. Until you transfer them, they will continue running as originally set up, but you will need to create an experiment (and choose a control variation) in order to edit the variations and view the experiment's details. Click here to learn more about that.
The steps below will show you how to do that. In my example, the page called My signup flow already had Variation A and Variation B created on its default experience. The process is identical even if you did not have variations created before.
1. Go to the Experiments tab on the right sidebar and click on Create Experiment.
2. Name your experiment so that you can identify it later and write the hypothesis that you are testing.
3. Select the experience you will be testing: first the group the page is in, then the name of the page, then the name of the experience. Click Create.
Note: You can only create experiments for pages that are already published to a live URL.
4. Your existing variations will automatically appear in the list. From here you can add a new variation, edit the variations in the builder, and set the split.
The split represents how the page's total visitor count will be assigned to the variations within the experiment. For example, if you have three variations where Variation A has 35%, Variation B has 40%, and Variation C has 25%, and your page receives 10,000 visitors in total, then Variation A will receive 3,500 visitors, Variation B will receive 4,000 visitors, and Variation C will receive 2,500 visitors.
Don't worry if the split does not look correct at first. With a low visitor count it takes a while to settle, but it will even out over time. Visitors are assigned randomly rather than in a predictable pattern, which keeps the experiment accurate.
The split does not need to be even; you can enter the percentages manually, but they must add up to 100% in total.
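The assignment described above is essentially a weighted random draw. The sketch below is only an illustration of the idea, not the product's actual implementation; the variation names and percentages are taken from the example above:

```python
import random
from collections import Counter

# Hypothetical split from the example above; the percentages must sum to 100.
split = {"Variation A": 35, "Variation B": 40, "Variation C": 25}
assert sum(split.values()) == 100

def assign_variation(split):
    """Randomly assign one visitor to a variation, weighted by the split."""
    names = list(split)
    weights = list(split.values())
    return random.choices(names, weights=weights, k=1)[0]

# Simulate 10,000 visitors: the counts converge toward 3,500 / 4,000 / 2,500,
# even though each individual assignment is unpredictable.
counts = Counter(assign_variation(split) for _ in range(10_000))
for name, pct in split.items():
    print(name, counts[name], "expected ~", pct * 100)
```

Because each visitor is assigned independently at random, small samples can drift from the target percentages; the counts only approach the configured split as traffic grows, which is why the split can look off early in an experiment.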
When adding a new variation, you can either create one (it will be a copy of the existing Variation A, which you can edit afterwards) or import an existing variation from any other experience in your account.
How do I delete, duplicate, pause or rename a variation?
To do any of those actions, click on the three-dot menu on the right side of each variation and you will be presented with the options, as seen below:
The GIF below shows all the steps of the process of creating an experiment:
5. When you are satisfied with all the changes, you can start the experiment.
Once the experiment is live, you will see that the stats of the two pre-existing variations are already there, with their previous split and statistics. You no longer need to check the analytics page to set the split between variations or to see their performance.
When you are ready to end an experiment, you will be asked to choose a winning variation. From then on, that variation will be the only one shown on the live URL of the page.
To go back to the list showing all of your experiments, click on the arrow in the upper left corner.
You can revisit ended experiments in order to see their results and other details, such as the time period they ran for. The winning variation will have 100% of the traffic, but you can see the previous traffic split that the variations had while the experiment was running (greyed out).
Note for Enterprise customers who are using Personalization: If you have multiple experiences for a page and you want to run an experiment, you will need to run an individual experiment for each experience, as each has its own unique URL and target audience. You cannot run the same experiment across multiple experiences at the same time.
Team member permission levels
The account owner and Manager-type team members have full access. Editors can rename the experiment, edit the hypothesis, add and edit variations, and set the split, but they cannot push any changes live. Viewers can only view experiments and cannot make any changes.