A/B testing can help you determine whether a change to one or more elements on a page brings in more conversions by resonating better with the audience your campaign drives to the post-click experience.
NOTE: An experiment runs on a single URL, from which all variations are served at random to new visitors, based on the traffic split. You cannot use multiple URLs in the same test. If you want to compare the performance of two different pages, refer to the analytics section of your account.
If you have been using our A/B testing feature as it existed before December 2019, you will need to transfer your existing variations into the new experiments workflow in order to access them. Until you do, they will keep running as originally set up, but you will need to create an experiment to edit the variations and view the experiment's details. You will also have to choose a control variation. Click here to learn more about that.
1. Go to the Experiments tab on the right sidebar and click on Create Experiment.
2. Name your experiment so that you can identify it later and write the hypothesis that you are testing.
3. Select the experience you will be testing, starting with the group the page is in, then the name of the page, and then the name of the experience. Click Create.
Note: You can only create experiments for pages that are already published to a live URL. If you create a draft experiment and then unpublish the related page, the experiment will disappear, but you can create a new one if you republish the page.
4. Your existing variations will automatically appear in the list. You can add a new variation, edit the variations in the builder, and set the split.
The split represents how the total visitor count for the page will be assigned to the variations within the experiment. For example, if you have three variations where Variation A has 35%, Variation B has 40%, and Variation C has 25%, and your page receives 10,000 visitors in total, then Variation A will receive ~3,500 visitors, Variation B ~4,000 visitors, and Variation C ~2,500 visitors.
Those numbers are approximate because traffic cannot be distributed in exact proportion to the Traffic Split percentages. For each new visitor, our algorithm checks the chance assigned to each live variation (the set Traffic Split) and randomly selects a variation according to those chances, regardless of any traffic data the page may already have.
This keeps the pattern in which visitors see the variations unpredictable, which preserves the accuracy of the experiment. That is how the Traffic Split works for new page visitors. Returning visitors will always see the variation they saw on their first visit to the page, as long as that variation is still live. This is handled by our variation cookie, which lasts 12 months.
The split does not need to be even; you can enter the percentages manually, but they must add up to 100% in total.
NOTE: Do not set a variation to 0% traffic split. If you do not need it and have more than two variations, Pause it from the right-side menu. If you have two variations and only need one variation live, set both to 50% and then end the experiment. More detailed instructions for this use case can be found here.
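The assignment behavior described above can be sketched as weighted random selection with a sticky cookie. This is a minimal illustration only, not our actual implementation; the variation names, split values, and the dict standing in for the visitor's cookies are assumptions:

```python
import random

# Hypothetical traffic split; the percentages must total 100.
SPLIT = {"Variation A": 35, "Variation B": 40, "Variation C": 25}

def assign_variation(cookies: dict) -> str:
    """Return the variation to show this visitor.

    Returning visitors keep the variation stored in their cookie;
    new visitors are assigned at random, weighted by the split,
    regardless of how traffic has been distributed so far.
    """
    if "variation" in cookies:      # returning visitor: sticky choice
        return cookies["variation"]
    names = list(SPLIT)
    weights = list(SPLIT.values())
    choice = random.choices(names, weights=weights, k=1)[0]
    cookies["variation"] = choice   # in reality, a cookie lasting 12 months
    return choice
```

Because each visitor is assigned independently, 10,000 visitors will land close to, but rarely exactly on, the 3,500 / 4,000 / 2,500 counts from the example above.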
When adding a new variation, you can either create one (it will be a copy of the existing Variation A, which you can edit afterward) or import an existing variation from any other experience in your account.
How do I delete, duplicate, pause or rename a variation?
To do any of those actions, click on the three-dot menu on the right side of each variation and you will be presented with the options, as seen below:
The GIF below shows all the steps of the process of creating an experiment:
5. When you are satisfied with all the changes, you can start the experiment.
Once the experiment is live, you will see that your pre-existing variations are already there, with their previous split and statistics. You no longer need to check the analytics page to set the split between variations or to see their performance.
You can see the graphic of the stats over time below the experiment, as shown here:
6. Clicking on the Name of the page will open the slideout for the page.
In the slideout you are able to edit different page settings, such as Conversion Goals, Integrations, Scripts & GDPR, Search & Social, and URL Settings.
Clicking on Conv. goal will open the Conversion Goals settings for the page.
To edit a variation in a running experiment, click on Edit and you will be presented with the warning below. Click Open builder to make the changes. After you are done, you cannot save changes without pushing them to the live URL immediately; to do that, click Update & Continue. You will be warned again that the changes may affect the experiment. Click Confirm to continue.
After you do this, the warning below will permanently be present on the page of the experiment.
If you need to edit settings that are not found in the editor, such as Integrations, Social and SEO settings, or the cookie bar, you will have to end the experiment first, make the changes, then create a new experiment for the same page. Don't worry: your variations will still be there, and you will pick up where you left off.
When you are ready to end an experiment, you will be asked to choose a winning variation. From then on, that variation will be the only one shown on the live URL of the page.
To go back to the list showing all of your experiments, click on the arrow in the upper left corner.
You can revisit ended experiments in order to see their results and other details, such as the time period they ran for. The winning variation will have 100% of the traffic now, but you can see the previous traffic split that the variations had while the experiment was running (greyed out).
To view only a certain category of experiments, for example, the ones currently running, use the filter in the upper right corner, as seen in the GIF below.
Only unique visitors/conversions will be shown in the analytics data inside an experiment. You can still see the total number in the analytics view for the page in question.
Note for Enterprise customers using Personalization: If you have multiple experiences for a page and want to run an experiment, you will need to run an individual experiment for each experience, as each has its own unique URL and target audience. You cannot run the same experiment across multiple experiences at the same time.
If you have an experiment for a page, this will show in the main Landing Pages view next to the page, displaying the status of the experiment, as seen below. Clicking that information will take you to the experiment's page.
Team member permission levels
The account owner and Manager-type team members have full access. Editors can change the name of the experiment and its hypothesis, add and edit variations, and set the split, but they cannot push any changes live. Viewers can only view experiments and cannot make any changes.