AI Experiments

NOTE: This feature is only available on some of our subscriptions. If yours does not include it, you will see an upgrade prompt when attempting to access it. More information about our currently available subscriptions can be found here: 

When you go to the Experiments tab of the Instapage application and want to create a new Experiment, you will see two options: AI Experiment and Manual Experiment. This article will explain the former. For more information about the latter, please see the dedicated guide here: 

With AI Experiments, you can:

  • Speed up experimentation and get insights faster
  • Send more traffic to better test variations
  • Earn higher conversion rates over time

Table of contents

  1. How does it work?
  2. How to create and run an AI Experiment?
  3. What data can I see while the AI Experiment is running?
  4. How do we adjust the traffic split?
  5. How will the AI Experiment end?
  6. Can I edit my variations while the AI Experiment is running?

1. How does it work?

AI Experiments use artificial intelligence to power a form of experiment called dynamic traffic allocation, which tracks the progress of an experiment and directs traffic to higher-performing test variations. The metric it takes into account is the conversion rate of each variation.

This experiment type differs from traditional A/B testing in a few ways. In a standard A/B/n test, the traffic is split evenly between two or more page variations. The test runs for long enough to achieve statistical significance; in other words, until enough visitors have seen the test to reliably show a clear winner.

AI Experiments begin similarly, with multiple page variations and an equal traffic split. During a warm-up period, the AI system identifies the highest-performing variations up to that point, and those variations receive a higher percentage of traffic over time. As the experiment progresses and the degree of certainty changes, the traffic will continue to shift, and eventually, the system will identify a clear winner.
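Instapage does not publish the exact algorithm behind this, but dynamic traffic allocation is commonly implemented as a multi-armed bandit. As an illustration only, the sketch below uses Thompson sampling (an assumption, not the confirmed method): each variation's conversion rate is modeled as a Beta distribution, and each visitor is sent to the variation with the highest sampled rate, which naturally shifts traffic toward better performers as data accumulates.

```python
import random

def allocate_visitor(variations):
    """Pick a variation by Thompson sampling: draw a plausible
    conversion rate for each variation from its Beta posterior
    and send the visitor to the highest draw.
    `variations` maps a name to (conversions, visitors)."""
    best, best_sample = None, -1.0
    for name, (conversions, visitors) in variations.items():
        # Beta(conversions + 1, non-conversions + 1) posterior
        sample = random.betavariate(conversions + 1, visitors - conversions + 1)
        if sample > best_sample:
            best, best_sample = name, sample
    return best

# Hypothetical running totals: (conversions, visitors)
stats = {"A": (10, 250), "B": (25, 250)}
picks = [allocate_visitor(stats) for _ in range(10_000)]
# Variation B converts better, so it receives most of the traffic.
print(picks.count("B") > picks.count("A"))
```

Because the draws are random, weaker variations still receive some traffic, which lets the system correct itself if an early leader turns out to be a fluke.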


2. How to create and run an AI Experiment?

1. Navigate to the Experiments tab of the application and click on Create Experiment


2. Select the AI Experiment option, then choose the landing page, the experiment name, and the hypothesis of the test. Naming the Experiment is mandatory, while the hypothesis is optional. We recommend adding a hypothesis so that you remember what you were testing at any point in the future.


Note that your page must be published before you can create an Experiment for it. To learn more about our publishing options, please refer to this article:

3. Once you've selected the experience of the page you want to test and clicked CREATE, the experiment draft will open.

To start experimenting, you need two or more variations of your page to test different design and functionality changes.
Click on ADD VARIATION and choose between Create variation (a new copy of the existing design, which can be edited afterward) or Import variation (import the design of any other page in your account as a new variation for this page). 

If you already have an older variation you want to re-test, you can unpause it and make it part of the new experiment. 

How do I delete, duplicate, pause, or rename a variation?

To do so, click on the three-dot menu on the right side of each variation, and you will be presented with the options seen below: 


4. You can check the AI SETTINGS in the upper right corner. 

These settings can be changed to fit your use case. To learn more about AI Settings, please visit the dedicated guide here: 

Warmup period
During warmup, the experiment gathers data and adjusts the traffic based on that data. The warmup period ends once a minimum number of days (5) has elapsed and each variation has received a minimum of 250 visitors and at least one conversion.

If one of the criteria is unmet (for example, the 250 visitors), the warmup period will extend past the initial five days until all criteria are met.
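The warm-up exit conditions above can be expressed as a single check. This is an illustrative sketch of the criteria as described in this article, not Instapage's implementation; the function name and the dictionary keys are assumptions.

```python
def warmup_complete(days_elapsed, per_variation_stats,
                    min_days=5, min_visitors=250, min_conversions=1):
    """Return True when every warm-up criterion is met: at least
    `min_days` days elapsed, and every variation has at least
    `min_visitors` visitors and `min_conversions` conversions.
    Defaults mirror the thresholds described in this article."""
    return (days_elapsed >= min_days
            and all(v["visitors"] >= min_visitors for v in per_variation_stats)
            and all(v["conversions"] >= min_conversions for v in per_variation_stats))

stats = [{"visitors": 300, "conversions": 4},
         {"visitors": 180, "conversions": 2}]
print(warmup_complete(6, stats))  # False: one variation is under 250 visitors
```

Note that the check is per variation: one slow variation extends the warmup for the whole experiment, exactly as described above.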

Winning probability threshold
The winning probability threshold is the level of certainty the system must reach before it predicts a winning variation. The higher the value, the longer it takes to predict the winning variation.

Potential value remaining
The potential value remaining (regret) setting determines when at least two variations perform so similarly that a clear winner will never emerge. If the difference between the conversion rates of two or more variations is less than the set regret value, the test is not worth continuing.
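The regret check above amounts to comparing the gap between the best and second-best conversion rates against the configured threshold. A minimal sketch of that stopping rule (function name and default value are assumptions for illustration):

```python
def not_worth_continuing(conversion_rates, regret=0.01):
    """Stop the test when the top variations are effectively tied:
    the gap between the best and second-best conversion rate is
    smaller than the configured regret value."""
    top_two = sorted(conversion_rates, reverse=True)[:2]
    return (top_two[0] - top_two[1]) < regret

# 5.2% vs 4.9%: the gap (0.3 points) is below a 1-point regret threshold,
# so no clear winner is expected to emerge.
print(not_worth_continuing([0.052, 0.049, 0.030], regret=0.01))
```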

5. After editing the new variations, click START EXPERIMENT in the upper right corner. 


3. What data can I see while the AI Experiment is running? 

You can view how the experiment is doing at any time by going to Experiments and clicking the experiment name you want to check.

During the warmup phase, your variations will have the status of WARMUP. 
You can check the current Visitors, Conversions, Conversion Rate, and AI Traffic Split.


Once the warmup phase is over, the status of each variation will change to reflect the results that can be drawn up to that point.
A variation can have the following statuses during the AI Experiment: LEADER, TIE, CONTESTANT.


The LEADER is the variation currently in the lead, with the most visitors and conversions. A CONTESTANT is the strongest competitor variation participating in the current experiment, and TIE marks variations that resulted in the same score.


Further down on the AI Experiment overview page, you will be able to see a breakdown of the traffic split over time and the conversion rate over time. 


4. How do we adjust the traffic split?

The first phase of the experiment is the warm-up phase. This is when the experiment starts gathering data and evaluating which variations perform better (which ones have the higher conversion rates). Most traffic split changes do not happen during the warm-up phase, but they can if there is a high number of visitors and a significant difference between the conversion rates of the variations.

Next comes the burn-in phase, which prevents too much traffic from being reallocated too early.
On average, each variation needs at least 20% of the warmup visits threshold before any traffic reallocation occurs. So with the minimum warmup visits threshold set to 250, we need 50 visits on average per variation to observe any traffic reallocation; with every additional 50 visits on average per variation, decisions can become more significant.

The following steps illustrate traffic reallocation for an experiment with two variations and a warm-up visits threshold set to 250 visits on average per variation.

  • Step one: It will not reallocate the traffic split in the experiment.
  • Step two: It will reallocate the traffic split. However, the best variation will get at most 60% of the traffic.
  • Step three: It will reallocate the traffic split. However, the best variation will get at most 70% of the traffic.
  • Step four: It will reallocate the traffic split. However, the best variation will get at most 80% of the traffic.
  • Step five: It will reallocate the traffic split. However, the best variation will get at most 90% of the traffic.
  • Step six: It will reallocate without any constraints based on the probability of the given variation being a winner.
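The step schedule above maps the average visits per variation to a cap on the leading variation's traffic share. The following sketch encodes that schedule as described here; the function name is an assumption, and the real system's boundary handling may differ.

```python
def max_leader_share(avg_visits, warmup_threshold=250):
    """Cap on the share of traffic the best variation may receive
    during burn-in. Each step is 20% of the warm-up threshold
    (50 visits when the threshold is 250), matching the six steps
    described in this article."""
    step_size = 0.2 * warmup_threshold        # 50 visits for a 250 threshold
    step = int(avg_visits // step_size) + 1   # which step we are in (1, 2, ...)
    if step <= 1:
        return None                           # step one: no reallocation yet
    caps = {2: 0.60, 3: 0.70, 4: 0.80, 5: 0.90}
    return caps.get(step, 1.0)                # step six onward: no constraints

print(max_leader_share(40))    # step one: traffic is not reallocated
print(max_leader_share(120))   # step three: leader capped at 70% of traffic
print(max_leader_share(300))   # past step five: reallocation is unconstrained
```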

5. How will the AI Experiment end? 

Once the AI has determined a winner (the variation with the highest conversion rate), or determined that no winner will be chosen, our system will send an email informing you. The email will be sent to the user who created the Experiment and the user who created the page. You can end the experiment and make the winner the only variation available to your page visitors, or you can leave the experiment running for the full 90 days to further validate the results you have so far.
If you haven't concluded the experiment after the AI picks a winner, our system will do it for you after 90 days: it will select the winner and unpublish the losing variations.

To end an experiment manually, go to Experiments, click the name of the experiment you want to end, then click the END EXPERIMENT button in the upper right corner of the experiment overview, select the winning variation, and click END NOW.


6. Can I edit my variations while the AI Experiment is running? 

For the experiment to be reliable, you should not make any changes while it is running.
To edit the design or other settings, you should end the experiment. Any changes you make will influence the results because the original conditions of the experiment have changed.

To edit a variation in a running experiment, click the EDIT button next to it; you will be presented with a warning message and a button to OPEN BUILDER.

Once the variation is edited and the changes updated, you will see this warning message on the AI Experiment overview page: "Experiment has been updated. The results may be affected."
You will also see this warning on the traffic split and conversion rate over time graph.