A/B Testing
Last updated
A/B testing, also known as split testing, is a method used to compare two versions of a webpage or app to determine which one performs better. It involves presenting two variants (A and B) to different segments of your audience and analyzing which version yields more favourable outcomes.
This approach helps assess changes, such as design modifications or, in our case, content variations, and optimize for goals such as increased engagement or conversion rates. A/B testing is a valuable tool for refining and enhancing user experiences based on empirical data and user-behaviour analysis.
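To make "which version performs better" concrete, here is a minimal sketch of the standard statistical check behind an A/B comparison: a two-proportion z-test on conversion counts. The function name and the sample numbers are illustrative, not part of the product.

```python
from math import sqrt, erf

def conversion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis of no difference
    p = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical example: variant A converts 120/2400, variant B 150/2400
p_a, p_b, z, p_value = conversion_z_test(120, 2400, 150, 2400)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p_value:.3f}")
```

A small p-value (conventionally below 0.05) suggests the difference between the two variants is unlikely to be random noise; with the numbers above, the gap is suggestive but not yet conclusive, which usually means letting the test run longer.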
On this main page, you get a quick overview of all the ongoing and completed A/B tests.
To start a new test, just type in a name, pick two quizzes to compare, and hit the Create button at the top.
Easy as that!
For your existing tests, there are two buttons to play with. The ellipsis (...) button lets you rename the test, review its publishing settings, wrap up an active test, or delete it entirely.
Clicking on the View analytics button provides a snapshot of your test results. It displays your two selected quizzes along with key metrics like Engagement, Completions, Abandonment, Emails/phones captured, Revenue, and Conversion rate.
Understanding these metrics helps you make informed decisions about which quiz to keep active and which one to unpublish.
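As a rough guide to how these metrics relate, here is a hypothetical snapshot for one quiz. The field names mirror the analytics view (Engagement, Completions, and so on), but the exact definitions and formulas are assumptions for illustration, not the product's own calculations.

```python
# Hypothetical metric snapshot for one quiz in an A/B test
metrics = {"engagements": 1000, "completions": 640,
           "emails_captured": 210, "revenue": 1575.0}

# Derived rates (assumed definitions, not the product's API):
completion_rate = metrics["completions"] / metrics["engagements"]
abandonment_rate = 1 - completion_rate          # started but never finished
capture_rate = metrics["emails_captured"] / metrics["completions"]
revenue_per_engagement = metrics["revenue"] / metrics["engagements"]

print(f"completion {completion_rate:.0%}, abandonment {abandonment_rate:.0%}")
print(f"capture {capture_rate:.1%}, revenue/visit ${revenue_per_engagement:.2f}")
```

Comparing rates like these side by side for both quizzes, rather than raw counts, is what lets you judge a winner even when the two variants received slightly different amounts of traffic.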
In the Publish Settings, you have the power to determine how your customers will engage with your new A/B test. By default, the quizzes are published on their own dedicated pages, following the theme you selected earlier.
Apart from having their own page, you also receive detailed instructions on how to:
Include the quizzes as a section on a different page (Watch tutorial here)
Embed them in a popup (Watch tutorial here)
Embed them elsewhere using a code snippet (Watch tutorial here)