This is a quick post to share the results of an A/B test the Webmaker Product Team recently ran on the webmaker.org homepage.
The team designed and built four variations of a new homepage. The homepage was the same for all users except for the blue ‘Splash page’ area seen in the screenshots below where we tested different images and text.
First we’ll look at how these designs performed against each other…
Before you look at the results below, which design and messaging do you think will result in the most users joining Webmaker?
The results:
| Variant | Traffic | Conversions | Conversion Rate | Notes |
|---|---|---|---|---|
| Variation A | 11,913 | 903 | 7.58% | |
| Variation B | 11,950 | 1,066 | 8.92% | Statistically significant winner against A or C |
| Variation C | 11,714 | 748 | 6.39% | |
| Variation D | 11,896 | 1,034 | 8.69% | Statistically significant winner against A or C |
Variation B & D are equal winners. Although B reports slightly higher in these results, the difference in performance between variation B and D is not statistically significant, so we can use either of these versions as our new default homepage.
One of the goals of this test was to explore the relative impact of how we talk about Webmaker to new users, and whether we frame it in terms of personal user benefit or the Mozilla mission. These results give us a starting point in that research, but we should not jump to any conclusions from the results of a single test like this.
For example, we were testing a photo versus an illustration in this experiment. Variations A and D have the same text, but A has a photo background and D has the illustration. Comparing the results of these two variations tells us that this illustration performs better than this photo, not that all illustrations perform better than all photos, or even that the best option will always be an illustration. It could even be the positioning of the text that made the difference (left versus right on the screen).
What matters though is that we now have a combination of content for the homepage that we know is working well. We make this our new default (the ‘champion’ in testing terminology) and when we want to work further on this page we put up new designs as ‘challengers’ and test the page again. Given that this is our first round of testing, it is likely we can find many more gains over time.
What was the impact overall?
It’s interesting to compare these pages to each other, but what is even more dramatic is the impact this design had when compared to our previous homepage, which was not included in the test variations. You can tell from the graph of Conversion Rate below which date this new homepage went live…
That’s it for now, but as we run more tests, we’ll share more results here too.