A/B Testing: ‘Sequential form’ vs ‘Simple PayPal’

This post is one of a series where we’re sharing things we’ve learned while running A/B tests during our End of Year fundraising campaign. At Mozilla we strive to ‘work open’, to make ourselves more accountable, and to encourage others to build on the things we have done. In the same way that open source software saves thousands of developers from writing the same code thousands of times over, we hope our ‘View Source Fundraising’ can be built on and improved upon by many organizations facing the same challenges today.

Today’s post is looking at two versions of our donation form.

[Screenshot: the ‘Sequential form’ and ‘Simple PayPal’ donation form variants]

By the time we came to run this test, our ‘Sequential Form’ had outperformed two other variants of our donation flow (blog posts to follow on those soon), but we had one more variant to try: a really simple PayPal donation form.

This PayPal form was one of many that had been built (and translated) to facilitate donations in many local currencies, and we had an English version accepting USD, so we thought it was worth testing against our Sequential form. It differed in many ways, so this wasn’t going to be an experiment that isolated a single content change, but it had the capacity to produce disruptive results. And since it had already been built, tested and deployed to our production site, we’d have been silly not to test it.

Results

[Screenshots: test results for the two variants]

We restricted the percentage of traffic allocated to the test variation to manage our risk, but even with a relatively small volume of traffic we quickly saw statistically significant results: the test variation (PayPal only) would have had a significant negative impact on conversion and revenue if we had scaled it up.
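For anyone who wants to sanity-check results like these themselves, here’s a minimal sketch of the kind of calculation behind a ‘statistically significant’ call on conversion rate: a two-proportion z-test comparing the control and the test variation. The numbers below are made up for illustration, not our real campaign data, and this isn’t necessarily the exact method our testing tool uses.

```python
# A minimal sketch (hypothetical numbers, not our real data) of a
# two-proportion z-test comparing conversion rates between a control
# variant and a test variant that receives a restricted slice of traffic.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, visitors_a, conv_b, visitors_b):
    """Return (z statistic, two-sided p-value) for the difference in
    conversion rate between variant A and variant B."""
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    # Pooled conversion rate under the null hypothesis of "no difference".
    p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical example: the control gets most of the traffic, the test
# variation gets a deliberately restricted share to limit risk.
z, p = two_proportion_z_test(conv_a=450, visitors_a=10_000,   # control
                             conv_b=60,  visitors_b=2_000)    # variation
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests a real difference
```

With an unequal split like this, the smaller variation simply needs more time to accumulate enough conversions for the p-value to settle, which is the trade-off you accept when you cap a risky variant’s traffic.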

Conclusion

Don’t use the Simple PayPal form as our default donation flow.

It might still be possible to create a version of the Sequential form with fewer options that does increase conversion, but this particular alternative form is not that solution. And at this point in the campaign, as the donation flow hits a code freeze, we are unlikely to iterate further on this. But there is always next year.

We didn’t improve our conversion rate with this particular test, but we learned a little more. And to paraphrase our Webmaker colleague Paul Johnson from a couple of days ago: if some of your tests aren’t losing, you’re not taking big enough risks with your testing.