Encryption Campaign – Testing Email Design

One of our assumptions for the Q1 Encryption Campaign is that video is a promising medium worth exploring. This led our team to question whether the current design of our emails was the best way to showcase our new content to our Foundation list subscribers.

We decided to run a test on Beat 2. Our question was whether a redesign based on 1) reducing text and 2) including larger graphical elements specifically indicating a video would increase engagement* with the non-Active-Advocate audience.

We created 4 emails (A, B, C, and D). Email A was our current standard Mozilla Foundation template, with longer-form text and no graphics aside from the Mozilla logo. Emails B, C, and D were all relatively similar: less text, more whitespace, and larger graphics, based on the current Firefox + You newsletter template. Each email had the same subject line, and each was sent to a random cohort of 91,000 subscribers, approximately 5% of the Foundation list, excluding Active Advocates.
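
For reference, here is a minimal sketch of how a random split like this might be produced. The subscriber records and the is_active_advocate field are hypothetical stand-ins for whatever fields our email platform's export actually provides.

```python
import random

def draw_test_cohorts(subscribers, cohort_size=91_000, n_cohorts=4, seed=1):
    """Randomly assign non-Active-Advocate subscribers to test cohorts A-D."""
    # Exclude Active Advocates from the test pool; they only get the final send.
    pool = [s for s in subscribers if not s["is_active_advocate"]]
    random.Random(seed).shuffle(pool)
    labels = "ABCD"[:n_cohorts]
    # One cohort per email variant; everyone left over receives the winner later.
    cohorts = {labels[i]: pool[i * cohort_size:(i + 1) * cohort_size]
               for i in range(n_cohorts)}
    remainder = pool[n_cohorts * cohort_size:]
    return cohorts, remainder
```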

(Screenshots of the four test templates: A, B, C, and D.)

Due to time constraints, we ended the test after 8 hours and selected Email D as the winner. It was then sent to the remaining 80% of the Foundation list, including Active Advocates.

  • Clickthrough Rates at the 8-hour mark
    • A = 3.46%
    • B = 3.44%
    • C = 3.44%
    • D = 3.53% – Winner

We continued to monitor the test, and at the 48-hour mark we were confident that Email D was indeed the winner on Clickthrough Rate.

  • Clickthrough Rates at the 48-hour mark
    • A = 5.10%
    • B = 5.18%
    • C = 5.11%
    • D = 5.35% – Winner
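
As a back-of-envelope check on how confident we can be in a gap this small, a two-proportion z-test is one standard tool. The sketch below compares Email A against Email D at the 48-hour mark; the click counts are reconstructed from the reported rates and the ~91,000-per-cohort send size, so they are approximate.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for the difference between two clickthrough rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))            # two-sided p-value
    return z, p_value

n = 91_000
z, p = two_proportion_z(4641, n, 4869, n)  # ~5.10% (A) vs ~5.35% (D)
print(f"z = {z:.2f}, p = {p:.4f}")         # roughly z = 2.4, p ~ 0.017
```

By this rough estimate, the 48-hour gap clears the conventional 0.05 significance threshold, which matches our confidence at that point; the same calculation on the 8-hour numbers (3.46% vs. 3.53%) would not.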

We also attempted to calculate an Action Rate for each email, using the Google Analytics "Video Started" event as our metric. Intriguingly, Email D did not have the highest Action Rate.

  • Action Rate at the 48-hour mark
    • A = 4.02% – Winner
    • B = 3.49%
    • C = 3.38%
    • D = 3.62%
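
For clarity, Action Rate here is GA "Video Started" events divided by delivered emails. A small sketch of the arithmetic; the video-start counts below are back-computed from the reported rates purely for illustration, since the raw event counts are not published here.

```python
def action_rate(video_starts, delivered):
    """Action Rate: GA 'Video Started' events per delivered email, as a percent."""
    return 100.0 * video_starts / delivered

delivered = 91_000  # approximate sends per cohort
video_starts = {"A": 3658, "B": 3176, "C": 3076, "D": 3294}  # illustrative counts
for email in sorted(video_starts):
    print(f"Email {email}: {action_rate(video_starts[email], delivered):.2f}%")
```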

Our original Mozilla email template (Email A) ended up having the highest Action Rate, with Email D coming in second. This could be because our list is familiar with the look and feel of the original template and thus more apt to take action.

While it can be argued that this test provided no definitive answers, there are several useful pieces of information we can take away here.

  1. Our campaign schedule either needs to accommodate a 48-hour testing window before the final send, or we need to increase our test audience size in the hope of seeing stronger results; we were not entirely confident that Email D was the true winner at the 8-hour mark (see the sample-size sketch after this list).
  2. It appears that our Foundation list responds well to both short and long text. At the completion of our test, both Clickthrough Rates and Action Rates for all 4 emails were above our 2015 internal benchmarks.
  3. When offered a linked button vs. a linked image with a play icon (Emails C and D), the button had a Clickthrough Rate nearly 3 times higher.
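
On point 1, the audience size needed to detect a given lift can be estimated up front with a standard two-proportion sample-size approximation. The sketch below asks how many recipients per variant we would need to reliably detect an absolute lift of 0.25 points on a 5% baseline Clickthrough Rate (roughly the A-vs-D gap we saw); the baseline, lift, and power targets are illustrative assumptions.

```python
from statistics import NormalDist

def sample_size_per_variant(p_base, lift, alpha=0.05, power=0.80):
    """Recipients per variant to detect an absolute CTR lift.

    Standard approximation: n = (z_alpha/2 + z_power)^2 * (p1*q1 + p2*q2) / lift^2
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    p1, p2 = p_base, p_base + lift
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_power) ** 2 * variance / lift ** 2

# e.g. detect a 5.00% -> 5.25% lift with 80% power at alpha = 0.05
print(round(sample_size_per_variant(0.05, 0.0025)))  # ~122,000 per variant
```

By that rough math, cohorts somewhat larger than our 91,000 would be needed to confidently call a winner this close, which is consistent with our hesitation at the 8-hour mark.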

The ultimate takeaway here is that more testing is needed. We need to build a consistent testing process that is regularly scheduled into campaign calendars, and tests should be repeated so we can identify real trends in our audience's behavior.

  *Engagement is defined as clicks and video views

 
