Allison Sparrow


A/B Testing Emails in Marketo

This post outlines how to test emails when you want to:

  • test for statistical significance
  • test an email in a nurture
  • have clear reporting when using an email performance report
  • track what variation a lead received

Email Programs are a great place to start with A/B testing: they let you test a variety of elements in your email and then declare a winner after a send.

[Screenshot: Email Program dashboard]

But what about when you want to A/B test an email in your nurture program? Email Programs come with their limitations:

[Screenshot: Email Program limitations]

I’m going to show you how I A/B test my email blasts and email nurtures. Treat this as an alternative way to test if you’re running into one of the challenges above.

An A/B test for a simple email is easy to set up, and usually your objective will be opens and/or clicks. As pre-work, I strongly recommend outlining your hypothesis, goals and variable for your A/B test. You can use the framework I’ve outlined in my other post as a reference. Here’s an example of an email A/B test outline.
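
If you like to keep that outline somewhere machine-readable, here’s a minimal sketch of one as a Python dict. The field names and example values are my own illustration, not a Marketo construct:

    # A minimal, illustrative A/B test outline -- the fields and values
    # here are examples, not a Marketo object.
    test_outline = {
        "hypothesis": "A funny subject line will out-perform a serious one",
        "variable": "subject line",        # test one variable at a time
        "control": "Email.v1 (serious subject line)",
        "variation": "Email.v2 (funny subject line)",
        "split": "50/50",
        "goal": "opens and/or clicks",
    }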

In this example, we are going to

  • test an email used in a nurture program
  • create a 50/50 split test
  • test the subject line as a variable

Start from a default program in Marketo

First, create a program in Marketo that contains an email. If you are going to test an existing email that’s already live, skip this part and jump to the next section: Emails.

[Screenshot: default program]

Emails

Create your control email. This is the email that you’re going to try to beat with your variation email.

You may be testing an existing email, or you may be creating a new one. Either way, your control email should be the content you want to beat. So if your hypothesis is that “a funny subject line will be more effective than a serious one,” your control email is the serious one.

Once your control email is completely reviewed, tested, and approved, simply clone that email and append v2 to the name of the variation:

[Screenshot: control and v2 emails]

Smart Campaigns

OK! Now you’re going to set up campaigns that will automate the testing for you.

Campaign: Send Email

Depending on whether this is a blast or a nurture email, the Smart List will vary; everything else is the same. In this example, I’ll show the setup for a nurture email.

Smart List

Member of any Engagement Program (the filtering actually happens in the stream itself, once you drag the entire program into the stream).

[Screenshot: engagement program smart list]

Flow

[Screenshot: send email flow]
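
Marketo’s flow step handles the 50/50 split for you, but if you want the intuition, here’s a rough Python sketch of a deterministic split. Hashing the lead ID (my own illustration, not how Marketo implements it) means the same lead always lands in the same bucket, which makes it easy to track what variation a lead received:

    import hashlib

    def assign_variation(lead_id: str) -> str:
        # Hash the lead ID and take it mod 2 for a stable 50/50 bucket.
        bucket = int(hashlib.md5(lead_id.encode()).hexdigest(), 16) % 2
        return "Email.v1" if bucket == 0 else "Email.v2"

    print(assign_variation("lead-12345"))  # same lead, same answer, every time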

Schedule

Do nothing

For Nurture

Add to Engagement Stream

[Screenshot: nurture smart campaign]

Activate the campaign

[Screenshot: activating the nurture campaign]

Reports

Let’s create a basic report so you can check your test at a moment’s notice!

Report: Email Performance

Smart List: none

Setup:

Sent Date: All Time

Marketing Activities Emails: Email.v1, Email.v2

[Screenshot: email filter setup]

That’s it! So simple! Now when you look at your email performance report, you will easily see which one is performing better:

[Screenshot: email performance results]
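
If you’d rather crunch the numbers outside Marketo, you can export the report and compare rates yourself. A quick pandas sketch, assuming a CSV export with "Email", "Sent", "Opened", and "Clicked" columns (adjust the names to match your actual export):

    import pandas as pd

    # Load the exported Email Performance report.
    df = pd.read_csv("email_performance.csv")

    # Compare open and click rates between Email.v1 and Email.v2.
    df["open_rate"] = df["Opened"] / df["Sent"]
    df["click_rate"] = df["Clicked"] / df["Sent"]
    print(df[["Email", "open_rate", "click_rate"]])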

Resources

Increase Customer Engagement within 30 days

Platform adoption is critical for understanding how engaged your customers are. There are many reasons why customers do or don’t engage with your product, and testing hypotheses about those reasons will help you better understand the challenges customers face.

I ran a split test on a customer nurture program and increased platform engagement by 10% within 30 days. Here’s an overview of the customer nurture test; I hope it inspires you to get curious about testing!

I collaborated with the Optimizely Education team to determine how to increase product engagement with our new self-service customers. We knew that a customer was less likely to churn if they logged in multiple times within the first 30 days. We designed a customer nurture program to educate these new customers on various features and resources Optimizely had to offer. Our hunch was that by providing focused material to our new customers, they would be more inspired to use our product.

Hypothesis: If we send our self-service customers a series of resource-rich emails, they will be more likely to engage with our Testing solution within the first 30 days.
Target Audience: New self-serve customers
Primary Goal: Optimizely logins
Cadence of Nurture: Every Thursday for 4 weeks
Results: Variation saw a 9% increase in logins within the first 30 days

Our hypothesis: if we send our self-service customers a series of resource-rich emails, they will be more likely to engage with our Testing solution. There are a lot of psychological barriers to getting started with testing, and our customers may not know where to start.

The program contained 8 emails, bucketed into three offer categories: Resource, Sell, or Feature. CTAs included an ebook on building a data-driven team, a how-to guide for setting up a goal in Optimizely, and a list of popular testing ideas.

[Screenshot: customer nurture emails]

We held back 50% of our self-service customers from receiving this nurture. After 30 days, we already saw positive results! We tracked multiple activities and measured a significant impact on Optimizely logins and dashboard views. Within 30 days, the variation saw a 9% increase in logins. Because of this result, we sent the nurture program to 100% of our customers.
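
For the curious, “lift” here is just the relative increase in login rate of the nurture group over the holdout. A quick sketch with hypothetical counts (the real numbers aren’t in this post):

    # Hypothetical counts, for illustration only.
    holdout_customers, holdout_logins = 1000, 400
    nurture_customers, nurture_logins = 1000, 436

    holdout_rate = holdout_logins / holdout_customers   # 0.400
    nurture_rate = nurture_logins / nurture_customers   # 0.436
    lift = (nurture_rate - holdout_rate) / holdout_rate
    print(f"Lift: {lift:.0%}")  # Lift: 9%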


How to receive 165% more NPS survey responses

With one simple test, I was able to get a 165% increase in our survey responses. Here’s how I did it.

Optimizely takes customer feedback seriously. We’ve created tight feedback loops that allow customers to share their experience with us. We A/B tested the copy of the support feedback email that’s delivered to customers who submit a support ticket.

Test stats and learnings

Elements: Support Survey Test
Target Audience: Enterprise customers who have submitted a ticket to our support team.
Hypothesis: If we send a support survey that refers to the exact support ticket information, then the customer will be more likely to click on the survey to provide feedback. This will give the customer immediate context and empower them to act quickly.
Variable: Email copy
Goal: Increase clicks by 50%
Result: 165% increase in clicks from variation recipients at 100% Statistical Significance*


Email variations
[Screenshots: control email (ops-control-email) vs. variation email (ops-variation-email)]

Modifications made
  • We updated the subject line so the customer would find this email in the same support ticket thread.
  • We linked to the ticket so the customer could review the exact ticket they were being asked to provide feedback on.
  • We referenced the support engineer by name for a personalized experience.

Given the outsized success of this campaign, the biggest learning is that personalization matters. Our customers have reached a level of sophistication where they expect a tailored experience with us, even in a support ticket. Every bit of personalization a marketer can add can translate into big wins for a campaign.

What do you think about this test? Have you had experiences testing out support surveys?

*I’m a geek, so I like to track statistical significance for our A/B tests. If you don’t understand statistical significance, that shouldn’t deter you from A/B testing!
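
If you want to check significance yourself, a two-proportion z-test is a standard way to do it. Here’s a self-contained Python sketch; the counts are hypothetical, chosen so the variation shows a 165% lift like the result above:

    from math import erf, sqrt

    def two_proportion_z_test(clicks_a, sends_a, clicks_b, sends_b):
        # Pooled two-sided z-test for a difference in click rates.
        p_a, p_b = clicks_a / sends_a, clicks_b / sends_b
        pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
        se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
        z = (p_b - p_a) / se
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF
        return z, p_value

    # Hypothetical: 4% control CTR vs 10.6% variation CTR (a 165% lift).
    z, p = two_proportion_z_test(40, 1000, 106, 1000)
    print(f"z = {z:.2f}, p-value = {p:.1e}")  # tiny p-value => significant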
