
A/B Testing Emails in Marketo

This post outlines how to test emails when you want to:

  • test for statistical significance
  • test an email in a nurture
  • have clear reporting when using an email performance report
  • track what variation a lead received

Email Programs are great if you're just getting started with A/B testing: they let you test a variety of elements in your email and then declare a winner after the send.

[Screenshot: Email Program dashboard]

But what about when you want to A/B test an email in your nurture program? Email Programs come with their limitations:

[Screenshot: Email Program limitations]

I'm going to show you how I A/B test my email blasts and email nurtures. Consider this an alternative approach if you're running into one of the challenges above.

An A/B test for a simple email is easy to set up, and usually your objective will be opens and/or clicks. As pre-work, I strongly recommend outlining your hypothesis, goals and variable for your A/B test. You can use the framework I’ve outlined in my other post as a reference. Here’s an example of an email A/B test outline.

In this example, we are going to:

  • test an email used in a nurture program
  • create a 50/50 split test (the sketch after this list illustrates the idea)
  • test the subject line as a variable
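Marketo performs the random split for you inside the smart campaign, but if it helps to see the idea, here's an illustrative Python sketch of a 50/50 assignment. This is conceptual only, not how Marketo implements it:

```python
import hashlib

def assign_variant(lead_id: int) -> str:
    """Deterministically bucket a lead into a 50/50 split by hashing its ID."""
    bucket = int(hashlib.md5(str(lead_id).encode()).hexdigest(), 16) % 2
    return "Email .v1 (control)" if bucket == 0 else "Email .v2 (variation)"

# The split is stable per lead and roughly even overall.
counts = {"Email .v1 (control)": 0, "Email .v2 (variation)": 0}
for lead_id in range(10_000):
    counts[assign_variant(lead_id)] += 1
print(counts)  # roughly 5,000 leads in each bucket
```

A deterministic assignment like this keeps each lead in the same bucket on every run, which is the property you want: a lead should never see both variations.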

Start from a default program in Marketo

First, create a program in Marketo that contains an email. If you are going to test an existing email that’s already live, skip this part and jump to the next section: Emails.

[Screenshot: default program containing an email]

Emails

Create your control email. This is the email you're going to try to beat with your variation.

You may be testing an existing email or creating a new one; either way, the control should be the content you want to beat. So if your hypothesis is that “a funny subject line will be more effective than a serious one,” your control email is the serious one.

Once your control email is fully reviewed, tested, and approved, simply clone it and append .v2 to the name of the variation:

[Screenshot: control (.v1) and variation (.v2) emails in the program]

Smart Campaigns

OK! Now you’re going to set up campaigns that will automate the testing for you.

Campaign: Send Email

Depending on whether this is a blast or an email nurture, the Smart List will vary. Everything else is the same. In this example, I’ll show how you set it up as a nurture email.

Smart List

Use the filter Member of Engagement Program, set to any. The real filtering happens in the stream itself once you drag the entire program into it.

[Screenshot: Member of Engagement Program filter]

Flow

The Send Email flow step performs the split. A common Marketo pattern is to add a choice on Random Sample so that 50% of leads receive the control (.v1) and the default choice sends the variation (.v2):

[Screenshot: Send Email flow step]

Schedule

For a nurture email, do nothing here; the engagement stream's cadence controls when the email goes out.

For nurture

Add the program to your engagement stream:

[Screenshot: adding the program to the engagement stream]

Activate the campaign

[Screenshot: activating the smart campaign]

Reports

Let’s create a basic report so you can check your test at a moment’s notice!

Report: Email Performance

Smart List: none

Setup:

Sent Date: All Time

Emails (from Marketing Activities): Email .v1, Email .v2

[Screenshot: email filter in the report setup]

That’s it! So simple! Now when you look at your email performance report, you will easily see which one is performing better:

[Screenshot: Email Performance report results]
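If you want to sanity-check the numbers outside Marketo, you can export the Email Performance report to CSV and compare rates with a short script. A minimal sketch, assuming hypothetical column names (match them to your actual export):

```python
import csv

# Column names here are assumptions -- adjust them to your actual CSV export.
with open("email_performance.csv", newline="") as f:
    for row in csv.DictReader(f):
        sent = int(row["Sent"])
        opens = int(row["Opened"])
        clicks = int(row["Clicked"])
        print(f"{row['Email Name']}: "
              f"open rate {opens / sent:.1%}, click rate {clicks / sent:.1%}")
```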

Resources

How to receive 165% more NPS survey responses

With one simple test, I was able to get a 165% increase in our survey responses. Here’s how I did it.

Optimizely takes customer feedback seriously. We've created tight feedback loops that let customers share their experience with us. In this case, we A/B tested the copy of the support feedback email that's delivered to customers who have submitted a support ticket.

Test stats and learnings

Test: Support survey
Target audience: Enterprise customers who have submitted a ticket to our support team.
Hypothesis: If we send a support survey that refers to the exact support ticket information, then the customer will be more likely to click on the survey to provide feedback. This gives the customer immediate context and empowers them to act quickly.
Variable: Email copy
Goal: Increase clicks by 50%
Result: 165% increase in clicks from variation recipients at 100% statistical significance*
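For context, the “increase in clicks” here is relative lift in click-through rate. The post doesn't share raw counts, so here's the arithmetic with hypothetical numbers:

```python
# Hypothetical counts -- the post doesn't share the raw numbers.
control_clicks, control_sent = 40, 1000       # 4.0% click rate
variation_clicks, variation_sent = 106, 1000  # 10.6% click rate

control_rate = control_clicks / control_sent
variation_rate = variation_clicks / variation_sent
lift = (variation_rate - control_rate) / control_rate
print(f"lift: {lift:.0%}")  # -> lift: 165%
```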

 

Email variations

[Screenshots: control email (ops-control-email) and variation email (ops-variation-email)]

 

Modifications made
  • We updated the subject line so the customer would find this email in the same support ticket thread.
  • We linked to the ticket so the customer could review the ticket they were asked to provide feedback on.
  • We referenced the support engineer for a personalized experience.

Given the success of this campaign, the biggest learning is that personalization matters. Our customers have reached a level of sophistication where they expect a tailored experience with us, even in a support ticket. Every little bit of personalization a marketer can add can translate into huge wins for a campaign.

What do you think about this test? Have you had experiences testing out support surveys?

*I'm a geek, so I like to track statistical significance for our A/B tests. If you don't understand statistical significance, that shouldn't deter you from A/B testing!
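If you do want to check significance yourself, a two-proportion z-test is one common approach for comparing click rates. A minimal sketch in plain Python, with hypothetical counts:

```python
import math

def two_proportion_z_test(clicks_a, sends_a, clicks_b, sends_b):
    """Two-sided z-test comparing two click rates; returns (z, p_value)."""
    rate_a, rate_b = clicks_a / sends_a, clicks_b / sends_b
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (rate_b - rate_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical counts for illustration.
z, p = two_proportion_z_test(40, 1000, 106, 1000)
print(f"z = {z:.2f}, p = {p:.2g}")  # p < 0.05 -> significant at the 95% level
```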
