With one simple test, I was able to get a 165% increase in our survey responses. Here’s how I did it.
Optimizely takes customer feedback seriously, and we've built tight feedback loops that let customers share their experience with us. One of those loops is the support feedback email delivered to customers after they submit a support ticket, and we A/B tested its copy.
Test stats and learnings
| Elements | Support Survey Test |
| --- | --- |
| Target audience | Enterprise customers who have submitted a ticket to our support team |
| Hypothesis | If we send a support survey that references the exact support ticket information, then the customer will be more likely to click on the survey to provide feedback. This gives the customer immediate context and empowers them to act quickly. |
| Goal | Increase clicks by 50% |
| Result | 165% increase in clicks from variation recipients at 100% statistical significance* |
- We updated the subject line so the customer would find this email in the same support ticket thread.
- We linked to the ticket so the customer could review the ticket they were asked to provide feedback for.
- We referenced the support engineer by name for a personalized experience.
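The three changes above can be sketched as a simple email template. This is a hypothetical illustration, not our actual code: the field names, ticket URL format, and helper function are all made up for the example. The key trick is reusing the ticket's original subject line so most email clients group the survey into the existing support thread.

```python
# Hypothetical sketch of the personalized survey email from the variation.
def build_survey_email(ticket_id, ticket_subject, engineer_name, survey_url):
    # Reusing the ticket's subject (with "Re:") keeps the survey email
    # in the same thread as the support conversation.
    subject = f"Re: [Ticket #{ticket_id}] {ticket_subject}"
    body = (
        f"Hi,\n\n"
        f"{engineer_name} recently helped you with ticket #{ticket_id}.\n"
        # Link back to the ticket so the customer can review it first.
        f"Review your ticket: https://support.example.com/tickets/{ticket_id}\n"
        f"Tell us how we did: {survey_url}\n"
    )
    return subject, body

subject, body = build_survey_email(
    4821, "Results page not loading", "Jordan", "https://example.com/survey"
)
print(subject)  # Re: [Ticket #4821] Results page not loading
```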
Given the clear success of this campaign, the biggest learning is that personalization matters. Our customers have reached a level of sophistication where they expect a tailored experience with us, even in a support ticket. Every bit of personalization a marketer can add can yield outsized wins for a campaign.
What do you think about this test? Have you had experiences testing out support surveys?
*I’m a geek, so I like to track statistical significance for our A/B tests. If you don’t understand statistical significance, that shouldn’t deter you from A/B testing!
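For readers curious what "statistical significance" means in practice here: a common way to check whether a click-rate lift is real is a two-proportion z-test. This is a minimal, self-contained sketch with hypothetical numbers (the original post does not disclose sample sizes); 40 vs. 106 clicks per 1,000 recipients corresponds to the 165% lift.

```python
from math import sqrt, erf

def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
    """Two-sided two-proportion z-test for click-through rates."""
    p_a = clicks_a / n_a
    p_b = clicks_b / n_b
    # Pooled click rate under the null hypothesis of no difference
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical sample: 40/1000 control clicks vs 106/1000 variation clicks
z, p = two_proportion_z_test(40, 1000, 106, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant if p < 0.05
```

With a gap this large, the p-value is effectively zero, which is why the result reads as "100% statistical significance." Smaller lifts need larger samples before you can call a winner.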