I like to remind Emma customers to send their campaigns to their Emma test group before sending to their entire audience, but I'm not sure I stress enough the importance of split testing. To find out what really works for your unique audience, create two versions of the same campaign and see what kind of effect a particular variable has on your open rates.
What kinds of variables? Glad you asked. Let's take a closer look at three customers and three different variables.
Emma agency: Outbox Design + Marketing
Split test: Subject line
Peru Mission, a client of Emma agency Outbox Design + Marketing, sends monthly email campaigns to their audience of over 3,000 recipients. They split their audience in half in February and sent a campaign with two different subject lines: A) February News from Peru Mission vs. B) University Students Explore Christianity, Women's Group Forms, Parish Furniture Evolves, and Trujillo Homecomings.
Let's break down some assumptions about subject lines before we dive into the results. Many marketers will tell you that a shorter subject line is better than a longer one; in this case, subject line A wins the battle for length, coming in at 31 characters, while subject line B contains 113 characters. And then there's the issue of uniqueness. We've told you that a generic subject line is no good, and that you're better off giving a teaser of the content to come. In that case, subject line B edges out subject line A.
So, what happened in Peru Mission's test? The results may surprise you, as they did Heidi MacDonald, who manages their monthly emails. The campaign with subject line A received a whopping 45.18% open rate, while subject line B came in with a strong, but much lower, 22.55% open rate. Should Heidi chalk it up to her subscribers recognizing and preferring the shorter subject line? She could, but she's smarter than that.
She was skeptical of the results and took a closer look. Heidi says, "I began to suspect that the way we split the list [alphabetically] was not fair. After a little more investigation, we discovered that though the lists were split alphabetically, the second list (the one who received the long subject line) was full of email addresses without recipient names. Any email address we had that we didn't have more information for (i.e. first and last name) went to that second list. And probably, the less information we have for somebody, the less likely they are to be interested in the Mission and the less likely they are to open the email."
Heidi went a step further to prove her theory right. In March, she split the list randomly, and sent their March campaign with two subject lines — one short and generic, the other long and specific. And the open rates turned out evenly (32.44% and 32.11%). As Heidi has discovered, "The people who read the emails are generally going to read no matter what the subject line is because they are interested in what we have to say."
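Heidi's randomized split is the key fix here, and it's worth spelling out: shuffling the list before splitting it keeps ordering quirks (like a block of name-less addresses at the end of an alphabetical sort) from piling up in one group. Here's a minimal sketch in Python, assuming a simple list of subscriber addresses (the data and function name are illustrative, not Emma's actual tooling):

```python
import random

def random_split(subscribers, seed=None):
    """Shuffle a copy of the audience, then halve it, so neither
    group inherits an ordering quirk (alphabetical, signup date, etc.)."""
    rng = random.Random(seed)
    shuffled = list(subscribers)   # copy; leave the original list alone
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

# Every subscriber lands in exactly one of the two halves.
group_a, group_b = random_split(["ana@x.org", "ben@x.org", "cam@x.org", "dee@x.org"])
```

Passing a fixed seed makes the split reproducible, handy if you later need to check which half received which subject line.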
What I love about Heidi's split test is that it revealed something completely different from what she initially set out to discover. She may not need to focus closely on subject line strategy going forward, but now she can spend some time figuring out how to better engage the audience members she knows little about. She could send a survey to find out more about them, or send a targeted campaign asking them to manage their email preferences.
Emma agency: Halogen
Split test: RSVP address
Wes Bentley of Halogen was game to test two RSVP addresses. (Your RSVP address is the email address your campaign appears to come from when it lands in your recipients' inboxes.) He sent two identical campaigns in March: one from firstname.lastname@example.org, the other from his personal email address at the company.
Both addresses are valid email addresses and both provide brand awareness (they have the @halogen-design.com domain in common) so the question here was whether or not recipients would respond differently to receiving an email from an alias versus an individual.
You might expect an email from an individual to perform better than one sent from a company alias. It seems more personal, right? However, for some audiences, it's actually more important that your RSVP address remain consistent. For one thing, subscribers may grow accustomed to looking out for emails from that address. For another, it's likely the address they've already added to their safe sender list or address book.
In Wes' case, email@example.com is the address he typically sends from, and it's the one that performed slightly better: a 20.4% open rate versus a 17.1% open rate for the personal address. Still, there's not much of a spread here, and it may be worthwhile to run a few more tests in the future.
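One way to judge whether a spread like 20.4% versus 17.1% is real or just noise is a two-proportion z-test. A quick Python sketch (the send counts below are hypothetical stand-ins; plug in your own sends and opens):

```python
from math import erf, sqrt

def open_rate_z_test(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test: how likely is a gap this big between
    two open rates if the variable actually made no difference?"""
    rate_a, rate_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (rate_a - rate_b) / std_err
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 500 sends per group at roughly Wes' observed rates.
z, p = open_rate_z_test(opens_a=102, sent_a=500, opens_b=86, sent_b=500)
```

A p-value under about 0.05 suggests the gap probably isn't chance. With modest group sizes, even a three-point spread often isn't conclusive, which is exactly why running a few more tests before declaring a winner makes sense.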
Emma customer: The Ark Church
Split test: Time of day
Kyle Kutter manages The Ark Church's media and communications and sends emails to an audience of more than 7,000 subscribers. In February, he split the audience and tested two send-times: 9:00 am and 3:00 pm. Kyle says that he typically sends between 9 and 11 in the morning, so trying an afternoon send was something different. And different can be good. If you don't try something new, you don't have anything to compare to.
Kyle's open rates were very similar — 18.06% for the 9:00 am send and 17.17% for the 3:00 pm send — and as he explained to me over the phone, he was surprised the afternoon send provided nearly the same open rate. It could be that, like Heidi of Peru Mission discovered, Kyle's most engaged readers will open no matter what time they receive the email. (And keep in mind that no matter what time you send, the times folks receive the email will depend upon how quickly their servers accept it.) In fact, most of his audience members are people who've filled out a signup form right in The Ark Church building. They have a direct connection to The Ark Church, and as a result, Kyle sees very low opt-out rates.
Still, if an afternoon versus a morning send-time doesn't make a huge difference, Kyle says he'd like to strategize ways to increase his open rates. He mentioned doing a content shake-up, such as changing the visual format of his campaigns or moving the social share buttons. And he'd like to segment the regular openers into their own group and send specifically to them.
If these three split tests didn't reveal dramatically different results, what's the point?
Maybe it's tempting to look at these results and return to the same ol' way (and time, and format) you're sending emails. But you'd be missing out on the silver lining here. Drastic differences in open rates or not, these split tests revealed subtler, and more significant, steps these customers can take to segment and better engage their audience.
And since you've got your own unique audience members with their own habits and behaviors, your split tests might reveal something else entirely. If you're ready to test a few variables, give these a try:
- Subject line
- RSVP from name or address
- Send time based on time of day
- Send time based on day of week (maybe a weekday versus a Saturday or Sunday)
And try these variables for testing click-throughs:
- Placement of the key story in the campaign
- Copy of a call-to-action button
- Personalization variations (such as opening with a personal salutation versus none)
We'd love to hear how your testing goes. And if you have a compelling test to share and would like to be featured in a blog post, let us know!