April 17, 2012
Case Study

Email Marketing: 17.36% higher average clickthrough rate in 7 personalized subject line tests

SUMMARY: Personalization is one of the oldest tactics in the book. Subscribers aren't wowed by seeing their names in subject lines, but does that mean the tactic is outdated?

This case study shows how a B2B marketing team tested personalized subject lines in seven consecutive emails to see once and for all if the tactic worked for its audience. You'll see the metrics from all seven tests and the surprising results.
by Adam T. Sutton, Senior Reporter

CHALLENGE

Personalization is a tried-and-true email tactic. Marketers have improved results by adding first names to subject lines and greetings for years. But is the tactic a little threadbare?

Amanda Gagnon, Education Marketing Associate at AWeber, was not sure today's subscribers cared whether an email addressed them by name. Furthermore, her team at AWeber, an email marketing software provider, speaks to marketers. Wouldn't this audience have been familiar with personalization for years?

"I personally considered this something that our savvy subscribers would roll their eyes at," Gagnon says. "Personalization has sort of been done to death."

Gagnon knew that personalization improved results at other companies, but she and her team were skeptical that it could work with AWeber's audience.

CAMPAIGN

The team tested personalized subject lines in seven consecutive emails in February. Here's how the team set up the test so it could be certain whether personalization boosted response.

Step #1. Select an email list

The team focused this test on its blog subscribers, people who had opted in to twice-weekly emails with links to AWeber's most recent posts. The test was part of an ongoing effort to improve these emails.

"We had taken basically the same approach [in the blog email] since we started," Gagnon says. "We wanted to see what could optimize our responses."

Step #2. Set a test schedule

Rather than testing one email, the team chose to test seven consecutive emails sent in February. This would provide a clear indication of whether personalization would improve results.

The blog email is delivered on Tuesdays and Thursdays. The test emails were sent on the following days in February: 2nd, 7th, 14th, 16th, 21st, 23rd, 28th.

Step #3. Select a single variable to test

The team set up a basic A/B test for each of the days listed above. A random half of its list would receive the normal, unaltered email (email A), and the other half would receive a test email (email B).

The team chose one part of the email to test: the subject line. The only difference between the emails was that the subject line for email B included the first name of the subscriber as the first word.

For example, here are the two subject lines for the email sent on Feb. 28:
  • Email A: "Email Marketing Advice From 2 Guys (Who Know What They're Doing)"

  • Email B: "[First Name], Email Marketing Advice From 2 Guys (Who Know What They're Doing)"

The team could have personalized the body of the email, or tried variations on how the subscriber's name was referenced in the subject line. But that would have made the results less certain. By focusing on a single change to a single part of the email across multiple sends, the team could draw definitive conclusions from the results.
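To make the mechanics concrete, here is a minimal sketch in Python of this kind of split test. The subscriber records, function names and seed are illustrative assumptions, not AWeber's actual implementation (its software handles the split automatically):

  import random

  def split_ab(subscribers, seed=2012):
      """Randomly split a subscriber list into equal halves A and B."""
      shuffled = list(subscribers)           # copy; leave the original intact
      random.Random(seed).shuffle(shuffled)  # fixed seed = reproducible split
      mid = len(shuffled) // 2
      return shuffled[:mid], shuffled[mid:]

  def personalize(subject, first_name):
      """Build the email B subject: the first name as the first word."""
      return f"{first_name}, {subject}"

  subscribers = [
      {"email": "pat@example.com", "first_name": "Pat"},
      {"email": "sam@example.com", "first_name": "Sam"},
  ]
  group_a, group_b = split_ab(subscribers)

  subject = "Email Marketing Advice From 2 Guys (Who Know What They're Doing)"
  for sub in group_a:
      print(sub["email"], "->", subject)  # control: unaltered subject
  for sub in group_b:
      print(sub["email"], "->", personalize(subject, sub["first_name"]))  # test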

Step #4. Double check your data

Many people enter their names incorrectly on email registration forms, whether by accident or on purpose. If you're testing personalization, you need to ensure the words you pull from your database do not embarrass your brand.

Clean up your data before launch by correcting or removing the following:
  • Capitalization errors

  • Offensive language

  • False names ("Notgiving Myname")

  • Obvious misspellings

Be careful not to overcorrect, though. You might find that some subscribers prefer unconventional spellings to traditional ones.
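As a sketch of what that cleanup can look like, the filter below fixes all-caps and all-lowercase entries but deliberately leaves mixed-case names alone, per the caveat above. The blocklists and sample names are hypothetical, not AWeber's actual process:

  import re

  OFFENSIVE = {"offensiveword"}                # placeholder blocklist
  FALSE_NAMES = {"notgiving", "test", "asdf"}  # placeholder false names

  def clean_first_name(raw):
      """Return a usable first name, or None to skip personalization.

      Subscribers that return None should simply receive the
      non-personalized subject line instead of an embarrassing one.
      """
      name = raw.strip()
      # Reject anything that isn't a single, plausible name token
      if not re.fullmatch(r"[A-Za-z][A-Za-z'\-]*", name):
          return None
      if name.lower() in OFFENSIVE | FALSE_NAMES:
          return None
      # Fix capitalization only when it is clearly wrong (all caps or all
      # lowercase); mixed-case names like "DeShawn" are left alone.
      if name.isupper() or name.islower():
          name = name.title()
      return name

  for raw in ["AMANDA", "o'brien", "Notgiving Myname", "DeShawn", "123"]:
      print(repr(raw), "->", clean_first_name(raw))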

Tests have to be valid

A/B testing helps reveal answers, but those answers are only as valid as the tests themselves. There are several threats to the validity of your tests; one is a lack of statistical significance.

Statistical significance depends on sample size (the number of emails you're sending in the test) and the number of positive or negative responses to each treatment. Only tests that achieve a "statistically significant" difference in response between versions A and B are valid.
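To make that concrete, here is a minimal sketch of a two-proportion z-test, one standard way to check whether an open-rate difference is significant. The send and open counts are hypothetical, since the case study does not publish list sizes:

  from math import erf, sqrt

  def two_proportion_z_test(resp_a, sends_a, resp_b, sends_b):
      """Two-tailed z-test for the difference between two response rates."""
      p_a, p_b = resp_a / sends_a, resp_b / sends_b
      pooled = (resp_a + resp_b) / (sends_a + sends_b)
      se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
      z = (p_b - p_a) / se
      p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF
      return z, p_value

  # Hypothetical: 10,000 sends per half, open rates like the Feb. 2 test
  z, p = two_proportion_z_test(1386, 10000, 1479, 10000)
  print(f"z = {z:.2f}, p = {p:.3f}")  # significant at 95% confidence if p < 0.05

With these made-up list sizes, a roughly one-point open-rate difference falls just short of 95% confidence (p ≈ 0.06), which illustrates the value of repeating the test across seven sends rather than relying on a single email.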

Check the "useful links" section below for more information about validity, statistical significance, and a free validity tool from our sister company, MarketingExperiments.

RESULTS

* Note: We strive to take an even more rigorous approach when reporting on case studies of companies that have a vested interest in being in front of our audience, such as marketing agencies and vendors. That's why you'll see a laundry list of metrics below, including the open and clickthrough rates for each email tested in the campaign.

Gagnon was somewhat surprised to see that the personalized emails had a 5.13% higher average open rate than the regular emails. But she was really surprised that they also had a 17.36% higher average clickthrough rate (results of all seven emails are below).

"Clicks actually blew opens out of the water," she says. "It turned out that was where the personalization seemed to have the biggest effect."

Results from the 7 tests

The open and clickthrough rates were higher for the personalized emails in every test.

Feb. 2
Subject line A: PadiAct Takes Your Targeted Subscriptions to a New Level
Open rate: 13.86%
CTR: 2.41%

Subject line B: [First name], PadiAct Takes Your Targeted Subscriptions to a New Level
Open rate: 14.79%
CTR: 3.03%

Open rate increase: 6.67%
CTR increase: 25.31%

Feb. 7
Subject line A: Improve Deliverability In Two Simple Steps
Open rate: 14.05%
CTR: 3.07%

Subject line B: [First name], Improve Deliverability In Two Simple Steps
Open rate: 14.71%
CTR: 3.39%

Open rate increase: 4.71%
CTR increase: 10.54%

Feb. 14
Subject line A: Don't Kill the Romance: 7 Email Marketing Buzzkills to Avoid
Open rate: 13.57%
CTR: 3.1%

Subject line B: [First name], Don't Kill the Romance: 7 Email Marketing Buzzkills to Avoid
Open rate: 14.54%
CTR: 3.68%

Open rate increase: 7.15%
CTR increase: 18.76%

Feb. 16
Subject line A: Could An Email Campaign Be Your Business?
Open rate: 14.12%
CTR: 2.57%

Subject line B: [First name], Could An Email Campaign Be Your Business?
Open rate: 14.42%
CTR: 2.81%

Open rate increase: 2.11%
CTR increase: 9.43%

Feb. 21
Subject line A: Bending the Email Best Practices Rules
Open rate: 13.45%
CTR: 2.17%

Subject line B: [First name], Bending the Email Best Practices Rules
Open rate: 14.45%
CTR: 2.6%

Open rate increase: 7.44%
CTR increase: 20.05%

Feb. 23
Subject line A: Effective Marketing: It's All About Your Subscribers
Open rate: 13.09%
CTR: 1.86%

Subject line B: [First name], Effective Marketing: It's All About Your Subscribers
Open rate: 13.47%
CTR: 2.33%

Open rate increase: 2.89%
CTR increase: 25.27%

Feb. 28
Subject line A: Email Marketing Advice From 2 Guys (Who Know What They're Doing)
Open rate: 13.22%
CTR: 2.83%

Subject line B: [First name], Email Marketing Advice From 2 Guys (Who Know What They're Doing)
Open rate: 13.88%
CTR: 3.18%

Open rate increase: 4.97%
CTR increase: 12.12%
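As a quick arithmetic check, the headline averages are simple means of the seven per-test lifts listed above. Recomputing from the rounded figures reproduces the reported 5.13% and comes within a hundredth of a point of the reported 17.36% (the small gap presumably comes from rounding in the per-test numbers):

  # Relative increases reported for the seven tests (percent)
  open_lifts = [6.67, 4.71, 7.15, 2.11, 7.44, 2.89, 4.97]
  ctr_lifts = [25.31, 10.54, 18.76, 9.43, 20.05, 25.27, 12.12]

  print(f"Average open-rate lift: {sum(open_lifts) / 7:.2f}%")  # 5.13%
  print(f"Average CTR lift: {sum(ctr_lifts) / 7:.2f}%")         # 17.35%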

Subject lines impact CTR

If the team changed only the subject lines in each test, then why did the clickthrough rate increase? Aren't subject lines supposed to affect only open rates?

"My theory is mindset," Gagnon says. "People decide whether or not to open an email based on so many factors, and what the subject line says, I think, has a far smaller effect than how much time someone has available."

In other words, a personalized subject line cannot give subscribers more time, but it can influence the people who would likely have opened the email anyway, Gagnon says. Seeing their first names got subscribers more interested in the email and more willing to click.

The team will use personalized subject lines more often going forward, but not for every message, she says.

"We're actually going to test this more," Gagnon says. "I want to do personalization in the body of the email as well."

Useful links related to this article

CREATIVE SAMPLES:
  1. Feb 28 email

  2. Feb 21 email

Basic Validity Tool Download

Bad Data: The 3 validity threats that make your tests look conclusive (when they are deeply flawed)

Email Research: The 5 best email variables to test

Quick Lifts: 4 ideas to increase email clickthrough

Email Marketing: Helzberg Diamonds garners 288% sales lift with animated, personalized promo

Research Update: The state of email marketing testing and optimization

Optimization Summit 2012 in Denver: Measure, Test, Convert

AWeber


