November 12, 2009
Case Study

Behavior-based Email Send Times Lift Opens, CTRs and Referrals: Test and Results

SUMMARY: Most email marketers are happy to have a major testing breakthrough every once in a while. But see how a financial services brand recently scored big with multiple tests in this initial Case Study of a two-part series.

First: How an A/B test showed that past response data can be used to send emails at list members' individually preferred times. Includes tips on list growth efforts, subject line tests and segmentation.

Lisa Friedman, Sr. Director of Marketing, could see from her open and clickthrough rates that her email list was underperforming. This low engagement meant the free personal financial management service was losing opportunities to move subscribers down the funnel to sign up for one of its third-party paid services, and thereby earn the site an affiliate fee.

Friedman and her team suspected email messages weren’t reaching subscribers at times when they were ready to engage. They also wanted to encourage a viral effect by having users refer their friends to the service.

"One of the things we really were looking to do was better target our users and get them to spread the word about our service," Friedman says. "We wanted to test out different tactics and psychologies to improve our email program’s performance."


Friedman and her team also wanted to dramatically increase new registrants at the site (and subsequently, grow the email list). As you’ll see, they certainly had a lot on their hands.

To get the job done, they took the following five steps:

Step #1. Build a bigger list

Before the initiative started early this year, the young brand knew they hadn’t been as committed to their email program as they could have been. Simply put, it was time to focus on list growth.

Friedman and her team came up with a multi-pronged campaign to drive traffic to the site. They used:
o Social media sites
o A blog on the flagship site
o Search engine optimization
o Public relations, including TV and radio news segments
o Word of mouth

Visitors attracted by these promotions landed a single click away from a registration page designed to be simple to complete. It included a handful of fields:
o Email address
o Confirm email address
o Zip code
o Password
o Confirm password
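As a rough illustration, a five-field form like the one above can be validated in just a few lines. The sketch below is hypothetical; the field names, zip format and password rule are assumptions for illustration, not the brand's actual code:

```python
import re

# Simple email shape check: something@something.tld (an assumption, not RFC-complete)
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_registration(form):
    """Return a list of error messages for the five-field form; empty means valid."""
    errors = []
    if not EMAIL_RE.match(form.get("email", "")):
        errors.append("invalid email")
    if form.get("email") != form.get("confirm_email"):
        errors.append("email fields do not match")
    if not re.fullmatch(r"\d{5}", form.get("zip", "")):
        errors.append("zip code must be five digits")
    if len(form.get("password", "")) < 6:
        errors.append("password too short")
    if form.get("password") != form.get("confirm_password"):
        errors.append("password fields do not match")
    return errors
```

Keeping the rule set this small is part of the point: every extra field or constraint adds friction to the registration page.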

Step #2. Program personalized send times

Next, the team worked with their email service provider (see useful links below) to analyze recipient behavior on a rolling basis and predict the best delivery time for each address on the mailing list.

The team began delivering messages to recipients at the time of day they had shown they were most likely to open and click.

Here are a few examples of how the program worked:

- If recipient A displayed a tendency to open emails at 5 p.m., the brand's campaigns, alerts and triggered messages would be sent at that time.

- If recipient B often opened emails at 3 a.m., all messages to that address would be sent at that time.

- If recipients started opening the emails at different times of the day, the system adapted and began sending according to this newly exhibited behavior.
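The adaptive logic described above can be sketched in a few lines. This is a hypothetical Python illustration, not Silverpop's actual system: it picks each recipient's most common open hour from past behavior, and falls back to the brand's 6 a.m. default when there is no history. Recomputing it on a rolling basis makes the preferred hour shift as new opens arrive:

```python
from collections import Counter
from datetime import datetime

DEFAULT_SEND_HOUR = 6  # the brand's original 6 a.m. blast time

def preferred_send_hour(open_timestamps):
    """Return the hour of day (0-23) at which this recipient most often
    opens email, falling back to the default when there is no history."""
    if not open_timestamps:
        return DEFAULT_SEND_HOUR
    hour_counts = Counter(ts.hour for ts in open_timestamps)
    # most_common(1) returns [(hour, count)]; ties break arbitrarily
    return hour_counts.most_common(1)[0][0]
```

A production system would likely weight recent opens more heavily and handle time zones, but the core idea is just this frequency count per address.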

Step #3. Run A/B test

Before implementing the timing system across the board, Friedman and her team ran a month-long A/B test to see if it indeed worked:
o Half of the list received messages at their behaviorally established times
o Half received messages at 6 a.m. EDT

"We had been using 6 a.m. EDT as our send time, and that’s why it was the control send time," Friedman explains. "We always sent it then so it would be in the morning inboxes on the East and West Coasts."

Step #4. Remove inactive names

The team knew they weren’t going to improve opens and clickthroughs by optimizing send times alone. They needed to take a hard look at names that showed a fading interest in the brand.

Friedman and her team decided to create a segment of names that had opened an email during the last four months. They targeted these recently-active subscribers from that point forward.
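A recency segment like this one amounts to a simple filter over last-open dates. The sketch below is illustrative; the 120-day window (roughly four months) and the data shapes are assumptions:

```python
from datetime import datetime, timedelta

def active_segment(subscribers, last_open_dates, now=None, window_days=120):
    """Keep only addresses whose most recent open falls inside the trailing
    window (roughly four months by default). Addresses with no recorded
    open are treated as inactive and dropped."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=window_days)
    return [email for email in subscribers
            if email in last_open_dates and last_open_dates[email] >= cutoff]
```

Mailing only this segment shrinks the list but raises the measured open and clickthrough rates, which is exactly the effect Friedman describes in the results below.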

Step #5. Test subject lines

Lastly, Friedman wanted to get a better grasp on which types of subject lines and offers made the audience respond. Her team tested two incentive-based subject lines/offers against a generic subject line for the control group.

- The control subject line: "Spread the word."

- Test subject line A: "Spread the word and win an iPod."

- Test subject line B: "Want early access to new features?"

The email body copy reflected the subject line messaging, but all three emails used the same design template. The creative reflected the aim of the campaign, which was to drive referrals.


RESULTS

The lead-generation campaign and its simple registration page worked wonders for the team's list growth: roughly 3,300 new site users/email subscribers were added per day for six consecutive months.

Then, the A/B test showed that basing send times on a behaviorally-indicated preference was a winning approach. The personalized send times delivered improvements over the 6 a.m. control time on two all-important data fronts:
o Open rate increased 7%
o Clickthrough rate increased 13%

"We were looking to improve our open and clickthrough rates, and that’s what [we] did," Friedman says. "Obviously, if the timing is right, that’s important. Having that edge makes a big difference."

Friedman adds that segmenting the list to people who had opened or clicked in the previous four months was also part of the numbers lift. "One of the reasons why it was so successful is because it targeted active users of our product. We didn’t just send it to everybody."

While specific metrics for the subject line test weren’t available, Friedman says her team was intrigued to see the audience open the "Want early access to new features?" subject line at the highest clip. It showed her team that emphasizing the site’s usability upgrades and features was a strong way for her brand to break through the inbox clutter.

"They use our service and like it," Friedman explains. "And they wanted to get a sneak peak at the new features offered more than [they wanted to] win an iPod Nano."

Finally, even though they hadn’t had a refer-a-friend program before and therefore had no benchmark to compare against the test performance, the team was confident they had accomplished a key goal by increasing viral sharing.

"The more sharing we get, the more referrals, and ultimately, more registered users," Friedman says. "And that’s the bottom line."

Useful links related to this article

Creative Samples from the brand's email tests

Overnight Send Time for Email Lifts Open Rate, CTR, Registrations

Silverpop: The team’s email service provider
