Jan 15, 2009
Case Study

Overnight Send Time for Email Lifts Open Rate, CTR, Registrations

SUMMARY: A test that delivers a marked improvement spurs excitement. But you need to temper that excitement and make sure the results are valid.

Read how an email marketer continued testing send times after a previous test showed a surprising boost in CTR for messages sent overnight. The team wanted to verify the unconventional send time to make sure the lift held up.
CHALLENGE

Late last year, Hunter Boyle, Managing Editor, MarketingExperiments, was running tests to optimize his company’s email messaging when his team discovered an exciting result. The CTR jumped more than 21% for messages sent in the wee hours of the morning.

Boyle’s team tested a range of different send times for email invitations to the company’s bi-weekly online clinics and their research newsletter. They wondered whether sending emails earlier in the morning would give their messages more visibility by arriving before most recipients’ inboxes began filling up. (A podcast detailing that test can be found in the Useful Links section.)

His team compared open and clickthrough rates for messages sent at 9 a.m. (their standard send time), 6 a.m., and 2 a.m. Eastern Time. Both earlier send times beat the 9 a.m. standard, but messages sent at 2 a.m. got a 21.64% higher clickthrough rate – despite the 7.55% lower open rate.

The boost in CTR was exciting, but Boyle’s team had some doubts. They wondered whether the nature of the test – conducted over sequential newsletters, instead of through split testing – might reflect differences in subject lines, message copy or topic. The only way to find out for sure was through further testing.

“We wanted to make sure we ran a test out over a couple months to try to distinguish the impact of the send time from some of the different content things going on,” says Boyle. “We tried to isolate or whittle out any factors besides the time.”

CAMPAIGN

Boyle and his team shifted from sequential newsletter tests to an A/B test of their email messaging. All email content remained the same, but the team divided their messages evenly between a 2 a.m. send and a 6 a.m. send.

The team took three steps to validate the test results:

Step #1. Create standard email copy for a nine-message cycle

To strip out any potential differences caused by changing weekly subject lines and email copy during the sequential test, the team conducted the split test over the course of nine regular email communications.

They used the same email copy and subject lines for the clinic invitations, reminders, and newsletters, which were sent with this schedule:
o Thursday – Invitations to a clinic the following week
o Monday – Research brief newsletter developed from a previous clinic
o Wednesday – Day-of-clinic reminder email

Step #2. Split delivery between two times

For each of those email messages, the team divided its recipient list into two groups:
- One group received the email at 2 a.m., Eastern Time
- Another group received the email at 6 a.m., Eastern Time
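The even split described above can be sketched in a few lines. This is a minimal illustration, not the team's actual tooling; the function name, seed, and addresses are placeholders:

```python
import random

def split_for_ab_test(recipients, seed=42):
    """Randomly divide a recipient list into two equal groups,
    one for each send time (2 a.m. and 6 a.m. Eastern)."""
    shuffled = list(recipients)
    random.Random(seed).shuffle(shuffled)  # seeded shuffle so the split is repeatable
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

group_2am, group_6am = split_for_ab_test([f"user{i}@example.com" for i in range(100)])
```

Shuffling before splitting guards against ordering effects, such as a list sorted by signup date, skewing one group.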

Step #3. Monitor metrics to compare impact of send time

After each email, the team monitored two key metrics:
o Open rate
o Clickthrough rate

They averaged the open and CTRs from all nine messages to determine the impact of the different send times.
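Averaging the two metrics across the nine-message cycle might look like the sketch below. The counts are placeholders, not the campaign's actual figures:

```python
def average_rates(messages):
    """Average open rate and CTR across a cycle of sends.
    Each entry is a tuple: (opens, clicks, delivered)."""
    open_rates = [opens / delivered for opens, _, delivered in messages]
    ctrs = [clicks / delivered for _, clicks, delivered in messages]
    return sum(open_rates) / len(open_rates), sum(ctrs) / len(ctrs)

# Placeholder counts for one group's nine sends:
nine_sends = [(220, 40, 1000)] * 9
avg_open, avg_ctr = average_rates(nine_sends)  # ≈ 0.22, ≈ 0.04
```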


RESULTS


The 21.64% boost in CTR recorded during the sequential newsletter test was not sustained during the A/B test. Double-checking the lift in CTR from a 2 a.m. send time was the right decision.

The 2 a.m. send time, however, still performed slightly better than the 6 a.m. send time during the A/B test:
o Average open rate increased 3%
o Average CTR increased 1%
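The percentages above are relative lifts of one variant over the other. A quick sketch of that calculation, using made-up rates rather than the campaign's numbers:

```python
def relative_lift(variant, control):
    """Relative improvement of a variant's rate over the control's."""
    return (variant - control) / control

# Made-up rates: a 5% CTR vs. a 4% CTR is a 25% relative lift.
lift = relative_lift(0.05, 0.04)  # ≈ 0.25
```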

“If you feel like every little bit helps -- which we do -- we decided to shift all the sends over to 2 a.m.,” says Boyle.

Their decision to shift to the overnight send time wasn’t based solely on the improvement in metrics. The 2 a.m. delivery also allowed the team to reach subscribers overseas and on the West Coast during times when their inboxes were less cluttered:

- They saw an uptick in Web clinic registrations from recipients in the United Kingdom with emails delivered at 2 a.m. (7 a.m. UK time).

- They saw an increase in registrations from the West Coast with the 2 a.m. delivery. Those emails arrived at 11 p.m. PT, when many of the Web developers, designers and entrepreneurs that read the company’s newsletter or attend its clinics are still online.

“Maybe we would have gotten them at 6 a.m., but if there’s nothing to lose and we’re seeing those names coming in at that time, then there’s something there.”

Useful links related to this article:

MarketingSherpa Podcast: New Test Results for Email Send Times: 3 Major Takeaways
http://www.marketingsherpa.com/article.php?ident=30917


MarketingExperiments
http://www.marketingexperiments.com






Comments about this Case Study

Jan 15, 2009 - Jack Leblond of JackLeblond.com says:
Test, test and more test! Glad this worked out for them, but glad I'm not on their list. I get a lot of mail on my SmartPhone and *REALLY* do not like overnight mail delivery.


Jan 21, 2009 - Mike Atkinson of House of Magnets says:
Like Jack said, TEST! My test were exactly the opposite. It showed 10am PT was the best time (hits the west mid-morning and east after lunch, when their inboxes are most likely cleaned out). But that works with our specific niche audience...


