January 13, 2015
Case Study

Email Marketing: Education group utilizes A/B testing to increase open rates by 39%

SUMMARY: Held back by a labor-intensive email service provider, the marketing team at Apollo Education Group, which provides online education to people around the world, was extremely limited in its testing capabilities.

After switching ESPs, the team was able to launch A/B tests to learn more about their customers and increase open rates by 39%.
by Courtney Eckerle, Manager of Editorial Content


Founded in 1973, Apollo Education Group is the parent company of online schools and universities, including the University of Phoenix.

"From my perspective, the University of Phoenix is our biggest school, but we also have them worldwide — Mexico, South Africa, Australia, England. The goal is to bring education to people everywhere. It's just a lot more flexible," said Mae Umbriac, Director of Email Marketing, Apollo Education Group.

Umbriac works for the University of Phoenix, emailing people who are already in the database. She runs communications for students and alumni about training programs and other news.


CHALLENGE

Apollo Education Group was facing a lot of internal issues with its then-current email service provider (ESP) because the team did not have the proper IT resources to run programs. That ESP was a "really IT-heavy platform. You need resources to set up programs. It wasn't something that our operations team could do very easily," Umbriac said.

One-off email campaigns — announcing open houses or new programs, for example — went out without issue.

"One-off emails were super easy, but setting up nurturing programs was pretty hard, decision trees and … we needed to come up with something that was more robust, that our team could actually dig in and do the management," she said.

If the team needed to change a template or if there were any changes to add to emails, they had to reach out to IT resources to implement their request. From there, the team had to make sure that everything launched correctly, was taken down correctly and that a winner was decided.

"It was a bit exhausting to have to go through that. It got to the point where it was like, well, you don't have the resources to do that anymore … It wasn't something that I could just do. So we went through the [request for proposal] process," Umbriac said. The team decided to move forward with a new ESP.

Deciding to move to a more intuitive ESP was simple, Umbriac added, but the process was not. "There was a lot of data that had to be moved and swapped," she said.

Going through Apollo Group's legacy systems and proprietary databases was a "painful and difficult process in trying to get our data clean to move it over … These systems have been up since the '70s, so trying to get all of our data correct and appropriate and in the right format was a pretty big challenge."

About halfway through this changeover process, Apollo Group experienced changes in upper management and "we lost our backing. So we're kind of in the middle of this project, and all of a sudden, we're like, 'OK, now we need this done,'" she said.

Because of the changes, the project was approved only through the current phase, Umbriac said. "All of a sudden, it's, 'You don't have any more resources to do this' … Then moving forward, the next phase needs to go through the approval process, so that was a bit of a challenge."

Going through the approval process again for the future phases was "from our team's perspective … a really challenging and difficult process," she said, adding that the project ended up continuing relatively unscathed, and the data was fully migrated onto the new ESP.

"No platform is perfect, so we do have a whole new set of challenging issues with [the new ESP]. But from my perspective, it's a system I can log into. I can see what we're sending," she said, adding that one of the biggest changes has been the ability to do A/B testing, for which they've created their own database for tracking.


CAMPAIGN

In the current ESP, because Umbriac and her team of seven have access to the email templates and don't have to use IT resources, "I can just go in and edit them myself and upload things. We can do A/B testing really easily, so it's pretty exciting to do all the reporting on that," she said.

Month by month, the team at Apollo Group has been building up their testing and tweaking their program with new capabilities.

"It's pretty exciting, actually, to be able to get more into it than I think that we have been in the past," she said.

Step #1. Open communication to present clear objectives

"The way it happened was our team, we were constantly facing issues," Umbriac said.

One of the most consistently positive changes has been working together as a team to identify issues, whether with the old ESP or otherwise. In regular meetings, the team now brings up what isn't working and finds a way to address it.

"Moving forward with that, we brought it up to upper level management," she said. From further open communication, the team was able to work with upper management to create business requirements and objectives.

By being clear with needs and objectives, the team was able to get upper management on their side and work together.

"I would say it was our lead for managing up, and then they took over and pushed it through," she said.

Step #2. Explore testing possibilities

"A lot of the pieces were determining a data dictionary, in terms of what all of these pieces mean and then how we want them to display," Umbriac said, adding that the process was very similar to data mapping.

For example, if the team wanted to send an email congratulating someone on completing their first class, they could verify that the student had passed the class, was financially cleared and that the email referred to the student's correct program.

The team needed to decide "how we want things to display and how we want things to make sense in the new system, how they're going to pull, how we're going to get lists, the ETL development of it," she said.

In October 2014, the team was fully operational with the new ESP and began A/B testing in November. A lot of the capability they had for testing was new to them.

"The flexibility to do as much as we wanted was really cool. We started running testing programs, and some of them started off basic, like subject line testing or 'from' line testing. Surprisingly, 'from' line testing made a really big difference in our engagement rate," she said.

This stage is where the biggest differences were made in their program because the team was suddenly able to test everything they were curious about, answering small, but crucial questions about their customers.

"Testing images, testing content, testing time of delivery — so do we send on day three, do we send on day 12, do we send on both? Working through [all of] that. Using names in headers or in subject lines, what kind of difference does that make? Adding more links, taking more links out, [or] just making it a simple message," Umbriac said of the testing they ventured into.
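The case study doesn't describe how the team judged whether a variant "made a really big difference," but a standard way to evaluate a split like the subject-line or 'from'-line tests above is a two-proportion z-test on open counts. A minimal sketch in Python — the send and open counts below are hypothetical, not Apollo's data:

```python
import math

def ab_open_rate_test(opens_a, sends_a, opens_b, sends_b):
    """Two-proportion z-test: is variant B's open rate
    significantly different from variant A's?"""
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    # Pooled proportion under the null hypothesis (no difference)
    p = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(p * (1 - p) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return p_a, p_b, z, p_value

# Hypothetical 'from'-line test across two random halves of a send
p_a, p_b, z, p_val = ab_open_rate_test(1800, 10000, 2100, 10000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p_val:.6f}")
```

A low p-value (conventionally below 0.05) suggests the difference is unlikely to be random noise; a borderline result is a good candidate for the retesting the team describes later.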

Currently, she and her team feel like, "we can do almost anything. Obviously, we have a lot of legal limitations, so everything needs to be legal and brand approved, but I don't think that's unique to us. We can be pretty flexible with the way we want to look at a lot of things," she said.

They have also tested aspects like moving the unsubscribe link from the top to the bottom of the email, a simple change that in itself sprouts many other questions, such as "Do our [complaint] rates go up? If we put it at the bottom, will people complain?"

Creative Sample: Unsubscribe test email

Establish a regular review meeting

Umbriac and her team meet monthly to review the email program and decide what tests and sends they're going to launch. They also review what was sent the previous month and the subsequent results.

"We review the results and what we think we can improve moving forward. Obviously, we pretty much just do email, so over the course of a month, we see a lot of things that can be changed as well," she said.

Issues that are regularly discussed are deliverability, engagement and asking in-depth questions about testing. By discussing what went right or wrong in a test, they can proceed thoughtfully with the next month.

"'This email's going to do great, or this one did really well. What did we do in this that worked really well?' I try to keep a running log of that, and then when it's time to have our meetings, talk about it," she said.

Step #3. Keep testing data organized

Now, Umbriac added, "I can do it. I can just create a separate branch. I can end responses. It's pretty easy. It's an allocation switch, and then I just create it. I upload a new campaign, add a new template in and then that's ready to go. I can run it today just for a day. I can run it for three months. I just go in and enter it in like 30 seconds."

The team keeps a file where they log all of the testing information, which is a vital part of building up their testing process, she said.

"It's not like you can query it in the system, so we keep a testing log, a list of campaigns. It's a lot of tracking. So, while the process of implementing is very simple, the process of tracking is very in depth, but it helps," she said.
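The article doesn't show the team's actual log format, but a running testing log like the one Umbriac describes can be as simple as an append-only CSV file. A sketch, with hypothetical field names:

```python
import csv
import io

# Hypothetical columns for a manual A/B testing log; the article
# doesn't specify the team's actual fields.
FIELDS = ["date", "campaign", "element_tested", "variant_a",
          "variant_b", "kpi", "winner", "notes"]

def append_test(log_file, row):
    """Append one test record to the running log, writing a
    header row first if the file is new/empty."""
    write_header = log_file.tell() == 0
    writer = csv.DictWriter(log_file, fieldnames=FIELDS)
    if write_header:
        writer.writeheader()
    writer.writerow(row)

# Usage with an in-memory file for illustration
log = io.StringIO()
append_test(log, {
    "date": "2014-11-12", "campaign": "Birthday email",
    "element_tested": "preheader", "variant_a": "generic",
    "variant_b": "first name", "kpi": "open rate",
    "winner": "B", "notes": "retest in two months",
})
print(log.getvalue())
```

Because the log lives outside the ESP, it can be filtered and sorted freely — which is exactly the kind of querying Umbriac notes the platform itself doesn't support.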

People in the organization "ask about all kinds of crazy things, so a lot of it is just management of what the KPIs are that we're testing," she said.

The team tries to test different objectives within each month and "get a little bit of everything in each month," she said, focusing on mixing open rates and clickthrough rates, for example.

To do this, Umbriac said the team schedules out testing for up to two months ahead of time because "I found the biggest thing in testing that really works is the timing of the emails. It's really timing that makes the biggest difference."

Verify results by retesting

Umbriac said that her team "absolutely" re-tests to verify results that they see — another benefit of the ease of setting up A/B tests.

She gave the example of a birthday email where they tested putting the subscriber's name in the preheader, which didn't test very well initially.

"It didn't do very well, which I thought [was] kind of strange because of the way you view things on your phone, and the way Yahoo mail displays, and actually Gmail too, you only see the little preheader. I tested it again, and it did amazingly well in September," she said.

Now, when the team sees a result come back that is surprising or opens up new questions, they will test it again.

"I'll wait a couple of months … and you just break it and see what happens. You learn," she said.

Emails like the birthday email are perfect to play with, she said, because they don't carry major business implications.

"The birthday email is kind of like a goodwill, happy email, so there's a lot that we can do with it that is not offensive … We can come up with some silly images or some fun things," she said.


RESULTS

"Looking back, it was crazy. We had to create a project with IT if we wanted to launch an A/B test. So if we wanted to have a version of the email, or a confirmation email with a picture with a red background or a blue background, we actually had to file a project and wait until we got a resource to create a separate branch and then test it," Umbriac said.

Umbriac's overall takeaway, from where the team was to where they are today with testing, is to consistently take advantage of ESP capabilities by experimenting with your sends because "you always learn something."

The results Apollo Group has been able to achieve are:
  • A 39% overall increase in open rates through 'from' line testing

  • A 58% overall increase in clickthrough rates through email template testing

  • A 9% increase in open rates by adding a name in the preheader
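These figures are relative lifts — the change in a rate divided by its baseline. A quick sketch of the arithmetic (the baseline open rate here is hypothetical, not Apollo's):

```python
def relative_lift(before, after):
    """Relative lift of a rate: (after - before) / before."""
    return (after - before) / before

# Hypothetical example: an open rate moving from 20% to 27.8%
# is a 39% relative increase, even though the absolute gain
# is only 7.8 percentage points.
print(f"{relative_lift(0.20, 0.278):+.0%}")  # +39%
```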

From these efforts and testing — even when results aren't positive — Umbriac said she has learned to ask, "'How can I apply that in a different way to a different email or a different program? What else would people want to see?'"

The most important aspect has been process management, she added, as well as having fun with the process.

"You never know how people are going to respond," she said.

Creative Samples

  1. Unsubscribe test email


Sources

Apollo Education Group

Responsys — Email Service Provider

Related Resources

Email Marketing 2014: The top 8 MarketingSherpa case studies for your email program next year

Email Marketing 2013: The top 10 MarketingSherpa articles for your New Year's marketing resolutions

Community Marketing: 1 million Instagram impressions via creative design contest

Email Marketing: Persona-based email campaign drives 7% conversion rate with targeted content

Email Marketing: Clothing retailer lifts average open rate 40% via customer segmentation campaign
