January 27, 2015
Case Study

Email Marketing: Cart abandonment campaign sees a 400% revenue boost over previous campaign

SUMMARY: Lifestyle and furniture online brand Dot & Bo switched email service providers (ESPs) and began building a closer relationship between the marketing team and its testing efforts. The team started by tweaking an existing cart abandonment campaign.

See how the team was able to turn a focus on testing into a 400% revenue boost within that effort.
by Courtney Eckerle, Manager of Editorial Content


Dot & Bo is a modern furniture and lifestyle website that focuses on offering its customers inspiration through a curated collection of home décor centered around a theme.

"The theme can be design influence or regions of the country, a trend, a holiday … it helps [customers] discover unique items that they wouldn’t find anywhere else," said Allyson Campa, Vice President of Marketing, Dot & Bo.


Campa works on a team of nine at Dot & Bo and has been in Silicon Valley, "mostly at start-ups and other consumer software companies, focused on generally marketing roles. I love the intersection of science and art. Particularly the science of consumer marketing is very data heavy where you can understand user behavior and what motivates people to engage with your service," she said.

With that philosophy in mind, the marketing team changed email service providers to break down barriers between the company and its customers. The team wanted more control over, and a better understanding of, what was and wasn't working in testing.


With this new ESP, the marketing team implemented a new cart abandonment campaign and A/B testing.

Step #1. Set up testing procedures

Dot & Bo has a number of trigger campaigns based on where the customer is in their life cycle and purchase cycle, and it sends an email every day to a broad group of users. After moving the cart abandonment campaign over to the new ESP, the team gained the ability to A/B test and to run a multi-step workflow.

"We’re still optimizing it, but we started out with typically featuring one item that was in a customer's cart and reminding them of it, which is a fairly standard industry best practice," Campa said.

From there, the team was able to "improve it by pulling in all of the items in this customer's shopping cart, so multiple items have some logic around trying to get them to return to this order," she added.

Now that the team can "fairly easily and flexibly set up tests," they are testing timing of offers and how often emails go out. For each of these tests, she added, the goal is to learn something that "helps us optimize it for the next one."

Since the team has greater control with the new ESP, they are "essentially using it on our own," so the workflow has changed to be more internal and independent than previously.

The workflow now includes creative development of the tested emails as well as deciding how customers will be segmented. "We're going to do some analytics around that to understand the size of the group that we're going to go after, the criteria that we want it to select," she said.

Step #2. Set up abandon cart emails

The first abandoned cart email is currently sent two hours after the abandonment, and a second email follows 24 hours later.

If a customer clicks through the first email and completes the purchase, they are automatically taken out of the second send.
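The send-and-suppress logic described above can be sketched as a simple scheduling check. This is an illustrative model only, not Dot & Bo's actual ESP configuration; the function name, delays as constants, and in-memory data model are invented for the example:

```python
from datetime import datetime, timedelta

# Delays described in the article: a reminder two hours after
# abandonment, then an offer email 24 hours after abandonment.
FIRST_DELAY = timedelta(hours=2)
SECOND_DELAY = timedelta(hours=24)

def emails_due(abandoned_at: datetime, now: datetime, has_purchased: bool) -> list:
    """Return which abandonment emails should have been sent by `now`.

    The second send (the discount offer) is suppressed once the
    customer completes the purchase, mirroring the workflow above.
    """
    due = []
    if now >= abandoned_at + FIRST_DELAY:
        due.append("reminder")  # "It's still available"
    if not has_purchased and now >= abandoned_at + SECOND_DELAY:
        due.append("offer")     # 10% off, expiring offer
    return due
```

In a real ESP this logic lives in the workflow builder rather than in code, but the same two decisions apply: elapsed time since abandonment, and whether the purchase has already been completed.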

The first email reads, "It’s still available" and prominently features the product that was left behind. Customers can then either view the item or shop other sales. The second email offers customers 10% off their order, but the offer expires.


Each email includes a paragraph encouraging customers to reach out to a help line or customer service email address if they need assistance ordering or have any questions.

The team was essentially making two changes in this effort, and "one is showing more items to a user, and the other is an offer associated with it. We saw … it was close to [a] $4,000 increase in our gross margins," she said.

Build customer concerns into tests

Timing is one of the things the team is currently testing, especially the idea that customer timing might be different based on product price point. Dot & Bo sells furniture and home décor, which has a wide range of prices, Campa explained.

"I haven't completed those tests … But you might imagine if it's a bigger purchase, or considered [a bigger] purchase, a two-hour reminder may be too soon because, essentially, they're considering this and running it by a family member or housemate and thinking about their budgets," she said.

Step #3. Set up a procedure to analyze data

"Ideas have come in from a lot of places, and so we can test that with pretty big, broad tests with a lot of people," Campa said, adding that even though they can intuit what will win, the team wants to collect as much data as possible to learn from the results.

The team has a weekly meeting to share testing results, and "we run everything through statistical significance calculators, and sometimes look at the actual raw data," Campa said.

The team is careful not to make changes, she added, unless the results are statistically significant, and they often validate results by continuing the campaign in a bigger test sample or re-running it.
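One common way to "run everything through statistical significance calculators," as the team describes, is a two-proportion z-test comparing conversion rates between an A/B test's variants. The function below is a generic sketch using only the standard library; it is not the team's actual tooling, and the example numbers are invented:

```python
import math

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference in conversion rates.

    conv_a/n_a: conversions and recipients for variant A.
    conv_b/n_b: conversions and recipients for variant B.
    Returns (z, p_value); a result is conventionally called
    statistically significant when p_value < 0.05.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value
```

For example, 100 conversions from 1,000 sends versus 150 from 1,000 yields a p-value well under 0.05, while 100 versus 102 does not, which is exactly why the team waits for significance (or re-runs with a bigger sample) before declaring a winner.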

"And things do change over time. Our user base is growing," she said, adding something that worked a year ago might not hold true today.

Email is a very important channel for Dot & Bo, Campa said, and "we're excited that … we can be very analytical in how we optimize it over time. And we know that each email is going out to users that are of very different mindsets as far as their time with the company, where they are in their buying cycle, types of items they're looking at [and] who they are," she said.


This campaign and the team’s testing efforts show "the potential of doing that, and we are continuing to work toward a solution where we'll send out offers to, effectively, just the customers who need them," according to Campa.

From the cart abandonment campaign, the team was able to see a 400% increase in revenue over the previous version.

"We were able to see a significant increase in revenue through optimizing the campaign, and we expect more to come. And then, separately, we're trying to figure out what's the right content and offers to put in front of which customers to optimize the experience," Campa said.

Creative Samples

  1. First cart abandonment email

  2. Second cart abandonment email


Source: Dot & Bo


Related Resources

Email Marketing: How content and testing boosted revenue 114% at IAC subsidiary HomeAdvisor

Email Marketing: Education group utilizes A/B testing to increase open rates by 39%

Email Marketing: First-time cart abandonment campaign drives a conversion rate 1,858% higher than weekly send

Email Marketing: Browse abandonment campaign drives more than $300,000 in additional sales for menswear site

Email Marketing: Educating new subscribers drives 33% of total email revenue in welcome campaign
