October 03, 2006
Case Study

Can Multivariate Tests Reduce Your Shopping Cart Abandons? Real-Life Results ...

SUMMARY: According to MarketingSherpa data, the average ecommerce shopping cart has a 59.8% abandonment rate. (Can you imagine a retail store line with 60% of filled carts standing there abandoned by shoppers?)

Discover the practical cart design lessons one marketer learned from multivariate tests conducted this year. Turns out seemingly insignificant cart design factors can raise conversions. Which ones exactly? Click below to find out.

Yes, includes creative samples from the test:
Jeff Booth, CEO of BuildDirect™, describes the ecommerce site as being akin to a Sam's Club or Costco -- discounted bulk buying -- except instead of stocking up on groceries, his shoppers buy building supplies.

Over seven years, Booth's team had repeatedly upgraded the site's cart using A/B tests and best practices, including:

o Displaying shipping costs more prominently on merchandise pages so there are no surprises in the cart.

o Replacing the old multiple-page checkout with a single-page form.

"Honestly, we thought we had optimized it quite well," Booth laughs. But still, "I'm a CEO trying to drive improvement as hard as I can." When he heard about multivariate testing (which MarketingSherpa calls "A/B testing on steroids"), he wanted to test it on one page of the site.

Many marketers focus multivariate testing on the front end of the sales funnel -- the landing page that visitors first arrive at. Booth decided to do the opposite -- test the absolute final page.

It would take longer than a more traditional multivariate test. You need roughly 1,000 or more conversions to get statistically valid results. On a home page, a conversion could be as cheap as a click on any free link. In the cart, it had to be an actual purchase.
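That 1,000-conversion rule of thumb can be sanity-checked with a standard two-proportion z-test. A minimal sketch -- the visitor and conversion counts below are hypothetical illustrations, not BuildDirect's numbers:

```python
import math

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test statistic comparing conversion rates
    between a control arm (a) and a test arm (b)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# ~1,000 conversions per arm: a 10% relative lift clears the
# conventional 1.96 threshold (95% confidence).
print(z_score(1000, 50000, 1100, 50000))  # ≈ 2.2 -> significant

# Only ~100 conversions per arm: the same 10% lift is
# indistinguishable from noise.
print(z_score(100, 5000, 110, 5000))      # ≈ 0.7 -> not significant
```

This is why testing the cart page is slower than testing a home page: each conversion is a completed order, so accumulating enough of them takes far longer.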

"I have a frustration for things that take too long," Booth admits. But, if sales rose even 5%, at an average order size of $4,000, the bottom line would be very happy.

BuildDirect's marketing team, headed by Rob Woods, reviewed their single-page checkout to identify elements to test. (See link to creative samples below.) Key tests included:

#1. Page headline

Does a cart even need a headline? If so, should it be:
o Factual, "Checkout"

o Benefit-oriented, "Easy Checkout" or "Checkout in 60 seconds"

o Instructional, "Checkout is easy. Just fill out the form below."

#2. Form fields

Should the spaces where shoppers type in their shipping and billing information be classic blank white? What about the trendy new lemon yellow color that some sites use? It might help the forms stand out on a white background page …

#3. Delivery info

By the time the customer got to the cart page, they would have already seen delivery pricing and time. However, it always helps to remind them. The question was, how prominent should that estimated delivery time info be? Should it stand out in bold or was regular type OK?

#4. Price display

Could a quick copy tweak improve cart results? The team tested several wording changes, including "Grand total" and "BuildDirect Discounted Price".

#5. Final checkout button

Buttons and click links are always worth testing because the impact from a small change can be significant. The team tried half a dozen button versions, including:

o Click here to order
o You're done! - Click here
o Order now
o Order (very large)
o gray vs. green vs. blue background with white lettering

The winning page directly increased online sales by 10.6%. (Phone-in orders to the toll-free number also increased at a slightly lower rate, but that data is harder to correlate to specific changes.)

Key results showed that even if you, the marketer or designer, think something is incredibly "duh" obvious, it isn't always so:

o The instructional headline -- "Checkout is easy. Just fill out the form below" -- won individually, but it was not on the creative that performed best as a whole. That went to the factual "Checkout."

o Yellow background form fields significantly improved results over white background boxes.

o Bolding delivery time depressed results -- you need to keep the shopper's eye on the to-do ball rather than distracting them with non-instructional information. That said, the delivery information in regular type helped performance. So it needs to be there, just not so vibrantly.

o Benefit-oriented pricing won -- "Grand Total" and "BuildDirect Discounted Price" both did well, although "Grand Total" fared better as part of the whole.

o The "Click here to order" button beat all competitors -- in blue. Gray may have been too boring, and green was definitely not a winner. (We suspect green might remind you of all the cash you're spending, while blue reminds you of how trustworthy the site is.)

"Now I want every page on the site tested and I want it tested yesterday!" says Booth. "If something works, do more of what works. I'm always pushing, pushing, pushing." So, now his marketers are multivariate testing their way up the sales pipeline.

Repeated testing has benefits -- the first test took about 30 days to launch due to the complexity of the cart and the learning curve of Booth's in-house team. These days they can get a new test live within seven to 10 days.

Useful links related to this article:

Creative samples from BuildDirect's tests:

Optimost - the optimization firm that BuildDirect relies on to develop, run and analyze its multivariate tests

