February 02, 2007
Shopper-generated product reviews are one of the few forms of user-generated content that are proven on site after site to increase conversions.
However, does this carry through if you add reviews to email blasts? As MarketingSherpa's new Case Study on PETCO shows, it's worth a test:
It's not that PETCO doesn't get great email results… it's that, like all great marketers, VP of Ecommerce John Lazarchic is never satisfied with status quo clicks and conversions.
The site routinely sent its house file all the standard types of promotions you've come to expect from ecommerce email -- special discounts, limited-time sales, free shipping offers, etc. The problem, of course, is that after a decade these offers have become very, very routine.
Lazarchic wondered if, aside from its cheerful logo, PETCO's email offers might seem too much like every other pet supply store's emails. That's why this direct response-driven professional found himself wondering how "to get people more involved with the brand and see if that'd get more clicks to the site. We were also looking to use email a bit more to create a higher level of trust with the brand."
What's the best way to add that trusted brand icing to your email creative cake?
Back in early 2005, Lazarchic's team had launched a user-generated content initiative -- getting customers to add product ratings and reviews to product pages. In order to keep trust levels high (and potential lawsuits low), the team decided to allow negative as well as positive reviews.
“We always knew that for the program to have any validity, we’d have to be willing to put up negative reviews,” Lazarchic explains. “The negative ones seem to ground the customers' expectations of products. I think it helps customers buy the better product, too. This is good because it can cut down on the amount of returned products you see from unsatisfied customers. It’s less expensive for us if they just don’t buy it in the first place.”
As other ecommerce sites MarketingSherpa has profiled have discovered, adding reviews helped conversions substantially for PETCO:
- Top-rated products were converting at a 49% higher clip.
- Shoppers using the ratings section of the site for navigation spent 63% more than shoppers using other navigation column hotlinks.
- Shoppers who both read reviews and shopped via ratings navigational hotlinks had an average order size 40% higher than the average shopper.
Now Lazarchic's team wondered if they could see a similar bump in conversions if they added review content to their email templates.
Step #1. Test basic design elements
The team planned a big A/B test to determine whether reviews helped clicks and conversions or not. However, prior to the big A/B test, they ran a series of smaller tests just to figure out which sort of review content and design style worked best on its own. That way, they'd put their best foot forward with pre-tested optimum creative for the big A/B test.
Elements tested in smaller cells (typically a list size that would generate at least 100 responses) included wording nuances, such as "Most Popular Products" vs. "Top-rated Products".
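The article doesn't say how PETCO sized those smaller cells. As a back-of-envelope sketch (the 2% clickthrough rate below is purely a hypothetical assumption, not a PETCO figure), you can work backward from a target response count to a minimum send size:

```python
import math

def min_cell_size(target_responses: int, expected_response_rate: float) -> int:
    """Smallest send size expected to yield the target number of responses."""
    return math.ceil(target_responses / expected_response_rate)

# Hypothetical: at an assumed 2% clickthrough rate, a 5,000-name cell
# should generate the ~100 responses mentioned above.
print(min_cell_size(100, 0.02))  # -> 5000
```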
Step #2. Run the big A/B test
Next, the team ran six A/B tests -- one set of tests (with and without review content) for each of their main demographics: people who love dogs, people who love cats, etc. Note: We like the fact that PETCO didn't just assume test results for one demographic could be rolled out as-is to all the others.
Each of the six A/B tests featured the same subject line, products, and body copy except, of course, for the creative related to the product reviews. (See below for an example of the A/B test to dog lovers.)
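The Case Study doesn't describe how PETCO judged whether a clickthrough difference was real rather than noise. One standard way to do that -- a sketch, not PETCO's stated method, and all the numbers below are hypothetical -- is a two-proportion z-test comparing the clickthrough rates of the two cells:

```python
from math import sqrt

def two_proportion_z(clicks_a: int, sends_a: int,
                     clicks_b: int, sends_b: int):
    """Two-proportion z-test for clickthrough rates of cells A and B."""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    # Pooled rate under the null hypothesis that both cells perform equally
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical cells of 10,000 sends each: 300 clicks without reviews
# vs. 900 with reviews (a 200% lift, like the one reported below).
p_a, p_b, z = two_proportion_z(300, 10_000, 900, 10_000)
print(f"CTR without reviews: {p_a:.1%}, with reviews: {p_b:.1%}, z = {z:.1f}")
```

A z value above roughly 2 corresponds to significance at the conventional 95% level; a lift this large on cells this size would be unambiguous.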
Step #3. Roll out based on results
Finally, the team adjusted creative templates for their monthly newsletter, as well as for special offer blasts to reflect the winning creative.
Lazarchic and his team were delighted to discover that clickthroughs on the email creative with product reviews were at least 200% higher than those without. That's one of the highest lifts we've heard of from a creative test to a house list in years -- most tests are far more incremental than this.
Also, when email recipients came to the site, they often purchased products that weren't specifically promoted in the email itself. “The interesting thing to me is that, unlike categories, such as dog toys or cat beds, where people are going to go through several pages of products while they comparison shop for very specific items, people who go through the top-rated channels are reading the reviews and ratings and buying things that they didn’t plan to buy. It’s similar to a consumer merchandized cross-sell program.”
Why did the test work so well? “Pet owners, like a lot of consumers, tend to trust each other a lot more than anything your company can say about the products,” Lazarchic explains. “So, it’s great to be able to take their posts and put them in email and leverage them. I think using positive consumer feedback makes sense for almost every channel.”
Related links for this story
Creative samples from the A/B test and current PETCO email campaign:
Responsys – the email services provider for Petco:
BazaarVoice – hosted application enabling Petco’s ratings and reviews: