July 14, 2004
Case Study

Palo Alto Software Raises Online Sales 41.3% by Testing Counterintuitive Site Design Tweaks

SUMMARY: Do you suspect your site's conversion rate could be a lot higher, but you're not sure which tweaks, beyond the really obvious ones, will make a difference?

Or do you yearn to test dozens of different tweaks, but your site traffic is so limited (as it is for many b-to-b marketers) that it would take years to get statistically valid results with standard A/B split testing?

Discover how one marketer got around these problems by using what's called multivariable testing. Even he was surprised by the results.

Yes, includes screenshots of tests that worked, plus those that didn't:
CHALLENGE
Site tweaks can become addictive when you've seen significant results.

As we reported exclusively in a MarketingSherpa Case Study this January, by testing relentlessly, Palo Alto Software's Web team raised their site sales conversion rate from 1.01% to 2.39%.

They didn't want to stop -- but when you've already more than doubled online sales, sooner or later you hit a ceiling.

Executive Producer Pat McCarthy felt like he'd run out of obvious elements to test. "I began to wonder if I might be limited by a personal bias, assuming a variable might have no effect on conversions when it really might. What about testing counterintuitively? What about testing lots of little variables?"

However, like most b-to-b sites, Palo Alto Software gets limited traffic compared with giant consumer sites. McCarthy estimated he'd need three to four weeks' worth of traffic in an a/b/c split test* to get statistically valid results for just a single variable.

When your traffic isn't massive, and you don't know intuitively what changes may make a difference, and every page has dozens of variables to tweak, it may take you years of tests to show meaningful results.

McCarthy wanted a faster solution. How could he test a whole bunch of ideas at once and get meaningful answers quickly?

CAMPAIGN
"Instead of going through the pain of a/b, we decided to try multivariable testing," says McCarthy. This would allow him to test many ideas at the same time to fairly small test pools and get reliable results.

The math behind multivariable testing is known as the Taguchi Method (links to more info from Harvard and the American Supplier Institute below).
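If you're curious how the math plays out in practice, here's a minimal sketch (in Python, using made-up page variables and simulated conversion rates -- not Palo Alto Software's actual setup or Optimost's algorithm). Rather than serving every possible combination of variants, you serve only a fraction of the "recipes" and then estimate each variant's effect by averaging the results of every recipe it appeared in:

```python
# Minimal sketch of the idea behind multivariable testing, assuming
# hypothetical page variables and simulated conversion rates.
import itertools
import random
from statistics import mean

random.seed(1)

# Hypothetical page variables and their variants.
variables = {
    "hero_shot":   ["one_box", "two_boxes"],
    "buy_now":     ["text_link", "button_graphic"],
    "top_element": ["testimonial", "review_quote", "screenshot"],
}

# Full factorial would mean serving every combination -- 2 x 2 x 3 = 12 here,
# but tens of millions for a real page with 11 variables.
full_factorial = list(itertools.product(*variables.values()))

# Instead, serve only a fraction of the possible recipes...
recipes = random.sample(full_factorial, 6)

# ...pretend each recipe ran and produced a conversion rate...
observed = {recipe: random.uniform(0.010, 0.025) for recipe in recipes}

# ...then estimate each variant's effect by averaging the conversion rate
# of every recipe that contained it.
for position, (variable, variants) in enumerate(variables.items()):
    for variant in variants:
        rates = [rate for recipe, rate in observed.items()
                 if recipe[position] == variant]
        if rates:  # a random fraction can miss a variant; a real design won't
            print(f"{variable} = {variant}: avg conversion {mean(rates):.4f}")
```

The real Taguchi Method chooses that fraction with an orthogonal array, so every variant shows up in a balanced number of recipes; the random sample above is only there to keep the sketch short.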

First, McCarthy had to pick a particular site page to test. He decided against Palo Alto Software's home page because that's not where the majority of traffic arrived. "We picked what we call our decision page, where we give an overview of our two top products and see if people are interested in the $99.95 or the $199 product."

The page was fairly high traffic, as well as critical in the conversion-to-sale funnel because it contained "Buy Now" links among other elements. Plus, McCarthy had similarly designed pages throughout the site for other products, so he figured he could apply any key learnings to more areas later.

Next, the team chose 11 different variables to test, such as the hero shot (photo of the product) at the top of the page. They then invented 58 different tweaks for these variables. For example, a bulleted four-point list was tested in four ways, with each copy bullet trying out the top position. All in all, 41,472,000 template variations were analyzed with just a couple of months' worth of traffic....
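Where does a number like 41,472,000 come from? The count of possible templates is simply the product of each variable's variant count. As a back-of-the-envelope illustration (reading the 58 tweaks as the total number of variants tested), here is one hypothetical breakdown across 11 variables that happens to reproduce those totals -- not necessarily the team's actual per-variable counts:

```python
# Hypothetical variant counts for 11 page variables -- one breakdown that
# matches the published totals (58 tweaks, 41,472,000 combinations), not
# necessarily Palo Alto Software's actual test design.
import math

variant_counts = [8, 8, 4, 4, 4, 9, 3, 3, 5, 5, 5]

print(len(variant_counts))        # 11 variables
print(sum(variant_counts))        # 58 variants in total
print(math.prod(variant_counts))  # 41,472,000 possible page templates
```

Running 41 million head-to-head A/B splits on b-to-b traffic levels would never finish, which is exactly the problem multivariable testing sidesteps.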

More test examples:

- An underlined "buy now" text link versus a "buy now" button graphic

- The order of cross-promoted products as they appeared in a bar along the bottom of the page

- Key information displayed in chart format versus copywritten paragraphs

- More copy versus less copy

- A short testimonial versus a ZDNet review quote versus a colorful product screenshot

"I was really curious to see if any of these things would have an effect," says McCarthy. "I didn't know what to expect." By May 5, 2004, "it was blatantly obvious we had a winner," so he threw the switch and started sending all traffic to what had become his new control.

RESULTS
Visitors who landed on the page with the winning creative consistently purchased at a rate 41.3% higher than visitors who landed on the old control. That's a big sales lift from testing lots of little things.

In general, simpler, less cluttered design elements tended to perform better. For example, even though the page acted as a decision fork for shoppers to choose between two versions of a product, the hero shot of just a single product outpulled a shot of the two together.

"That was counterintuitive," says McCarthy. "It made better sense to me to put both product boxes at the top because we were talking about both. Go figure. It's tough to explain the human mind sometimes..."

Usually less copy also pulled better than more copy, except for one instance. The ZDNet review quote at the top of the page raised conversions by 16.8% compared to a colorful screenshot of the software. Now McCarthy is going through other product pages on the site, testing quotes in this critical position on the screen.

Graphics certainly pulled better than a text link for the Buy Now button (which McCarthy says he didn't expect, although we sure did).

McCarthy trusted every test result except one. "The chart did poorly. I looked at it and thought to myself, 'This isn't because people don't like charts.' My gut feeling was the chart was too busy, too crowded. I quickly designed another chart and had it tested."

Sure enough, the new simplified chart clearly won over copy in paragraphs.

McCarthy says he learned two critical lessons from the entire process...

Lesson #1: "Making my own assumptions about site design is not a good idea. Just because I think something is going to work better doesn't mean it will. Testing is the way to go."

Lesson #2: "When I showed the winning page to our President, he said, 'Well, that doesn't look as good as our original page.'

"When I designed the original page, it had a consistent look and flow. But, when all the variables are tested, you find different winning things in each area. When combined, the winning page visually looked worse as a whole."

So, now he's carefully revising that winning page hair by hair, trying to stay true to the elements that made each variable a winner, while tying them together in a smoother flow.

*Footnote:

What's an a/b/c split test? It's a clever tactic to figure out when your results are probably statistically valid -- without having a PhD in math. McCarthy splits traffic to a single page into three streams. Visitors in group "A" see the standard control page. Visitors in group "B" see the page with a single variable tweaked. Visitors in group "C" see the exact same page as "A" -- it's like using a placebo in a drug trial.

McCarthy sits back and lets the test run until groups "A" and "C" show the same results ... which they ultimately should, because they're seeing an identical page. He figures that's enough traffic to determine whether test cell "B" is a winner or a loser.
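For the mathematically unintimidated, here's a minimal sketch of that arrangement (hypothetical Python helpers, not McCarthy's actual tooling): visitors are bucketed deterministically into three streams, and the "B" result is only trusted once the two identical streams, "A" and "C", agree.

```python
# Minimal sketch of an a/b/c split, assuming hypothetical helper functions
# rather than any tool Palo Alto Software actually used.
import hashlib

def assign_stream(visitor_id: str) -> str:
    """Deterministically bucket a visitor into stream A, B, or C."""
    bucket = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16) % 3
    return "ABC"[bucket]

def looks_stable(conv_a: float, conv_c: float, tolerance: float = 0.001) -> bool:
    """Crude stopping rule: A and C see identical pages, so once their
    conversion rates agree within tolerance, the B-vs-A result can be read."""
    return abs(conv_a - conv_c) <= tolerance

print(assign_stream("visitor-12345"))  # the same visitor always lands in the same stream
print(looks_stable(0.0101, 0.0104))    # True  -> enough traffic, read the B result
print(looks_stable(0.0101, 0.0150))    # False -> keep the test running
```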

Useful links related to this article:

Screenshots of before-and-after, plus winners-and-losers:
http://www.marketingsherpa.com/pa2/ad.html

Optimost: The multivariable site testing service Palo Alto relied on for the project
http://www.optimost.com

MarketingSherpa's January Case Study: Palo Alto Software Raises its Sales Conversion Rates from 1.01 to 2.39 with Site Design Tweaks
http://www.marketingsherpa.com/barrier.cfm?currentID=2574

Harvard Business School article on experimental design related to multivariable testing with small respondent pools
http://harvardbusinessonline.hbsp.harvard.edu/b02/en/common/item_detail.jhtml?id=R0109K

American Supplier Institute explains what the Taguchi Method is:
http://www.amsup.com/taguchi_methods/index.htm

Palo Alto Software
http://www.paloaltosoftware.com/
