Five years ago, John Peebles, VP, Online Marketing, Avis Budget Group, and his team used live and manual testing to tweak their websites' designs and appearances.
"There are problems with that," Peebles says. "You have to show your changes to 100% of the people, and that means you have to take your risks upfront, which makes you reticent to try new ideas."
The vehicle rental team has since abandoned the manual approach for multivariate testing. They can test pages on a small percentage of site traffic, test multiple versions at once, and project what the results would be if applied site-wide.
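Exposing a test to only a fraction of traffic is typically done by deterministically bucketing visitors, so each visitor sees the same variant on every visit. A minimal sketch of one common approach, hash-based assignment (the function and parameter names are hypothetical, not Avis Budget's actual tooling):

```python
import hashlib

def assign_variant(visitor_id: str, variants: list, test_fraction: float = 0.10) -> str:
    """Deterministically bucket a visitor: most see the control,
    a small fraction is split evenly across the test variants."""
    # Hash the visitor ID to a stable number in [0, 1].
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF
    if bucket >= test_fraction:
        return "control"
    # Within the test fraction, split evenly across variants.
    slot = int(bucket / test_fraction * len(variants))
    return variants[min(slot, len(variants) - 1)]

# The same visitor always lands in the same bucket on repeat visits.
print(assign_variant("visitor-42", ["A", "B", "C"]))
```

Because assignment is a pure function of the visitor ID, no per-visitor state needs to be stored, and the control group stays untouched while roughly 10% of traffic sees a test version.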
"As it turns out, when you haven't done this before and you start doing it, there's a lot of low-hanging fruit out there," says Peebles.
Avis Budget quadrupled its online revenue in the last five years, due in part to the team's testing. Online revenue now accounts for more than 20% of the company's total revenue. Along the way, they learned several important lessons about how to make the most of multivariate testing.
Here are seven key tests and tactics the team used to generate more revenue through testing.

Lesson #1. Start with the homepage
For many companies, the homepage is the most visited page on the site. That traffic makes it a good place to find quick, big wins and to find out whether multivariate testing will help you.
Peebles' team tested many homepage design elements, including:
o Button size, color and design
o Newsletter registration link placement
o Taglines and headlines
"Within our first few months, we had tested about 20,000 versions of the homepage," Peebles says.
- Aim for incremental improvements
Each improvement typically lifted performance by tenths of a percent. But those tenths quickly added up to big gains. Since they started testing the homepage, the team's overall conversion rate increased "by a couple of percent at most," Peebles says. But those small percentage lifts translated to more than $9 million in revenue.

Lesson #2. You can test offline concepts
The subject of ancillary products came up during a conversation between Peebles' team and the operations team. Operations wanted to test the price elasticity of ancillary products, such as GPS navigation, but they could not test it in brick-and-mortar locations.
"We can't ask one guy in line, 'Hi, would you pay $20 for this?' and then ask the next guy in line, 'Hi, would you pay $22 for this?'"
Instead, the team tested price elasticity online by measuring how incremental price changes affected conversion rates. This revealed which prices could maximize overall revenue.
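Reading such a test comes down to simple arithmetic: expected revenue per visitor is the price times the observed conversion rate at that price. A sketch with entirely hypothetical numbers (neither these prices nor these rates come from the article), illustrating why inelastic demand favors the higher price:

```python
# Hypothetical test results: price point -> observed take rate.
# With inelastic demand, conversion barely drops as price rises,
# so revenue per visitor (price x conversion) keeps climbing.
observed = {
    20.00: 0.095,
    22.00: 0.094,
    24.00: 0.092,
}

revenue_per_visitor = {price: price * rate for price, rate in observed.items()}
best_price = max(revenue_per_visitor, key=revenue_per_visitor.get)
print(best_price, round(revenue_per_visitor[best_price], 3))  # → 24.0 2.208
```

Had demand been elastic, the take rate would have fallen fast enough at $24 to make one of the lower prices the revenue winner instead.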
What did they find? "Most of our products are highly [price] inelastic," Peebles says, so small price increases have almost no effect on demand.

Lesson #3. Tests can help navigate tricky office politics
As the teamís websites grew in importance to the company, Peebles' team often received internal requests to add or change something on a page.
"I'd say 'OK, I'll put it up there for 10% of the [visitors] and we'll see how it does.'"
If the change did not improve performance, Peebles had the metrics to prove it, and only 10% of visitors saw the change. If the change did improve performance, the team could plan how to move forward.
"It takes the politics out of it. You give people the data and they respond very well."

Lesson #4. Apply lessons to other sites
The team operates several websites under different brands, such as Avis.com, Budget.com and BudgetTruck.com. The team found that certain insights gleaned from tests on Avis.com were applicable to the other brands' websites as well.
For example, the team learned that keeping the Avis.com homepage simple with a limited number of offers improved performance. They took this lesson and applied it to Budget's site, producing similarly improved results.
However, not every test translates equally across sites. The brands have slightly different target customers.
"There are some things we're learning that are universal, regardless of the brand, and there are other areas where the brand might behave differently," Peebles says.

Lesson #5. Make testing routine
The team constantly ran tests after adopting multivariate testing. Peebles sees testing as a way to improve almost every online effort.
If his team is making changes to the website, he wants them to ask themselves:
o What should I be testing here?
o Is this an opportunity for testing to improve performance?
- Prioritize testing projects
The team could theoretically test every aspect of its online marketing. Instead, they prioritize projects by their potential impact.
For example, a team member recently suggested testing coupon merchandising. Peebles and the team liked the idea, but they had other priorities and limited time, so they held off.

Lesson #6. Worry about "why" later
Multivariate testing can provide baffling results. A button of a certain color might perform better than another, or a product might sell more at a higher price.
Still, results occur for a reason. But those reasons might be so obscure that you're better off worrying about your next test. Trust your numbers and keep incrementally improving performance.
"We try not to dwell on the 'why.' We focus on the 'what' and worry about execution, because otherwise you'll drive yourself crazy," Peebles says.
- "Why" might help segmentation
Understanding why your customers made certain decisions might help you improve segmentation, even if it's not important to your tests.
For example, if you found visitors clicked an image more often than other tested images, the linked content could tell you more about your customers' preferences. This is an area Peebles' team is considering pursuing in the future.

Lesson #7. Test the messaging of your offers
Testing the wording, presentation and other elements of online offers comprised about 20% to 30% of all the team's multivariate testing, Peebles says.
For example, the team tested several different versions of a Web page where customers added products and services to their rental reservation. They wanted to learn how to increase the number of customers who added GPS navigation without hurting reservation rates.
The team tested four different creative versions and showed them to a limited number of customers. Here's what they found:
Version: GPS take rate / reservation rate
o Original: 8.8% / 21.45%
o Version A: 10% / 21.2%
o Version B: 9.75% / 21.4%
o Version C: 9.4% / 21.65%
o Version D: 9.3% / 21.7%
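Picking a winner from two rates means weighting each by its dollar value. A hedged sketch of that comparison, with the GPS fee and average reservation value assumed (the article gives neither figure, and it assumes the take rate is measured among customers who reserve):

```python
# Hypothetical revenue assumptions -- NOT figures from the article.
GPS_FEE = 12.00            # per-reservation GPS add-on revenue (assumed)
RESERVATION_VALUE = 65.00  # average revenue per completed reservation (assumed)

# Rates from the test: (GPS take rate, reservation rate).
versions = {
    "Original": (0.088, 0.2145),
    "A": (0.100, 0.2120),
    "B": (0.0975, 0.2140),
    "C": (0.094, 0.2165),
    "D": (0.093, 0.2170),
}

def revenue_per_visitor(take_rate, reservation_rate):
    # Only customers who reserve can take the GPS add-on.
    return reservation_rate * (RESERVATION_VALUE + take_rate * GPS_FEE)

ranked = sorted(versions, key=lambda v: revenue_per_visitor(*versions[v]), reverse=True)
print(ranked[0])
```

Which version wins depends on the real dollar values: a high-margin add-on shifts the balance toward take rate, while a high reservation value rewards the versions with the best reservation rate.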
As you can see, versions C and D lifted both the GPS take rate and the overall reservation rate. The team selected the page that generated the most revenue and added it to the site (see creative samples below).

Useful links related to this article
Creative Samples from Avis Budget Group's multivariate testing:
How Multivariate Test on Banner Ads Boosted Clicks 50.4%
Before & After: 4 Steps to Identify Best Redesign with Multivariate Tests & Lift Lead Gen 85.3%
Autonomy: Provided the team's multivariate testing tool
Avis Budget Group