by Allison Banko
VacationRoost is a key destination for those whisked away by wanderlust. The vacation rental company offers more than 100,000 lodging options in more than 80 destinations, ranging from chalets in Chile to mountainside cabins in Colorado. These rentals are spread across the 25 brands and websites under the VacationRoost umbrella, including HawaiianBeachRentals.com and MexicanDestinations.com.
"It's a marketer's dream," said Ryan Hutchings, Director of Marketing, VacationRoost. "It allows us to have different websites that share a platform. We can test a single thing across 25 different websites, aggregate the results and see what the impact is."
From his past experience in testing, Hutchings said the best way to see ROI gains was to build on the foundation of a good testing program, something VacationRoost lacked.
The company was treating testing as a series of one-off projects, sporadically working on single landing pages. A project would come through the marketing department, where the team would create a test, run it with a third-party testing tool and launch it.
"A single landing page could be a project that an employee works on," Hutchings explained. "It's saying, 'I'm going to pick this page because it's got a high volume of traffic or the bounce rate’s really bad and there are a lot of people going to it. So, as a project, let's fix that page.'"
But that's the issue: after launching the test, it's over. VacationRoost wasn't using testing to achieve consistent lifts over time. Additionally, because tests were treated as one-off projects, other tasks took precedence within Marketing, causing testing to fall by the wayside.
"When you look at [testing] just as a project-by-project basis, you're always butting up against other priorities that the company may have," Hutchings said. "Therefore, spending any given time for any employee or money or efforts or resources gets diminished."
Hutchings' challenge was to figure out how to go from launching random tests to integrating testing into the everyday processes of VacationRoost's marketing department.
VacationRoost developed two testing methodologies, one for small tests and one for large tests, formulating a strategy that is now woven into the company's overall marketing strategy.
The team defines small tests as single landing page tests that target items at the top of the funnel. These tests involve unique pages that aren't necessarily shared across all of the VacationRoost brands.
Large tests, by contrast, are those items shared across the brands. They include vital pieces of the websites such as search, product, billing and checkout pages: the dynamic pieces that make up the bottom of the funnel and directly impact conversion.
Step #1. Identify the need to establish internal testing
It is important to differentiate the approaches for small tests and large tests due to the capacities of the relevant testing platforms. Small tests can be performed on a third-party tool, but because large tests are more complex, that smaller platform can't execute them efficiently.
"It's hard with third-party tools because you'll have different versions of the page with dynamic parameters based off of what somebody's chosen, filtered or checking out with, and every user's different," Hutchings explained. "There's no easy way with a small, simple, $20-a-month tool to test all of the checkout pages with two versions on my site. You actually have to have a larger testing platform to do that."
For the large tests, Hutchings wanted to build an internal A/B testing platform rather than invest in a multi-thousand-dollar external platform. He explained that investing in something so expensive externally puts pressure on everyone internally to make those tests perform, which he thinks skews the testing process.
"There's so much pressure to justify costs versus upfront, [developing] an internal tool, that doesn't have any long-term, continuous, true cash financial implications on a month-to-month basis that I have to justify," he said. "It's just our time and effort, on the marketing side, that is the only determining factor of whether I spend more or less time on something and it's not going to get axed from the budget one day."
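The article doesn't describe how the internal platform assigns visitors to variants, but a common building block for a homegrown A/B platform is deterministic, hash-based bucketing. Here is a minimal sketch; the function and test names are illustrative, not VacationRoost's actual code:

```python
import hashlib

def assign_variant(user_id: str, test_name: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically bucket a user into a test variant.

    Hashing the user ID together with the test name keeps each user's
    assignment stable across page loads and independent across tests.
    """
    digest = hashlib.sha256(f"{test_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same user always lands in the same bucket for a given test.
assert assign_variant("user-42", "checkout-v2") == assign_variant("user-42", "checkout-v2")
```

Because the assignment is a pure function of the user and test IDs, no server-side state is needed to keep a visitor in the same variant for the life of the test.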
Welcoming testing within the company would ensure it wouldn't be viewed as a series of one-off projects. Making it a responsibility within the marketing department would change testing from random, low-priority efforts to organized, high-priority work, because it would now be part of the job description.
"[When] there are goals and bonuses based off [testing], all of a sudden, the prioritization seems to change and it doesn't become random," Hutchings said. "You don't do two or three a year. It becomes every week, I have tests running because somebody is responsible for that. And as an organization, we're banking on results."
To develop this internal support, Hutchings needed internal development resources and money to engage a partner, both of which required company buy-in. However, he wouldn't need company buy-in to perform the small tests, making them a perfect starting point for building a case for the company to invest further in testing.
Step #2. Start small to garner buy-in
Conducting small tests called for an inexpensive third-party tool that didn't force the team through the approval process. This way, the team could run tests on landing pages through a platform costing $30 per month and see some significant gains. These smaller tests would be a good beginning for a larger case for significant investment and resources, Hutchings said.
"We said, 'Look, we've done a bunch of these individual landing page tests with these third-party tools. We're seeing success. I know the metrics on an aggregate. If I saw a similar gain into larger pieces of our purchase funnel, here's what the financial impact would be,'" he explained. "And that's how I built out that case to the executive team here — to the entire company."
Step #3. Develop two testing methodologies
After achieving company buy-in to move forward with embedding testing into regular processes, the team developed two testing methodologies to go by: one for small tests and one for large tests.
However, both methodologies begin with the same initial step: decide what to test.
This is determined in a weekly meeting where Hutchings' team, along with a lead developer, discusses which tests are running and decides on and preps the tests that are up next. Upcoming tests are selected based on a prioritization spreadsheet.
While many testing teams may allocate importance to factors such as volume and bounce rates, VacationRoost's spreadsheet is organized by which level in the funnel the page or test would impact.
"Really what you want [to ask] is, 'What's going to have the most dollar impact on the organization?'" Hutchings explained.
He added that the pages leading into the purchase funnel rise to the top of the testing docket because the likelihood of increasing conversion for the whole site is far greater on those types of pages.
"When we make the decision, it also is based off of some of the other tests that are running," Hutchings added. "Because you have to consider if I'm already running a test here, and here, it may make more sense to queue up something somewhere else and start the rotation. So, there are a lot of moving pieces, but it's something that always is a moving target."
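A prioritization spreadsheet like the one described, which ranks candidate tests by expected dollar impact rather than raw traffic or bounce rate, can be sketched as a simple scoring function. The fields, figures and assumed lift below are invented for illustration, not VacationRoost's actual model:

```python
def expected_dollar_impact(monthly_visitors, conversion_rate,
                           avg_order_value, assumed_lift=0.05):
    """Estimate the monthly revenue impact of testing a page, assuming a
    modest relative lift. Pages deeper in the purchase funnel tend to
    score higher because a lift there hits revenue-generating
    conversions directly."""
    return monthly_visitors * conversion_rate * assumed_lift * avg_order_value

candidates = [
    {"page": "landing page", "monthly_visitors": 50_000,
     "conversion_rate": 0.01, "avg_order_value": 900},
    {"page": "checkout", "monthly_visitors": 8_000,
     "conversion_rate": 0.12, "avg_order_value": 900},
]

# Checkout outranks the landing page despite far less traffic.
ranked = sorted(
    candidates,
    key=lambda c: expected_dollar_impact(
        c["monthly_visitors"], c["conversion_rate"], c["avg_order_value"]),
    reverse=True)
```

Ranking by projected dollars rather than traffic is what pushes bottom-of-funnel pages to the top of the docket, as Hutchings describes.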
Based on this decision, the test either falls into the small testing bucket or the large testing bucket. Here are the two methodologies for each.
VacationRoost utilizes small tests for items at the top of the funnel, such as individual landing pages and lead forms.
"Our goal is to always have one of these live and running," Hutchings said. "Usually, it's about two to three running at the same time on different websites or different areas, on a one-off basis."

Identify the target conversion goal
After determining what to test, the team will look at the current page as it exists. Then, they will pick a single metric they want to impact with the test.

Create a hypothesis
Next, the team generates ideas based on their testing partner’s Conversion Heuristic, asking questions such as:
- What can we add or change on the page that wouldn't impact any piece of that equation?
- Which piece of that equation has the highest potential to impact that conversion metric that we picked out?
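For context, the Conversion Heuristic the team references is commonly written as:

C = 4m + 3v + 2(i − f) − 2a

where C is the probability of conversion, m is the visitor's motivation, v is the clarity of the value proposition, i is the incentive to take action, f is friction in the process and a is anxiety about entering information. The coefficients indicate relative weight for thinking through a page, not a literal calculation.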
Based on this equation, the team will hash out the hypothesis for the small test.

Wireframe, design and launch treatment
The team then makes wireframes for the page and launches the small test employing a third-party tag management system.
From a technical standpoint, Hutchings said it's a 10-minute process to launch.

Analyze results and calculate ROI
For these small tests, a third-party validation tool
simply spits out the results. The team also calculates the ROI from these small tests by using a spreadsheet in conjunction with that third-party testing tool.
"We'll use [the third-party tool] and Excel with some pre-built template spreadsheets specific to us that tell us how much we're going to make off of every single little test, even if it's a small landing page," Hutchings explained.
VacationRoost utilizes large tests for the bottom of the funnel, including pages such as search, product, billing and checkout.
Again, these are dynamic, shared pages across various VacationRoost brands. These tests are conducted on the company's internal A/B testing platform. The goal for these large tests is to always have at least one running at a time.

Work with IT to define variables or events
Because custom variables are key to large tests, Development is more involved in this methodology than in the small tests.
"When they're implementing this test, Development has to be involved at that level by setting custom variables and events in the background, in the code," Hutchings explained. "That's one piece on large tests that has to get inserted in the beginning."

Identify target conversion goal for control
Similar to the small tests, the team must identify the target conversion goal for the test's control.

Create hypothesis and set up validation sheet
The team utilizes their partner's Conversion Heuristic to create a hypothesis. However, differing from the small tests, this step for large tests involves setting up a validation spreadsheet. Because large tests are performed on an internal platform rather than through a third-party tool, validation requires more of a manual effort.
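A validation spreadsheet like the one described typically implements a standard two-proportion significance test on the control and treatment conversion counts. Here is a minimal sketch of that calculation using only the standard library; it is a generic illustration, not VacationRoost's actual sheet:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-score for the difference between two conversion rates.

    |z| > 1.96 corresponds to roughly 95% confidence that the observed
    difference is not due to chance.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 2.0% vs. 2.6% conversion over 10,000 visitors each.
z = two_proportion_z(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
```

A third-party tool does this math for you; running tests on an internal platform means owning it yourself, which is part of the "manual effort" Hutchings mentions.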
"Our company, the way we operate, 70% of our business actually happens offline," Hutchings explained. "So, we have 30% that does happen online, which is typical e-commerce, but standardly, people will actually take the conversation offline at some point or in the purchase process with us because it's a complex sale."
This all has to be taken into account when determining the validation of the large tests, so the team implements additional tracking options.

Wireframe, design and launch treatment
With the hypothesis nailed down and the validation sheet in place, the team makes wireframes, designs the test and launches the treatment.
Between Development getting involved and the marketing department creating the page, it typically takes two weeks to launch a large test.

Analyze results and calculate ROI
"On a large test, part of the internal development platform we have is a whole reporting system that tracks all of the online and offline channel leads generated from the test," Hutchings said.
The team then analyzes internal reports from the large test, aggregating this data manually in spreadsheets
to determine the test's success and ROI.
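Since roughly 70% of the business closes offline, the online-plus-offline aggregation described above has to blend two signals into one metric. A minimal sketch of that blending follows; the close rate is an invented assumption, not a VacationRoost figure:

```python
def blended_conversion_rate(visitors, online_bookings, offline_leads,
                            offline_close_rate=0.3):
    """Combine direct online bookings with offline leads, weighting the
    leads by an assumed close rate, to produce a single conversion
    metric per test variant."""
    return (online_bookings + offline_leads * offline_close_rate) / visitors

# 10,000 visitors, 100 online bookings, 300 offline leads.
rate = blended_conversion_rate(10_000, 100, 300)
```

Comparing this blended rate between control and treatment is what lets a test be judged on total business impact rather than online conversions alone.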
Since VacationRoost prioritized testing by creating an internal A/B testing platform and two methodologies for small and large tests, the company has experienced a 12% increase in total conversion rates. The team has run more than 50 tests in the past year and is now running two or three tests at any given time, conducting small and large tests simultaneously.
For marketers looking to transform the culture of their companies from small-project testing to ongoing, beneficial testing, Hutchings stressed two important factors — garnering buy-in and internal support.
In addition to Development, VacationRoost has a dedicated, full-time employee who spends 75% of their time specifically on testing.
"A lot of their job is to push tests forward and their salary and annual goals are partially based off of testing results," he explained.
Having someone who's heading these efforts year round is key to ensuring testing is not only a priority, but a wheel that keeps on turning.

See Hutchings' presentation, "The Nuts and Bolts: 2 testing methodologies an e-commerce company used to increase total conversion rates by 12%," at Web Optimization Summit 2014.
Creative Samples
- Landing page
- Small test validation tool
- Lead form

Sources
Ryan Hutchings, Director of Marketing
Colby Gilmore, Interactive Marketing Manager

Optimizely — VacationRoost's landing page A/B testing tool
MECLABS — Conversion Heuristic

Related Resources
- E-commerce: 3 test ideas to optimize the customer shopping experience
- Web Optimization: How AARP Services boosted renewals by increasing usability
- How a Long-term Optimization Strategy Led to a 6,031% Increase in Leads
- B2B Web Optimization: 140% surge in mobile transactions through responsive design effort