Jun 27, 2012
Case Study

A/B Testing: How a landing page test yielded a 6% increase in leads

SUMMARY: A/B testing offers marketers the ability to continually learn from their customers, and improve marketing efforts through this new knowledge. To drive this point home, HubSpot sponsored a live test at Optimization Summit 2012 that helped drive a 6% increase in leads for one of its landing pages.

This case study breaks down the steps HubSpot and MECLABS took to design, launch and learn from an A/B test, and includes an inside look at the original hypotheses created for this test.
by Paul Cheney, Editorial Analyst

CHALLENGE

Setting up an A/B split test and actually achieving validity can be difficult for most marketers.

First, the technical knowledge required to set up a test often falls outside the skill set of marketers who are used to crafting messages and strategizing campaigns.

Second, aligning departments in a way that reduces the risk of invalidating the test requires some internal marketing -- which isn’t usually part of the marketer’s job description.

At Optimization Summit 2012, marketers Jessica Meher and Megan Anderson from HubSpot, a marketing software company, and optimization researchers Tony Doty and Erin Fagin from MECLABS, MarketingSherpa’s parent company, had to overcome all of these challenges while also considering the input of 280 marketers at the Summit and running a test in just two days.

The team was tasked with creating a test that would run live at the Summit to teach the audience how to execute a test and interpret the results. They tested a lead form page for HubSpot All-in-One Marketing Software, the test’s sponsor, that offered a free special report to download.

Step #1: Choose the right testing platform

For this test, the choice was easy, since HubSpot is a marketing software company that sells a testing platform.

By using HubSpot’s internal testing platform, Meher was able to set up the test without first going through an IT team.

"Eliminating that potential bottleneck gave us the power to get the test set up quickly," Meher said.

HubSpot’s platform allowed Meher and colleagues to point and click their way to developing test treatments and getting them ready for the test.

Of course, you should consider which testing platform is best for your organization. According to Todd Barrow, Senior Manager, Application Development, MECLABS, here are some of the factors you should consider:
  • Cost -- Both cost of the software itself as well as your own costs of implementation

  • IT Requirements -- Can you make changes yourself or do you have to rely on development for each change? If so, what are your development resources? Does the tool have a WYSIWYG editor for basic layout or do you need customized HTML for each treatment?

  • Support -- Do you have phone support or are you on your own?

  • Technology Ecosystem -- What is your current Web platform (for example, a content management system or static pages) and how will this new platform impact it?

"Even so-called ‘free’ software has costs as well," Barrow said. "You have to pay someone who knows how to use it to implement it."

Step #2: Gather and interpret the data you have

No matter how little data you have, there’s always something you can glean from your existing sources.

In this case, Meher had data from last year’s live test at Optimization Summit 2011.

While the test itself yielded invalid data, there were a few data points that helped the team plan this year’s test.

"One thing we noticed was that in the previous test, conversion rates on the lead generation page were some of the highest we’ve seen anywhere, at something like 50%," said Tony Doty.

"So we knew that in order to reach statistical significance, we were going to have to carefully plan our treatments and make sure they were dissimilar enough to make a difference on an already high-performing page."
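Doty’s point about high-performing pages can be made concrete with a standard sample-size calculation for a two-proportion test: the smaller the difference a treatment is expected to make, the more visitors the test needs to reach statistical significance. The sketch below is illustrative only -- the baseline rate, expected lifts, and the usual 95% confidence / 80% power settings are assumptions, not figures from the actual test.

```python
import math

def sample_size_per_arm(p_control, p_treatment, z_alpha=1.96, z_power=0.8416):
    """Visitors needed in each arm to detect a change from p_control to
    p_treatment, using the normal approximation for two proportions
    (z_alpha ~ 95% two-tailed confidence, z_power ~ 80% power)."""
    p_bar = (p_control + p_treatment) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * math.sqrt(p_control * (1 - p_control)
                                       + p_treatment * (1 - p_treatment)))
    return math.ceil(numerator ** 2 / (p_control - p_treatment) ** 2)

# On a page already converting at ~50%, a timid treatment is hard to
# validate: a small change needs far more traffic than a bold one.
print(sample_size_per_arm(0.50, 0.55))  # -> 1565 visitors per arm
print(sample_size_per_arm(0.50, 0.60))  # -> 388 visitors per arm
```

This is why the team aimed for treatments "dissimilar enough to make a difference": a two-day test simply could not gather the traffic needed to validate a subtle change.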

Another lesson the marketers learned from the previous year’s test was that the treatment didn’t represent a particular strategy.

"It was sort of a camel [a horse designed by a committee]," Doty commented.

Validity aside, if one of the treatments last year had won, neither HubSpot nor the live audience at the Summit would have known why.

Step #3: Determine the problems (or opportunities) with your control

Once the data was gathered and the lessons drawn out, the control was studied for problems and opportunities.

"There were no major problems with the control," Doty said. "It was a standard page, but there were ways we could think of to improve it."

Doty and his team thought they could improve the page through four main ways:
  • Reduce the amount of friction on the page

  • Show the monetary value of the offer

  • Add urgency to the offer

  • Make the image show the offer more accurately

Step #4: Create hypotheses and treatments that correspond to the opportunities

Once the opportunities were defined in the control, the team was able to quickly develop hypotheses and treatments for the experiment.

All that was left to do was create pages that corresponded to the opportunities. By creating specific hypotheses for each of the opportunities, the team was essentially guaranteeing that they would learn something from the test. Basically, they were creating horses, not camels, to test.

Here are the hypotheses the team chose:

Hypothesis 1 -- Visitors arriving at the page are highly motivated to download the e-book based on brand recognition. Removing friction from the page will result in a higher conversion rate.

Hypothesis 2 -- Communicating the urgency of the offer, that the free e-book download is a limited-time offer, will result in a higher conversion rate.

Hypothesis 3 -- Adding more visual value to the page, such as charts and graphs from the e-book, will result in a higher conversion rate.

Hypothesis 4 -- Incorporating pricing information to increase the perceived value of the e-book will result in a higher conversion rate.

Step #5: Determine which hypothesis to test

Once the treatments were developed, the team at HubSpot asked the Optimization Summit audience to help choose a treatment to test. Unfortunately, because of the sample distortion effect, the test only had enough traffic to test one treatment against the control. (Note: Contributing factors were the relatively high motivation of the audience combined with the short duration of the test.)

So the team presented the four hypotheses and treatments to the live audience at the Optimization Summit, and asked them to vote.

Hypothesis #2 won the vote and was chosen for the test.

While the team used the Optimization Summit audience as its de facto optimization council, other marketers use peers in the marketing department, the IT department, Sales, or contacts at other companies as their optimization council.

Step #6: Prepare the test and be cautious of validity threats

When preparing to run the test, Doty and the team had to take precautions to guard against validity threats.

"We tracked our channels so that if there were any strange looking numbers from a particular channel, we could remove it or segment the bad data out before things got out of hand," Doty said.

"We also did some quality assurance checks with the treatments and the testing software, so we knew everything would work correctly in the test."
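The channel-tracking precaution Doty describes can be as simple as computing per-channel conversion rates and flagging any channel that strays far from the overall rate, so suspect traffic can be segmented out before it distorts the result. The channels, counts, and the 20-percentage-point threshold below are all hypothetical, purely to illustrate the idea.

```python
# Hypothetical per-channel results: {channel: (visits, conversions)}
channels = {
    "email":     (900, 560),
    "twitter":   (400, 250),
    "linkedin":  (300, 180),
    "affiliate": (200,  40),  # suspiciously low -- maybe bot or mistagged traffic
}

total_visits = sum(v for v, _ in channels.values())
total_conversions = sum(c for _, c in channels.values())
overall_rate = total_conversions / total_visits

# Flag channels whose conversion rate deviates from the overall rate
# by more than an (arbitrary) 20 percentage points, so an analyst can
# inspect them and, if needed, segment that data out of the test.
flagged = [name for name, (v, c) in channels.items()
           if abs(c / v - overall_rate) > 0.20]
print(flagged)  # -> ['affiliate']
```

A check like this is cheap to run during the test, which is exactly when the team wanted to catch "strange looking numbers" -- before things got out of hand.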

Step #7: Run the test

Thanks to the preparation of the team, the test was able to begin almost immediately after the audience chose the hypothesis they wanted to test. Once the test was live, the team drove traffic to the control and treatment using email and social media.

During the Summit, there were several updates from the HubSpot testing platform. Throughout the two days of the Summit, the treatment consistently beat the control, indicating a higher probability that the results were valid. In all, it was a very clean test.



RESULTS

Once the test had enough samples that a winner could confidently be chosen, the team discovered the treatment generated 6.8% more lead form completions than the control at a 97% level of confidence.

"I think what surprised us is the fact that a limited-time offer did do better than the control. We have actually never tested a limited-time offer on one of our content download pages before," said Meher.
"But because it was a small increase, we also learned that a lot of people were going to the page just to get the free guide."

Essentially, through the test, HubSpot learned that customers were primarily interested in the offer itself. With an average conversion rate of 62%, visitors were already highly motivated, and no amount of added urgency could make much of a difference.
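For readers who want to see how a figure like "97% level of confidence" is reached, the sketch below runs a standard pooled two-proportion z-test. The visitor and conversion counts are hypothetical, chosen only to roughly match the reported ~62% baseline and 6.8% relative lift; the calculation itself is the standard one.

```python
import math

def confidence_level(visits_a, conv_a, visits_b, conv_b):
    """One-tailed confidence that B truly beats A, via a pooled
    two-proportion z-test (normal approximation)."""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal CDF

# Hypothetical counts: control converts at 62.0%, treatment at 66.2%
# (a ~6.8% relative lift, echoing the article's reported result).
conf = confidence_level(1000, 620, 1000, 662)
print(round(conf * 100, 1))  # prints roughly 97.5 -- close to the reported 97%
```

Note how sensitive the confidence level is to sample size: halving the hypothetical traffic in each arm would leave the same 6.8% lift well short of the usual 95% validity bar.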

Creative Samples

  1. Control

  2. Treatment #1

  3. Treatment #2

  4. Treatment #3

  5. Treatment #4

  6. Results

Sources

HubSpot All-in-One Marketing Software -- The sponsor of the test, the marketer running the test, and also the testing platform used to run the test

MECLABS (parent company of MarketingSherpa) -- Optimization lab that helped conduct the test

Related Resources

Bad Data: The 3 validity threats that make your tests look conclusive (when they are deeply flawed) -- via MarketingExperiments

A/B Testing in Action: 3 Real-Life Marketing Experiments -- via HubSpot

What to Test: 4 sample landing page treatments from Optimization Summit 2012 -- via MarketingExperiments

Online Marketing Tests: How do you know you’re really learning anything? -- via MarketingExperiments

Online Marketing Tests: How could you be so sure? -- via MarketingExperiments

Email Optimization and Testing: 800,000 emails sent, 1 major lesson learned

3 Live Landing Page Optimization Lessons

