How can you improve your lead generation efforts? Learn from your customers. A/B testing is one way to do that.
by Erin Hogg, Copy Editor
Conducting an A/B split test from launch to result analysis in two days does not sound like a walk in the park for many people, but for the MECLABS team and attendees of Lead Gen Summit 2013, it was a chance to gain testing insights and learn about lead gen in an interactive format.
The goal of this Summit's live test was to learn how the presentation of incentives and the number of form fields impacted lead gen rate … but also to provide lead gen marketers with an example of how to run an A/B test.
Every new project or campaign should be built on the shoulders of all of the projects that came before it.
In this case, presenting a live, interactive test has become a tradition at MarketingSherpa Summits. Beginning with Optimization Summit 2011 and continuing through Optimization Summit 2012 and this year's Optimization Summit 2013, the process of designing a live test for a Summit audience has been streamlined into one that helps audience members gain actionable insights and take ideas back for their own testing efforts.
This year's live test at Lead Gen Summit 2013 was the culmination of lessons learned from past tests, considerations of what does and doesn't work, as well as teamwork and collaboration of multiple departments at MECLABS, parent company of MarketingSherpa.
In partnership with Act-On, a marketing automation company and premier sponsor of Lead Gen Summit 2013, MECLABS ran a two-day test focused on discovering which incentives and form fields yield the highest lead gen rate. However, the main goal of the test was to gain new insight, regardless of the outcome. The test was developed by Kyle Foster, Research Manager, and Brittany Long, Research Analyst, both of MECLABS.
"We wanted to build a test that can’t lose," Foster explained, emphasizing how test design should be focused on learning about the customers. In this way, even if treatments produce a negative result, the increased customer intelligence can be applied to future campaigns for future gains.
The result of weeks of strategy and planning was an A/B split test of incentive offers and form field additions to discover which approach yields the greatest number of leads.
Here are the steps the team took to design and launch this test.
Cross-departmental collaboration can be crucial to effective A/B testing.
According to Foster, the beginning stages of the test strategy revolved around meetings with team members to nail down what would be the best possible test to run at Lead Gen Summit. Foster met with team members involved in previous Summit live tests to discover what worked, and what didn't, in those tests.
From there, the test developers met with the content and marketing teams to receive feedback on the test thus far, and build on their ideas. Then, the team met with members from the MECLABS optimization team. The optimization team supports the development of test plans, experiment designs, wireframes and idea development at MECLABS.
The next step was to present the test design in a Peer Review Session, or PRS. These sessions are an opportunity for any team member or members to present ideas and gain feedback from colleagues they might not normally collaborate with.
This was a beneficial part of the process because Foster and Long had produced several ideas at this point of the test development as to what should be tested and how it should be presented on the landing pages. Through deliberation and collaboration with many members of the MECLABS team, a final test design was proposed and approved.
Lead generation marketers often use incentives in the lead capture process. So it was decided this experiment would focus on the best way to present these incentives along with determining if presenting incentives in a way that connoted more value could encourage prospects to include more information in the lead gen form.
The chosen incentive for this test was a choice of MarketingSherpa Quick Guides, a $45 value, which would be offered for free after the visitor filled out a short or long form.
The live test consisted of a control landing page and two treatment pages. On the control page, visitors would be presented with a MarketingSherpa Quick Guide and a short form to complete. Treatment #1 contained the same short form as the control, but visitors would have the choice of one of three Quick Guides. Finally, Treatment #2 had a choice of incentives, but also required a longer form for visitors to complete before receiving the incentive.
By adding a choice of incentive, the team wanted to see if giving visitors their pick would raise the perceived value of the incentive. In the treatments, the choice was presented as tabs at the top of the page, one for each available incentive.
However, the team also wanted to test whether adding more fields to the lead capture form would add more perceived cost (time, in this case) than the perceived value of the incentive could overcome.
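The three-way split described above — a control plus two treatments — can be sketched in code. This is a minimal, hypothetical illustration (the function and variant names are mine, not part of MECLABS' or Act-On's actual setup) of how visitors could be assigned deterministically to one of the three pages:

```python
import hashlib

# Illustrative names for the three experiences described above.
VARIANTS = [
    "control",      # single Quick Guide, short form
    "treatment_1",  # choice of three Quick Guides, short form
    "treatment_2",  # choice of three Quick Guides, long form
]

def assign_variant(visitor_id: str) -> str:
    """Hash the visitor ID so the same visitor always sees the same page,
    while traffic splits roughly evenly across the three variants."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]
```

Deterministic assignment matters in a test like this: a returning visitor who saw the short form on day one should not see the long form on day two, or the comparison between variants is muddied.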
At this point in test development, all of the ideas and concepts for the pages were finalized and sent to the development team at MECLABS.
The first step for that team was to evaluate the compatibility of Act-On's marketing automation platform with what the team had planned.
Once the development team understood how the platform worked, the test design process went to a design team to plan the look and feel of the pages.
After all of these elements were developed, the pages were built, analytics were set up, and a quality assurance (QA) process began on tracking and development. The entire process from building the pages through QA testing spanned two weeks.
During the opening welcome on day one of Summit, Foster presented the live test to the audience.
In addition to learning the objectives of the test, attendees received a survey sheet.
The survey gave attendees the opportunity to choose which incentives would be options for visitors to the pages, the number of additional form field(s) that should be added, and what those additional form field(s) should ask visitors.
After the surveys were collected, the results were sent back to the lab in Jacksonville Beach, Fla., for the development team to launch the test. Out of 153 surveys, the chosen incentives were:
As for the form field winner, 44% of the audience chose to add only one additional form field. Runner-up at 34% was two additional form fields, and 22% wanted to test three additional form fields.
At 32%, the audience selected asking for a job title as the top choice for the one additional form field. A close contender was asking for company name at 30%.
During Summit, attendees also indicated which treatment they thought would win — 20% chose the control landing page, 64% chose Treatment #1 (choice of offer, short form). Finally, 16% chose Treatment #2 (choice of offer, long form).
With all of these variables in place, the test launched Tuesday at 1 p.m. Eastern Time, day one of Summit, and concluded in the afternoon of day two. From the time the surveys were received until launch, the team had two hours to incorporate the audience's changes into the pages.
To drive traffic to the pages, an email was sent through the Act-On platform to both MarketingSherpa newsletter subscribers as well as a list from Act-On, totaling around 300,000 recipients.
The live test process didn't stop there. As results streamed in from around the world during Lead Gen Summit, the sciences team in Jacksonville Beach evaluated the experiment and collaborated with Foster, who was on-site, to craft updates interpreting the results.
After lunch on the first day, the MECLABS team updated attendees with how the test was performing. At this point in time, the chosen incentives and form fields were revealed to the audience, as well as the predicted winner of the test.
On day two of Summit, attendees were updated first thing in the morning. The control landing page achieved a conversion rate of 58%, while Treatment #1 achieved 57%. However, the statistical level of confidence for Treatment #1's difference was only 33%. Finally, Treatment #2 achieved a conversion rate of 51% at a 99% level of confidence.
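The "level of confidence" figures above come from comparing each treatment's conversion rate against the control's. One common way to compute such a number is a pooled two-proportion z-test; the sketch below uses that approach with made-up sample sizes, since the article does not report how many visitors saw each page or which exact method MECLABS used:

```python
import math

def level_of_confidence(conversions_a: int, visitors_a: int,
                        conversions_b: int, visitors_b: int) -> float:
    """Two-tailed level of confidence (1 - p-value) that two conversion
    rates differ, using a pooled two-proportion z-test."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = abs(rate_a - rate_b) / std_err
    # Standard normal CDF expressed via the error function.
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return 2 * cdf - 1

# Hypothetical sample sizes: at 1,000 visitors per page, a 58% vs. 51%
# split is highly significant, while 58% vs. 57% is not -- matching the
# pattern reported above.
print(level_of_confidence(580, 1000, 510, 1000))  # well above 0.95
print(level_of_confidence(580, 1000, 570, 1000))  # well below 0.95
```

This is why the team could call Treatment #2's drop real while treating the control-vs.-Treatment #1 gap as noise: the same one-point difference means very different things depending on sample size and variance.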
Not every test can run perfectly without a hitch, and this test was no different.
One challenge the MECLABS team had was releasing the test and getting the audience's changes in quickly.
However, the test was accidentally released via LinkedIn before the audience's changes were implemented. Luckily, the mistake was caught and the post retracted, the channel was small, and the test was released correctly and on time.
Another complication was an issue with tracking conversions with analytics software. While there were a few hiccups, the team managed to recover and track conversions on the pages.
"The importance of a QA process when testing cannot be overstated. If we had tested the analytics platforms two hours before we launched, instead of afterward, we could have had those problems fixed," Long said.
In the end, there was no significant difference between the control (single offer, short form) and Treatment #1 (choice of offer, short form). Treatment #2 underperformed both the control and Treatment #1.
From the results, marketers learned that a choice of incentive does not necessarily increase the perceived value of that incentive. Having a choice of three MarketingSherpa Quick Guides versus a single guide did not make a significant difference in the number of leads generated from the form.
Also, when comparing the control page with Treatment #2 (choice of offer, long form), a takeaway from the results is that the perceived value of the Quick Guide did not outweigh the perceived cost of having to fill out a longer form with more information.
The top choices for incentive selected by visitors to the site were:
The results of the live test are as follows:
Treatment #2 with the additional form field asking for job title reduced lead gen rate by 11%.
(Editor’s note: Due to the teaching nature of the live test, two validity threats were intentionally introduced that might have skewed test results. Some test recipients were likely primed, since members of the Lead Gen Summit audience may have received test treatments after learning about the test; this may have introduced a small selection effect. Also, since the test had to be completed within the short time frame of Lead Gen Summit, a history effect may have been introduced.
It should be your goal in your A/B testing to create a controlled experimental environment that accounts for these validity threats. For more information about validity threats, you can read or watch "Bad Data: The 3 validity threats that make your tests look conclusive (when they are deeply flawed).")
"The Control and Treatment #1 performed similarly, but when you compare it to Treatment #2, it had a significant difference for both the control and Treatment #1. So if anything, we take away that Treatment #2 lost," Long said.
However, "it depends on what you mean by lost, because Treatment #2 had the additional form field, it was an 11% relative decrease in lead gen. But to some people, extra information, like knowing job title, might be worth taking an 11% loss. That's up to the organization," she added.
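The 11% figure Long mentions is a simple relative-change calculation against a baseline page's rate. A quick sketch, using the rates reported above (the article does not state which baseline MECLABS divided by, so the pairing below is an assumption for illustration):

```python
def relative_change(baseline_rate: float, treatment_rate: float) -> float:
    """Relative lift (positive) or decrease (negative) vs. the baseline."""
    return (treatment_rate - baseline_rate) / baseline_rate

# Treatment #2 (51%) measured against Treatment #1 (57%) works out to
# roughly an 11% relative decrease in lead gen rate.
print(round(relative_change(0.57, 0.51) * 100))  # -11
```

Note the distinction this makes plain: the absolute drop was 6 percentage points, but relative to the baseline rate it is an ~11% decrease — the figure an organization would weigh against the value of the extra form field data.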
"Even though the treatment didn't win, it doesn't mean it wasn't a successful test," Foster said.
Another element the team analyzed when reviewing the results was the tabbed layout approach. The team knew a drop-down would hide the incentive choices, so using the tabs on the top of the page was a way for visitors to see all of the available incentives.
Visitors chose the Lead Generation Quick Guide most frequently on the treatment pages, but it was the default choice, the first in the order of tabs.
"If I could keep testing this, I would test the tabs in the layout. I'm not sold that this was the best approach. I want to see if the tabs created friction, not the additional form field," Foster said.
Another element the team called into question after the test was the value of the Quick Guide, and the importance of offering an incentive relevant to your audience.
"I feel like for those on the MarketingSherpa mailing list, they understand what a Quick Guide is, and know the value that's attached. But, maybe those on the Act-On list did not know what a Quick Guide was and asked, 'Why do I want these? Is it reputable?'" Long said.
Overall, the goal of the test performed at Lead Gen Summit was to gain new customer insights, which is exactly what occurred.
"Even if you're losing, it could still be successful. It's more about the learning. Even though lead gen was flat, it gave us information about what our customers wanted," Foster said.
From one tester to another, Long explained her key takeaways from the test for marketers to apply to their own testing efforts.
"Be decisive about your choices about what you're going to test before you run the test, and make sure that you have safeguards in place for your analytics, or anything really, so that you know the data you're collecting is going to be accurate and valid," Long said.
MarketingSherpa Lead Gen Summit 2013 Wrap-Up: Top 7 lead capture, qualification and nurturing takeaways
Testing and Optimization: Radical website redesign program improves lead gen 89%
Email Marketing: Change in incentive offer causes a 25% increase in email subscribers in one day for nonprofit
Online Marketing Conversion: “Free” is a Pretty Strong Incentive
Landing Page Optimization: For the best test ideas, look beyond yourself
Landing Page Optimization: Identifying friction to increase conversion and win a Nobel Prize
Act-On, sponsor of the Lead Gen Summit 2013 live test