November 02, 2004
Should your landing pages for paid search ads be different from your other online campaign landing pages? Turns out the answer may be yes. Consumers using search engines are in a slightly different mindset than folks who happen to see your regular online ads elsewhere on the Web. So, you need to tweak your regular landing page to get a higher conversion rate. Our new Case Study reveals how one marketer tested 22 different variables from headlines to submit buttons against banner vs. search traffic. Yes, results data and creative samples included:
Way back in 2003, NextStudent's marketing team tested six different landing pages to arrive at their control for online ads promoting their student loan consolidation services.
These initial tests revealed three critical factors:
1. A one-page version that includes both offer copy and a free registration form worked far better than a two-page version requiring users to click to a second page to complete the form.
2. Headlines that precisely matched the banner or search ad the user clicked worked best. But if you had to rely on a single landing page to save resources, then the headline that came closest to matching the gist of the ad copy worked best.
3. Super-clean design featuring a white background, brief bullet-pointed copy, at least 10-point type, no extraneous links or navigation, and no obvious graphics beyond the logo and submit button, worked the best.
Happy with their results, the team made the final winner their official landing page (link to sample below). Then, Marketing Manager Scott Linzer began to notice a trend. Paid search ad clicks converted to filling out the loan qualification form at a different rate than banner ad clicks.
He figured this was because someone actively searching on a term such as "consolidating student loans" was deeper into the buying cycle than someone who impulsively clicked a banner ad offering the same service.
But, it made him think. Since these searchers were in a slightly different mindset, would they convert better to a slightly different landing page?
First Linzer had to get a budget for creative help and a Web analytics package to conduct more landing page tests. "We figured if we could double the conversion rate, it was like spending half the ad budget for the same amount of results."
But, doubling conversions is a pretty tough aim, so the team targeted a safer 20% lift, and figured out how much that was worth to the bottom line. They needed to show a return quickly, so this time they used a Web analytics package that could test 22 different variables with reliable results in under two months.
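The budget logic behind that target is easy to sketch. Here's a quick back-of-the-envelope calculation with made-up numbers (the Case Study doesn't publish NextStudent's actual spend or conversion rate):

```python
# Hypothetical figures for illustration only -- NOT NextStudent's real numbers.
ad_spend = 10_000.0     # monthly ad budget (assumed)
clicks = 20_000         # paid clicks bought with that budget (assumed)
baseline_rate = 0.05    # baseline landing page conversion rate (assumed)

baseline_regs = clicks * baseline_rate        # registrations before the lift
baseline_cpa = ad_spend / baseline_regs       # cost per registration

lifted_regs = clicks * baseline_rate * 1.20   # a 20% conversion lift
lifted_cpa = ad_spend / lifted_regs           # same budget, more registrations

print(f"Cost per registration: ${baseline_cpa:.2f} -> ${lifted_cpa:.2f}")
# prints: Cost per registration: $10.00 -> $8.33
```

Same spend, roughly 17% cheaper registrations; double the conversion rate and the cost per registration halves, which is exactly Linzer's "half the ad budget" framing.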
To make the tests statistically valid, they split tens of thousands of clicks from the two sources (banner ads and paid search ads) between the control and the test pages. Key variables tested included (in order of priority):
o Headlines
The headlines tested were all winners from past banner and search ads, including:
Test a. The Next Two Minutes Could Save You Hundreds of Dollars
Test b. Slash your student loan payments by up to 59%
Test c. Find Out In 60 Seconds If You're Eligible
Test d. Find Out If You're Eligible - It's FREE
o Submit buttons
As we've reported in the past, the wording, placement, and look of a submit button can be absolutely critical. The team had learned from past tests that a very utilitarian design worked best (see link below for samples), so they stuck with their standard button look. But they tested moving it from the bottom left to the bottom middle of the page.
They also tested wording differences, including:
Test a. Am I Eligible? Find Out Instantly!
Test b. Click to qualify - It's free
(Note: they did *not* test "submit" because that's a guaranteed loser for many sites.)
o Body copy
The team wanted to keep their body copy fairly short because their target demographic -- young adults aged 21-25 -- doesn't tend to have a long attention span online, and every line of copy would push the rest of the form further down the page, increasing scrolling.
They then tested three different copy styles, each occupying the same amount of space on the page:
Test a. factual bulleted copy from the control page
Test b. factual paragraph-form copy
Test c. dramatic sales copy based on their all-time winning banner ad creative "College Graduates Kissing Federal Government Officials"
o Additional copy
There wasn't much more copy on the page, but the team did test:
Test a. long and short forms of the privacy and "your SSN# is safe with us" blurb near the bottom of the page.
Test b. renaming the page section headers, such as testing "About You" versus "Almost done!"
o Layout and design elements
The design team was eager to see results from these four tests:
Test a. adding a photo of happy college grads to the upper right corner of the page
Test b. putting a light block of color (in print design parlance, a "light screen") behind one section so the page wasn't one long stretch of only black and white.
Test c. moving the form's box descriptors such as "First Name" "Last Name" from the left of each box to a position on top of each box.
Test d. leaving question radio buttons empty vs. pre-checking them with the most common answers
Tests revealed two different winning landing pages -- neither of which was the old control. The search marketing winner increased registrations by 43.1%, and the banner ad winner increased registrations by 49.4%. (Link to samples of both below.)
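Lifts of that size, spread across the tens of thousands of clicks the team split per page, are well beyond random noise. The article doesn't say how the team judged statistical validity, but a standard sanity check for this kind of split is a two-proportion z-test; here's a minimal sketch with hypothetical counts:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two observed conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: 5.00% control vs. 7.15% test page (a ~43% lift),
# 10,000 clicks per page. These are NOT NextStudent's real numbers.
z = two_proportion_z(conv_a=500, n_a=10_000, conv_b=715, n_b=10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 means significant at the 95% level
```

At these volumes the z-score lands far above the 1.96 threshold; with only a few hundred clicks per page, the same 43% lift could easily be a coin-flip artifact.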
Happily, the increase in registrations didn't cause a corresponding decrease in the value of each submitted application. (This is a very real danger.)
What were the differences? Overall the search landing page was a bit more serious and factual than the banner ad page.
For example, banner-driven visitors responded better to the word "free" in the headline and submit button. However, "free" reduced conversions for search visitors, who preferred the headline "Find Out In 60 Seconds If You're Eligible" with a similar button.
Linzer says, "We believe the search audience is looking for bare bones details about the program, the facts. They may need less handholding through the process, less selling."
Some factors were winners no matter where traffic came from, including keeping the general look and feel of the old control, plus:
- Adding the photo of the happy graduates to the upper right corner
- Adding a color block behind text midway through the page to help define areas
- Keeping the bullet point body copy from the control
- Shortening the privacy/SSN# security info (warning, not a good idea for an older demographic)
- Pre-checking radio buttons with the most common answers
The biggest surprise? Yes, moving the form's descriptive tags from the left of each box to above it made a difference. However, banner ad visitors preferred tags to the left of form boxes, while search visitors responded better to tags on top of boxes.
That last result was especially startling because the extra vertical space taken up by tags on top of boxes pushed the submit button below the fold for many visitors. And generally, anything that pushes a response button below the fold is a guaranteed response depressor.
Which just goes to show you have to test even proven "facts" about landing pages to see what works with your traffic.
Linzer says, "We've locked in the best performers, but I don't think the testing will ever be done. The more minutiae we can test, the more we can increase conversions and spend less money."
Useful links related to this article:
Creative samples - screenshots of tested landing pages: http://www.marketingsherpa.com/nextst/ad.html
Optimost -- the Web analytics system NextStudent used to create and run the tests quickly and easily: http://www.optimost.com
Mighty Interactive - the creative agency that helped with the copywriting and design of the landing page tests: http://www.mightyinteractive.com/