June 09, 2011
Event Wrap-up

Optimization Summit 2011 Wrap-up: 6 takeaways to improve your tests and results

SUMMARY: We're back from the southern heat of Atlanta, having capped off the first-ever Optimization Summit. We've brought back six takeaways you can use to improve your optimization tests and marketing.

See how you can improve metrics reports, write better blog posts, and even make use of landing page tests that have poor results. Also, find out which types of tests aren't reliable.
by Adam T. Sutton, Senior Reporter

Last week, we closed the first-ever Optimization Summit. Marketers from across the country came to Atlanta to hear the latest optimization research and case studies inside the monolithic Westin Peachtree Plaza Hotel, and are now likely scrutinizing their own landing pages and feverishly planning tests.

Our notebooks are bursting with insights from the two-day event. If you could not join us, or you just need a refresher, don't worry. We have you covered.

Below are six takeaways we pulled from the pile. You'll find out which metrics you should stop ignoring, why you should never copy another company's landing page, and how negative test results can help your marketing.

Takeaway #1. The goal of a test is to learn something

Dr. Flint McGlaughlin, CEO & Managing Director, MECLABS, told marketers in the kick-off session not to beat themselves up over tests that yield negative results.

"The goal of a test is to get learning, not a lift," Dr. McGlaughlin said.

Optimization tests that are statistically valid are valuable. They provide insights into an audience's behavior -- even if the treatment does not outperform the control. As long as the test is valid, the results should help you learn more about your visitors' needs and preferences.

For example, Dr. McGlaughlin highlighted two landing page tests that decreased response rates by more than 50 percent from the control page. After analyzing the data, the team hypothesized that the channel driving traffic to the page had already presented the offer to visitors. The copy on the pages was unnecessary and slowed visitors on the path to conversion.

The team then tested a radically different page that cut nearly all the copy to prevent it from slowing visitors further. The results: 78 percent increase in conversion rate.

As the example shows, optimization tests with poor results still provide valuable information about your audience. If you learn from every test and apply those lessons moving forward, you will improve your results.

Takeaway #2. Measure the value and segment the data

Matt Bailey, Founder & President, SiteLogic, railed against what he called 'caveman analytics.' Reports on page views, time spent, page rankings and similar metrics are red herrings, he said. Instead, marketers should focus on the amount of revenue, or "value," generated.

"Value tells you how many people have met the goal, what the conversion rate is, and ultimately how much money you made per visit and overall," Bailey said. "Then you make decisions based on value rather than based on numbers."

After marketers uncover the value of their marketing channels, they need to segment the data to compare the value of various pages, audiences, campaigns, etc. Comparing these segments will reveal where to invest, Bailey said.

Two examples of how value-based analytics can refocus a marketing program:

- SEO: "Page ranking becomes irrelevant when you start focusing on value, because you can find out which words mean nothing and which words mean everything and you can go after the words that bring you money," Bailey said.

- Lead generation: Bailey mentioned an experience with a real estate website. His team wanted to uncover the marketing channels that drove the highest-quality leads. "We looked at where the sales came from and found that every single sale came from a link on another website that we bought for $20 a year."
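
For the technically inclined, here is a minimal sketch, in Python, of the value-per-segment math Bailey advocates. The channel names and figures are invented for illustration; they are not from his presentation.

    # Hypothetical visit, conversion and revenue totals per traffic segment.
    # All numbers are made up to illustrate the calculation.
    segments = {
        "organic search": {"visits": 12000, "conversions": 240, "revenue": 18000.0},
        "paid search":    {"visits": 8000,  "conversions": 120, "revenue": 7200.0},
        "referral link":  {"visits": 300,   "conversions": 30,  "revenue": 9000.0},
    }

    for name, s in segments.items():
        conversion_rate = s["conversions"] / s["visits"]
        value_per_visit = s["revenue"] / s["visits"]
        print(f"{name:15}  conv {conversion_rate:6.2%}  value/visit ${value_per_visit:.2f}")

Ranked by value per visit rather than raw traffic, the tiny referral segment ($30.00 per visit) beats both search channels -- exactly the kind of $20-a-year link Bailey's real estate client discovered.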

Takeaway #3. Write "uncomfortable" blog posts

People who are using search engines to research purchases are not shy about what they're looking for. Many of us have used these phrases:

o "Product A vs. Product B"
o "Problems with Product A"
o "How much does Product B cost"

Even though people are hungry for this type of information, too few companies publish it, said Marcus Sheridan, Co-Owner, River Pools & Spas.

Sheridan described how blogging turned his company's website into the world's most-trafficked swimming pool site. One key tactic was refusing to shy away from discussing his company's products.

"We can't be afraid to talk about anything the consumer wants to know about; the good, the bad, and the ugly," Sheridan said.

This includes blog posts that cover:

o How your products compare to alternatives
o Potential problems with your products (and how to avoid them)
o How much your products cost (and if the price varies, why)

The title of River Pools & Spas' most-trafficked blog post:
"Fiberglass swimming pool pricing cost guide by River Pools"

Takeaway #4. Test re-targeted display advertising

Marketers spend their careers attracting relevant visitors to their websites. But when a visitor leaves without converting, is all that investment wasted?

Not necessarily, according to Randhir Vieira, Director of Product Marketing, Eye-Fi. Vieira described how his team increased Eye-Fi's conversion rates. One important tactic he mentioned was to use re-targeted display advertising.

Re-targeting systems tag visitors when they arrive on a brand's website. When tagged visitors leave, they see display ads for the brand on other sites across the Web (typically sites in an ad network). Why bother doing this?

"The reason we do it is that these are high-quality people. They come to your site. They are interested in your products. Otherwise they would not come there… Re-targeting gives you another opportunity to get in front of them with different messages that may work for them and an opportunity to bring them back to the funnel that you've worked so hard to optimize," he said.

- Look for 'view-through' conversions

Eye-Fi's re-targeting program was not immune to the notoriously low clickthrough rates that plague most display ads. Of the people who clicked, 7 percent converted, "which kind of stinks," Vieira said, because the site converted 8 percent of visitors overall.

"However, we looked at something called 'view-through conversions.' They are people who came to our site, went away, saw the ads on the other sites, but did not click on them. Yet, they came back to our site [on their own accord and converted]."

This stat hit 88 percent -- or about 12-and-a-half times the conversion rate for people who clicked the ads.
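
For readers checking Vieira's arithmetic, a trivial Python sketch (the two rates come from his talk; the division is ours):

    click_conversion_rate = 0.07  # visitors who clicked a re-targeted ad
    view_through_rate     = 0.88  # saw the ads, returned on their own, converted

    print(view_through_rate / click_conversion_rate)  # ~12.6, i.e. about
                                                      # "12-and-a-half times"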

"When you add those numbers up, it is phenomenal. That is why I highly recommend that you do retargeting. It is a very cost effective way to get in front of your people," Vieira said.

Takeaway #5. Tests must be valid to be useful

You do not need to be a stock car racer to get onto the highway, and you do not need to be a statistician to optimize landing pages. In both cases, though, you need to follow basic rules to avoid driving off the road.

Validity is a core principle in optimization: test results need to be statistically valid to be accurate. But "doing the math" (or relying on a testing platform to do it for you) only covers one type of validity threat -- the sample distortion effect, which can render a test invalid when too few observations are collected.
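
For those who want to do that math by hand, here is a minimal sketch of the standard two-proportion z-test -- the usual calculation behind a testing platform's "confidence level." The conversion counts below are hypothetical.

    import math

    def two_proportion_z(conv_a, n_a, conv_b, n_b):
        """Two-sided z-test for a difference between two conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided
        return z, p_value

    # Too few observations: the apparent lift (6.0% vs. 5.0%) is nowhere
    # near statistically valid, so keep collecting data.
    z, p = two_proportion_z(conv_a=25, n_a=500, conv_b=30, n_b=500)
    print(f"z = {z:.2f}, p = {p:.2f}")  # roughly z = 0.69, p = 0.49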

The savvy optimizer must account for these other validity threats as well:

o History Effect: When a test variable is affected by an extraneous variable tied to the passage of time -- for example, a news event or an email campaign that changes the mindset of arriving subjects.

o Instrumentation Effect: When a test variable is affected by a change in the measurement instrument, such as response slow-downs from server overload, a splitter malfunction, inconsistent URLs or server downtime.

o Selection Effect: When a test variable is affected by different types of subjects being unevenly distributed among the experimental treatments -- for example, traffic sources split unevenly between treatments, a channel profile that does not match the customer profile, or self-selection bias.
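
One concrete way to screen for that last threat is to compare the mix of traffic sources landing on each treatment. A minimal sketch, assuming the scipy library is available and using made-up hit counts:

    # Chi-square test: are traffic sources split evenly between arms?
    from scipy.stats import chi2_contingency

    #                 email  search  social
    control_hits   = [400,   900,    200]
    treatment_hits = [150,   950,    400]   # sources clearly skewed

    chi2, p_value, dof, expected = chi2_contingency([control_hits, treatment_hits])
    if p_value < 0.05:
        print("Traffic mix differs between arms -- possible selection effect")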

If your test shows that Page B drew a greater response than Page A, and the results are statistically valid and free of the validity threats above, then you can be confident that Page B is the better choice.

If your results are not valid, then you cannot be sure Page B is the better performer. Selecting B over A might be a mistake, because the decision might be based on flawed data. This is why optimization tests must be valid.

- Live page optimization example

At the Summit, we saw an example of how validity threats stretch beyond the "confidence levels" provided by most testing platforms. Dr. McGlaughlin led a live A/B landing page test that spanned the Summit's two days.

The results seemed clear at first, but quickly wavered. First one page was the high performer. Then the second page became the high performer. Then the pages switched again, and so on. This is an indication that you need to take a deeper dive into the data.

MarketingExperiments' researchers dug into the results and found the problem. Several people had tweeted direct links to the control or treatment page. Since these links bypassed the test's randomization tool (which randomly served visitors page A or B), they skewed the results and invalidated the test.
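
What does fixing that look like in practice? A hypothetical sketch -- the field names and splitter URL are ours, not MarketingExperiments' actual tooling -- that drops any visit that reached a test page without passing through the randomizer:

    # Keep only visits that were randomly assigned by the splitter.
    # Visits arriving via tweeted direct links were never randomized,
    # so they are excluded from the analysis.
    SPLITTER_URL = "http://example.com/test/splitter"

    visits = [
        {"page": "A", "referrer": SPLITTER_URL,              "converted": True},
        {"page": "B", "referrer": "http://twitter.com/home", "converted": True},
        {"page": "B", "referrer": SPLITTER_URL,              "converted": False},
    ]

    valid = [v for v in visits if v["referrer"] == SPLITTER_URL]
    print(f"kept {len(valid)} of {len(visits)} visits for analysis")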

Dr. McGlaughlin used the test's results to emphasize his point: "There are multiple threats to our validity -- and you need to achieve real validity to make good decisions."

Takeaway #6. Everyone has unique challenges -- focus on yours

Through all the case studies and discussions, it became clear that every marketer confronts a unique set of optimization challenges when testing. For example:

o Cara deBeer, SEM Manager, Sokolove Law, inherited a large paid search advertising program with cost-per-click rates above $200.

o Paul Terry, Senior Manager of Optimization, Consumer Source, described how his company had no website expertise inside the organization, yet had to transition from a print-based strategy to an Internet-based one.

o As mentioned above, Vieira's team at Eye-Fi faced an audience skeptical that the product worked as the company claimed.

- Their challenges are not your challenges

The diversity of challenges these marketers overcame is inspiring. It also underscores a point made by Boris Grinkot, Associate Director of Product Development, MarketingSherpa, who said that copying another company's success is useless.

"Imitation is not the highest form of [optimization]. You cannot just look at someone's success and copy what they do," Grinkot said.

Every company has unique challenges. Their products, organizations, audiences, budgets and values are completely different. To copy another company's landing pages is to assume you face the same challenges -- and you never do. You must test to uncover the pages that fit your unique situation.

Useful links related to this article

CREATIVE SAMPLES
1. Landing page treatments 1 and 2
2. Final landing page treatment

Optimization Summit 2011 - Schedule and Slides

Optimization Summit: Tests with poor results can improve your marketing

Live from Optimization Summit: “Clarity Trumps Persuasion”

How do I know if my test data is valid?

Online Marketing Tests: How do you know you’re really learning anything?

SiteLogic

River Pools & Spas

Eye-Fi

Sokolove Law

PRIMEDIA - The parent company of Consumer Source, Inc.

