Apr 29, 2014

Marketing Research Chart: Determining which tests should be prioritized

SUMMARY: Once marketers start A/B testing and see what they can achieve by learning which messaging works best with their customers, many of them want to test everything.

Of course, you can't test everything. We're all limited by budgets, resources, time and other factors.

In this week's MarketingSherpa Chart of the Week, we'll take a look at the factors that drive marketers' testing decisions.
by Daniel Burstein, Director of Editorial Content

In the MarketingSherpa Website Optimization Benchmark Report survey, we asked 1,548 marketers:

Q. What are the contributing factors for your organization when determining what to test?


In a perfect world, we would test absolutely everything we do in order to see how customers react to different headlines, images, copy and calls-to-action in real-world settings. Also in this perfect world, the split testing tool would be parked right next to the marketing unicorn.

Of course, we don't live in a perfect world. Marketing is full of trade-offs, and deciding what to test is no different. So, what factors drive website optimization test design?

44% of marketers determine what to test based on Web analytics results

By looking at the analytics of your funnel or buyer's process, you can identify where on the path to purchase or conversion customers are leaving your pipeline. By focusing your testing and optimization on these barriers to conversion, you can help improve the results of the entire process.

For example, if customers are opening an email but not clicking through, you may look to test the headline, copy, call-to-action and layout of the send.

However, if they click through but abandon on the landing page, you may want to focus your testing on the messaging, layout or registration form of the page itself.
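The funnel analysis described above can be sketched in a few lines of code. This is only an illustration: the stage names and visitor counts below are hypothetical, and a real analysis would pull these numbers from your Web analytics tool.

```python
# Hypothetical funnel counts from web analytics (illustrative numbers only).
funnel = [
    ("email_opened", 10_000),
    ("clicked_through", 1_800),
    ("landing_page_viewed", 1_750),
    ("form_started", 600),
    ("purchase_completed", 150),
]

def drop_off_rates(stages):
    """Return (from_stage, to_stage, drop_rate) for each adjacent pair of stages."""
    rates = []
    for (name_a, count_a), (name_b, count_b) in zip(stages, stages[1:]):
        drop = 1 - count_b / count_a  # share of visitors lost at this transition
        rates.append((name_a, name_b, drop))
    return rates

def biggest_barrier(stages):
    """The transition losing the largest share of visitors -- a candidate for testing."""
    return max(drop_off_rates(stages), key=lambda r: r[2])

if __name__ == "__main__":
    for a, b, drop in drop_off_rates(funnel):
        print(f"{a} -> {b}: {drop:.0%} drop-off")
    a, b, drop = biggest_barrier(funnel)
    print(f"Largest barrier: '{a}' -> '{b}' ({drop:.0%} drop-off)")
```

With these made-up numbers, the email-to-clickthrough transition loses the largest share of visitors, so email copy and calls-to-action would be the first test candidates.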

Respondents from bigger companies were somewhat more likely to say they make testing decisions based on Web analytics results: 54% of medium companies (100 to 1,000 employees) and 49% of large companies (more than 1,000 employees) indicated they do so, compared with only 45% of small companies (fewer than 100 employees).

40% of marketers test based on their intuition and estimated impact on the page

Intuition can be the most difficult source for deciding what to test. On one hand, no one has a better understanding of your products, website and marketing campaigns than you and your team, so you probably know of some problem areas you would like to improve.

At the same time, because no one has a better understanding of your products, website and marketing campaigns than you and your team, you may be overlooking some challenges — ways in which the customer doesn't understand your website and marketing that come naturally to you since you are so well-versed in the creation of these campaigns.

To help overcome that blind spot, one Benchmark Report survey respondent recommended conducting a little usability research to help inform your A/B testing:

Test your website with a real customer and actually sit next to him or her to see what [they] think and do when going through the website. This reveals very valuable information regarding the effectiveness of the site. Keep pages simple — little text, no special effects around informational elements. [For example,] we had images that will slide over to uncover a product display upon mouseover. Turned out [the] majority of visitors didn't know [about] this interactive element and left the website without ever noticing the products.

The answers you get are shaped by the questions you ask

A/B testing can be very powerful, but every successful test begins with knowing what to test, and which elements are not a high enough priority to warrant test resources. As one Benchmark Report survey respondent summed it up, being successful in Web optimization means you must "know what to test, be patient and have realistic goals."

Related Resources

Web Optimization Summit 2014 in New York City — May 21-23, 2014

Marketing Research Chart: Critical website conversion path elements

Testing: 3 common barriers to test planning

Marketing Research Chart: What are the most prevalent website optimization priorities?
