by Daniel Burstein, Director of Editorial Content
In the MarketingSherpa Website Optimization Benchmark Report survey, we asked 1,548 marketers:
Q. What are the contributing factors for your organization when determining what to test?
In a perfect world, we would test absolutely everything we do in order to see how customers react to different headlines, images, copy and calls-to-action in real-world settings. Also in this perfect world, the split testing tool would be parked right next to the marketing unicorn.
Of course, we don't live in a perfect world. Marketing is full of trade-offs, and deciding what to test is no different. So, what factors drive website optimization test design?
44% of marketers determine what to test based on Web analytics results
By looking at the analytics of your funnel or buyer's process, you can identify where on the path to purchase or conversion customers are leaving your pipeline. By focusing your testing and optimization on these barriers to conversion, you can help improve the results of the entire process.
For example, if customers are opening an email but not clicking through, you may look to test the headline, copy, call-to-action and layout of the send.
However, if they click through but abandon the landing page, you may want to focus your testing on the messaging or layout of the page itself, or on its registration form.
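The funnel analysis described above can be sketched in a few lines of code: given stage-by-stage visitor counts exported from your analytics tool, compute where the largest share of prospects drops out, and prioritize that transition for testing. The stage names and counts below are purely illustrative assumptions, not figures from the survey.

```python
def funnel_dropoffs(stages):
    """Given ordered (stage_name, visitor_count) pairs, return the
    percentage of visitors lost at each transition down the funnel."""
    dropoffs = []
    for (name_a, count_a), (name_b, count_b) in zip(stages, stages[1:]):
        lost = (count_a - count_b) / count_a * 100
        dropoffs.append((f"{name_a} -> {name_b}", round(lost, 1)))
    return dropoffs

# Hypothetical funnel numbers for illustration only
funnel = [
    ("Email opened", 10000),
    ("Clicked through", 2500),
    ("Form started", 1000),
    ("Registration completed", 300),
]

for step, pct in funnel_dropoffs(funnel):
    print(f"{step}: {pct}% lost")

# The transition with the biggest drop-off is the strongest
# candidate for your next A/B test.
biggest = max(funnel_dropoffs(funnel), key=lambda pair: pair[1])
```

With these sample numbers, the email-to-click transition loses 75% of prospects, so that send's headline, copy and call-to-action would be the first place to test.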
Respondents from bigger companies were somewhat more likely to make testing decisions based on Web analytics results: 54% of medium companies (100 to 1,000 employees) and 49% of large companies (more than 1,000 employees) indicated they do so, compared with only 45% of small companies (fewer than 100 employees).
40% of marketers test based on their intuition and estimated impact on the page
Intuition can be the most difficult input to rely on when deciding what to test. On one hand, no one has a better understanding of your products, website and marketing campaigns than you and your team, so you probably already know of some problem areas you would like to improve.
At the same time, because no one has a better understanding of your products, website and marketing campaigns than you and your team, you may be overlooking some challenges — ways in which the customer doesn't understand your website and marketing that come naturally to you since you are so well-versed in the creation of these campaigns.
To help overcome that blind spot, one Benchmark Report survey respondent recommended conducting a little usability research to help inform your A/B testing:
Test your website with a real customer and actually sit next to him or her to see what [they] think and do when going through the website. This reveals very valuable information regarding the effectiveness of the site. Keep pages simple — little text, no special effects around informational elements. [For example,] we had images that will slide over to uncover a product display upon mouseover. Turned out [the] majority of visitors didn't know [about] this interactive element and left the website without ever noticing the products.
The answers you get are shaped by the questions you ask
A/B testing can be very powerful, but every successful test begins with knowing what to test, and which elements are not a high enough priority to justify the test resources. As one Benchmark Report survey respondent summed it up, what it takes to be successful in Web optimization is to "know what to test, be patient and have realistic goals."
Related Resources
Web Optimization Summit 2014 in New York City, May 21-23, 2014
Marketing Research Chart: Critical website conversion path elements
Testing: 3 common barriers to test planning
Marketing Research Chart: What are the most prevalent website optimization priorities?