Feb 11, 2014
Article

Marketing Research Chart: How do CMOs make decisions?

SUMMARY: The point of website optimization is to improve conversion on your site.

But how do you know if a potential change will truly make an improvement? Are you just making changes for the sake of making changes?

In today's never-before-published chart, we'll take a MarketingSherpa look at how CMOs are deciding what changes to make to their websites.
by Daniel Burstein, Director of Editorial Content

In the MarketingSherpa 2012 Website Optimization Benchmark Report survey, we segmented the respondents and asked the following question only of chief marketing officers.

Q. For CMOs: In your organization, how do you make the final decision regarding which version of a page/process should be uploaded to your live site?

[Chart: How CMOs make the final decision on which page version goes live]



When crafting a Benchmark Report at MarketingSherpa, we sometimes have data that does not make it into the final report and ends up on the "cutting-room floor."

I thought of the above chart, which did not make it into the final version of the MarketingSherpa 2012 Website Optimization Benchmark Report, when I read the following insight from Peter M. Senge, Senior Lecturer in Leadership and Sustainability, MIT Sloan School of Management, taken from "The Leader's New Work: Building Learning Organizations," MIT Sloan Management Review, Fall 1990:

Our minds literally move at lightning speed. Ironically, this often slows our learning, because we leap to generalizations so quickly that we never think to test them.

We then confuse our generalizations with the observable data upon which they are based, treating the generalizations as if they were data.

The frustrated sales rep reports to the home office that "customers don't really care about quality, price is what matters," when what actually happened was that three consecutive large customers refused to place an order unless a larger discount was offered. The sales rep treats her generalization, "customers care only about price," as if it were absolute fact rather than an assumption (very likely an assumption reflecting her own views of customers and the market).

This thwarts future learning because she starts to focus on how to offer attractive discounts rather than probing behind the customers' statements. For example, the customers may have been so disgruntled with the firm's delivery or customer service that they are unwilling to purchase again without larger discounts.

Do you treat your generalizations about your website audience as absolute fact?

Senge calls this "seeing leaps of abstraction." Are you making leaps of abstraction on your website? How are you challenging your generalizations?

As Senge said above, many generalizations are never tested. And while testing generalizations can be difficult in business management, website optimization offers a simple yet powerful tool for doing exactly that: A/B testing.

So, do you treat your assumptions as fact, or do you test and learn about your audience?

That's what got me thinking about the above benchmark data. As you can see, only 15% of CMOs use validated test results to determine what goes live on their website.

While at least 29% of marketing departments decide collaboratively, so the decision isn't made in the vacuum of a corner office, a marketing department deciding on its own is still only a slightly larger vacuum.

With A/B testing, marketers can break out of that vacuum, challenge their own assumptions, and see how customers really act when changes are made to a website. That's the difference between website optimization and random site changes. True website optimization requires continual testing and learning more about your customers to see what changes improve conversion and the user experience — and, sometimes, when the site is better left alone.

In the above example that Senge used, you could, for example, set up an A/B split test pitting a discount versus an iron-clad delivery promise or a robust customer service SLA and see how customers actually respond in a real-world situation. This can help you dig into customer desires, and fuel future learning to help you better serve customers by better connecting to their motivations — and thereby improving marketing performance.
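As a purely illustrative aside, here is a minimal sketch of how the results of such a split test might be validated before a version goes live, using a standard two-proportion z-test. The conversion counts below are made-up numbers for the hypothetical discount-versus-delivery-promise test, not data from the survey:

```python
import math

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B split test.

    conv_a, conv_b: number of conversions for version A and B
    n_a, n_b: number of visitors shown version A and B
    Returns (rate_a, rate_b, z, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical: version A = discount offer, version B = delivery promise
rate_a, rate_b, z, p = ab_test_significance(120, 2400, 156, 2400)
```

With these invented numbers, version B's 6.5% conversion rate beats version A's 5.0% with a p-value below 0.05, the kind of "validated test result" only 15% of CMOs in the chart above rely on. With smaller samples, the same lift could easily be noise, which is why significance checks matter before declaring a winner.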

What have you learned from your website optimization tests?

We want to hear from you as well. The call for speakers for Web Optimization Summit 2014 in New York City, May 21 through 23, is currently open, and we are looking for brand-side e-commerce and subscription marketers to share what they've learned. Tell us about your conversion optimization efforts and hopefully I'll have the chance to interview you on-stage in the Big Apple about what you've learned through your split tests.

Related Resources

Web Optimization Summit 2014 May 21-23, New York City

A/B Testing: Why don't companies track ROI of testing and optimization programs?

Marketing Research Chart: Measuring website optimization ROI

Marketing Research Chart: What are the most prevalent website optimization priorities?

