May 01, 2013
Case Study

Testing and Optimization: SAP's Test Lab increases digital leads 27% [Part II]

SUMMARY: Sometimes how-to articles with many sources are so extensive, the article is split into two parts. That rarely happens with a case study. But, this very deep dive into SAP's online testing and optimization program is one of those unique exceptions.

Last week, we began sharing SAP's Test Lab with you, and this week, we conclude that process with details of actual tests run by the program, and a look at challenges in testing and optimization. Plus, you'll read about the big picture learnings from this impressive effort.
by David Kirkpatrick, Senior Reporter

CHALLENGE

This week's B2B article features part two of a case study on SAP's Test Lab. During our interview with Shawn Burns, Vice President of Digital Marketing, SAP, he simply provided too much great information to fit into a single article.

Part one covered how Burns, as the executive sponsor asked to create enterprise Web analytics at SAP, first developed a "single source of truth" for Web data collection across SAP's global Internet properties.

From there, the article explained how the program was developed strategically, and then tactically with a testing queue capable of handling around 25 Web tests each quarter.

Part two provides deeper insight into how the Test Lab was sustained, some of the challenges faced and actual examples of tests SAP has run on its Web properties. You'll also learn Burns' full takeaways from implementing and sustaining SAP's Test Lab.

CAMPAIGN

This week, we pick up where last week's article left off.

Step #6. Achieve company-wide buy-in for the testing and optimization program

Successful testing and optimization includes developing a testing culture, most easily recognized when there is buy-in both at the highest levels in the company, and within the team engaged in testing. Burns said buy-in is an ongoing challenge.

"Corporate buy-in is a never-ending process," Burns said. "One of the most important things to realize is that you never reach a point where the company has 'bought in.'"

He continued, "You get enough interest to assemble the pilot — your initial test — and what we've done internally is we just communicate like crazy."

He explained communication took two basic forms:
  • Social collaboration inside Marketing to allow SAP's global group of marketers to communicate with each other. "Marketers talking to marketers" is how Burns described this interaction.

  • Weekly internal posts covering the testing queue, reviews of recent tests run by the Test Lab and some of the initial readouts from those tests.

The third key piece of Test Lab communication was more formal.

Three times a year, the group published an "insights book."

This large digital document included:
  • The number of tests run over that timeframe.

  • Visual samples from those tests.

  • Where each test was conducted on the various SAP websites.

  • The results — ideally a lift in conversion — from the tests to communicate how each test impacted the business.

Burns described the value of the insights book: "That's just standing on the shoulders of giants. Any marketer, even if you start at SAP today in a new job in a new marketing role, you can go backwards for the last two years, access those insight books and already do much better marketing online because that [resource is] available. That's a big part of change management."

He added the team believed executive sponsorship is another key to the program, and he "beats the drum internally" all of the time. If he found that some aspect of the website was static, he asked why it wasn't being tested.

If a group sought to run an in-person focus group, he would ask if the Test Lab could handle that task digitally to save time and money.

Burns also explained his ongoing role as a change manager:
The second we slow down internally, you know people will move away from a testing mindset.

It's tough. It's difficult. It's a lot of change management.

It actually forces marketers to do things differently in one major way, which instead of bringing one idea to the surface, they have to bring two.

And, in many cases, especially with creative concepting, that's twice as much work.

You can't do twice as much work for everything. So, then we start having to talk about the volume of activities we do, or the volume of content we do, maybe has to be turned down a bit so that the quality and the testing can be turned up a bit.

All of that is a really big part of change management, and it never stops.

Step #7. Use testing and optimization to continually improve marketing efforts

SAP has marketing teams spread across the globe.

The Test Lab has the capacity to run about 25 tests each quarter, and there's a queuing system in place through which SAP's marketers request specific tests. Given all that detail going into the Test Lab, you might assume the team simply finds the "winner" of each A/B split test and then moves on to the next testing area.
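The queue-and-capacity model described here can be sketched in code. This is a hypothetical illustration, not SAP's actual system; the class, the impact scores and the test names are invented for the example:

```python
# Minimal sketch of a quarterly testing queue: marketers submit test
# requests, and the lab runs the highest-impact requests up to capacity.
import heapq

QUARTERLY_CAPACITY = 25  # roughly the volume SAP's Test Lab could handle

class TestQueue:
    def __init__(self, capacity=QUARTERLY_CAPACITY):
        self.capacity = capacity
        self._heap = []  # max-heap simulated by negating scores

    def submit(self, name, expected_impact):
        """Queue a test request with an (illustrative) expected-impact score."""
        heapq.heappush(self._heap, (-expected_impact, name))

    def plan_quarter(self):
        """Pop the highest-impact requests, up to this quarter's capacity."""
        plan = []
        while self._heap and len(plan) < self.capacity:
            neg_impact, name = heapq.heappop(self._heap)
            plan.append((name, -neg_impact))
        return plan

q = TestQueue(capacity=2)
q.submit("icons vs. photography", 8)
q.submit("registration form length", 5)
q.submit("landing page color", 3)
print(q.plan_quarter())  # the two highest-impact tests make the quarter
```

Requests that miss the cut simply stay in the queue for a later quarter, which mirrors the "can't do that yet" prioritization Burns describes later in this article.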

Actually, Burns said the Test Lab looked for continual improvement and optimization.

As an example, he explained the team recently ran a number of tests on visual imagery.

The main test compared graphic icons against photography. The conventional wisdom in the Test Lab was icons would perform better than photography on webpages. The result was photography converted at a significantly higher rate.

He added just because photographs "won" the test, it didn't necessarily justify using more photography on the websites because photographs were also significantly more expensive than graphic icons to produce.

Burns also provided an example of an icon versus photography test.

Here was the set-up for the test:

Background: SAP started to introduce pictogram-type imagery, in addition to the photographic images that have been more traditionally used. With both image types in the mix, the question about what image type invoked more "response" naturally came about. The SAP testing team tested both image types against each other, head-to-head, in a PPC landing page environment.

Goal: Increase the lead form submissions rate.

Primary Research Question: Which image type performs better at driving more response in a demand generation environment — pictograms versus photographic imagery?

Approach: A/B split test

The control used all graphic images — one large image on top with text to the left of the graphic image, and three smaller graphic images directly below with text underneath each of those images.

The treatment had an essentially identical layout, except all of the images were photographs rather than graphics, and the overall design scheme of the webpage included more color — a colored background for the larger text box next to the top image.

Here is what the test uncovered:

Results: The photographic imagery drove 33.95% more lead form submissions than the control featuring the pictograms, with a confidence level of over 99%. This test was run on a France SAP PPC landing page — the SAP testing team is in the midst of testing this in other countries, with the intention of being able to uncover any differences in image type preference from region to region and at the same time, gather more samples on the core image type question.
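Confidence levels like the "over 99%" reported above are typically computed with a standard significance calculation, such as a two-proportion z-test. Here is a minimal sketch using only Python's standard library; the visitor and conversion counts are made-up numbers for illustration, not SAP's data:

```python
# Two-proportion z-test: is the treatment's conversion rate significantly
# different from the control's?
from math import sqrt, erf

def ab_significance(control_conv, control_n, treat_conv, treat_n):
    """Return (relative lift, two-sided confidence) for an A/B split test."""
    p1 = control_conv / control_n          # control conversion rate
    p2 = treat_conv / treat_n              # treatment conversion rate
    pooled = (control_conv + treat_conv) / (control_n + treat_n)
    se = sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / treat_n))
    z = (p2 - p1) / se
    confidence = erf(abs(z) / sqrt(2))     # 2 * Phi(|z|) - 1
    lift = (p2 - p1) / p1
    return lift, confidence

# Hypothetical sample: 4,000 visitors per arm
lift, conf = ab_significance(control_conv=120, control_n=4000,
                             treat_conv=160, treat_n=4000)
print(f"lift: {lift:.1%}, confidence: {conf:.1%}")
```

With these illustrative numbers, the treatment shows roughly a one-third lift, but the confidence lands below the 99% bar — a reminder that a large observed lift still needs enough samples behind it before a test can be called.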

Continued learning from the test

As this test was rolled across different geographic regions, a very interesting result became apparent.

"In China, out of nowhere, [the test] does four or five times yet even higher than our highest conversion of the standard A/B test," Burns explained. "We think we've tapped into something that we weren't even aware of. So, we've got an insight, which is great."

From there, Test Lab began further testing on using photography in SAP's China marketplace to both validate how customers in that region respond to photography, and to find out if this learning should impact other ways SAP markets to China.

Step #8. React to new marketing channels and challenges, such as mobile

Reacting to changing conditions, such as a new marketing channel, is another area where testing involves continued improvement.

"The market changes and forces us to open testing up again before we want to," Burns said. "We really did lock out registration forms. Meaning right out of the gate, we tested every possible derivative of a registration form we could imagine and came to a place when we said, 'Case closed.'"

This process created three basic registration forms at SAP:
  • What the team called "anonymous," which Burns described as a "very tiny registration form."

  • For other scenarios, a longer form.

  • For buying scenarios, the longest form on the website.

Burns said, "We know exactly where the break points are on fields. We know the layout. We know the imagery. We know the colors. We're done. Case closed."

When marketers at SAP asked for a registration form test, the Test Lab team would refuse and then offer the past knowledge to apply to the form the marketer wanted tested.

Then, there was a change in SAP's marketing world — the mobile marketing channel.

Burns said the percentage of users experiencing SAP via mobile devices was skyrocketing, increasing 200% every 90 days.

"The pace and change in the world forces us to open something that we already thought was closed because now you're right back to, 'What is my registration experience on a smartphone?,' because it's a completely different can of worms," Burns explained.

He added, "Even when we think we're done, the world changes and makes us reopen things."

Step #9. Define the resource challenge

Burns described one challenge in a testing and optimization program as "really simple because it hit us square between the eyes."

He explained:
You get the platform in place, you get the people in place, you get a process.

Now, you really think you're good to go. Right? So, your three-legged stool is locked — people, process, platform. I'm ready to go. And, then what happens? You hit the next blocking point, which is sheer resources to execute the test.

What does it mean? We work in a world of finite resources, and we've got design resources, we've got copy resources, we've got development resources that are in place.

You can imagine one Web experience at a time, whether that's emails, or paid search landing pages, or microsites or core web property websites.

We've got people and teams in place that build us out one Web experience at a time. And, all of the sudden, you come up and go, "I want to do some dramatic testing and I want you to change the layout on everything. There's going to be a full layout, new template." And, the development team comes [back] and says the amount of resources required to do that are the same as rebuilding the corporate website.

Burns said resource gaps led to the team focusing "an inordinate amount of resources" on landing pages because landing pages allowed for changes on the fly, changes that just weren't possible on the core website at that stage of the Test Lab.

Step #10. Meet the resource challenge

After learning there was a resource challenge in what could, and could not, be tested, the team dedicated resources to the Test Lab that allowed for testing regardless of the environment, including:
  • Microsite

  • Paid search

  • Email

  • Core website

The Test Lab knew to what extent it could "pull the lever" in changing things, and when those changes crossed into what Burns described as "out of bounds."

He added 90% of what the team wants to accomplish falls inside those bounds.

For the 10% that falls outside those bounds — something that needs to be tested, but is heavy lifting and exceeds the resources available in the program's quarterly cycle — the question went to a leadership team.

This team managed all of digital marketing and was composed of SAP personnel such as:
  • Head of development

  • Head of design

  • Head of content

  • Head of strategy

  • Head of search marketing

The Test Lab brought the testing scenario to the leadership team, and that team determined if the test was valuable or not.

From there, it became a matter of prioritization. Stop other tests and dedicate all resources to this really big idea? Augment resources on top of existing tests?

"It's very interesting. We have never come to a point where we literally said we just can't do that [at all] because it's too complicated. We've come to the point where we've said we just can't do that yet," Burns said. "And, that's really important because enterprise websites have pretty much continuous development and innovation schedules."

Make resources for testing part of Web infrastructure RFPs

When SAP began changing a large portion of its corporate Web infrastructure, one of the main drivers in partner selection was the ease of facilitating testing.

Burns said the ease of testing element in the request for proposal took up about 25% of the RFP.

He explained, "A quarter of our RFP was, 'This platform has to enable fluid testing without getting into a bunch of build cycles.' That's one of the reasons we selected the platform that we did."

Step #11. Testing should drive revenue

The concept seems obvious, but testing and optimization isn't helping the enterprise if all the "wins" and "learning" from testing don't translate to the bottom line.

Burns explained SAP's approach and process behind ensuring the Test Lab impacts revenue:
Marketing and sales alignment is paramount because for us, our center of gravity is sales results, full stop.

Our center of gravity as corporate marketers at SAP is hitting the revenue target and enabling us to do that as efficiently and as quickly as possible. Right?

Revenue is a quarterly gain, which is why the Test Lab has also been set up on a quarterly cycle, because every quarter, we're trying to generate more leads into the sales team, to [get them to] sell more software.

That is so important because when you come into the queue and you raise the hand and go, "I want to test 'X,'" the barometer to which you're measured against, or the acid test, is impact on revenue.

That's always the center of gravity for the Test Lab when we're making the call to test "this" versus "that" and to choose submissions.

He added SAP is organized by categories:
  • Cloud

  • Mobile

  • Analytics

  • Database applications

And, the Test Lab sought to know what solutions in each category are driving the business forward. That way, the team could go into any marketing category and understand where to focus resources.

Step #12. Allow testing to uncover "definitive insights"

Based on the importance of revenue, Burns said the KPI for the entire digital marketing team was based on their contribution to sales.

This contribution is tracked through many metrics, some "hard," like online lead generation, and others more "soft," such as:
  • Registration

  • Click-to-call

  • Click-to-action

  • Site traffic and unique visitors

  • Consumption

  • Request-a-visit

  • Login

"Now, we have a fully measurable chain which allows digital marketing to directly contribute to the sales revenue. That's the key guide for digital teams," Burns said.

In addition to the contribution-to-sales-revenue KPI, the Test Lab has what Burns called "business KPIs," sub-KPIs the team needs to hit, and he described those targets as "definitive insights."

"A definitive insight for us means we don't need to test it ever again," he said. "Now, again, that's a big statement, but that really is the spirit of it."

And, Burns qualified that statement:
Testing is an iterative process, like with the Chinese example with registration forms on mobile devices.

You know testing is iterative but you want [the team] focused on locking down what we call definitive insights that can be shared in the quarterly insights book for the rest of marketing as, "This is a function of digital that allows you to convert at a much higher level."

Our customers prefer it. We've proven it statistically and it's been deployed to all of the areas that we previously tested. To your marketing teams, that's a definitive insight. So, as you move forward with new marketing initiatives, please use these insights and do marketing based on them, because then we're all standing on the shoulders of giants.

You probably could test registration forms forever, right? Drop a field. Add a field. Drop another field. Add another field, on and on, and on it goes.

At some point, it's enough. We've probably got enough insights and enough value from that test we're not going to drive a bunch more. Move onto the next definitive insight, and that's why they're KPI'ed on that.



RESULTS


In part one of this case study last week, we shared the key metrics from Test Lab:
  • 27% lift in incremental sales leads from digital

  • 20% budget savings by avoiding less effective online tactics

For part two, we want to share more of Burns' key takeaways and learnings from Test Lab.

He said:
The number one takeaway is very simple. As hopeful and aspirational as we were, or idealistic as we were when we bought into the power of what we were setting up, we had some aspirations.

We said, "We're going to take a leap of faith." We put the platform in place. Then, we went further and put people in place, and I personally went on the line to hire people — in lieu of search marketers, email marketers, etc. — I hired Test Lab people.

We went out and put new head count in place, and then put the whole process in place for marketing.

So, we're out on a limb because we completely believed in what was possible, and we're now on year three and we still don't think we've scratched the full surface on how much value there is in this process.

There are very few things we do year-on-year that keeps driving exponentially more value, and Test Lab is one of them.

That first year the value just comes gushing out, because we've tested so little previously.

But, it never stops because the world is changing. There are new devices, new customers, new solutions, new products [and] new creative techniques. The world is ever changing and the value just keeps coming.

Burns added that through Test Lab, SAP marketers think about testing, which fosters a testing culture at the company.

His final thought reflected on the nature of finite resources and corporate culture at SAP:
We're all working with finite resources, we're all working with finite budgets.

So, the corporate culture is very interesting because we're all working with finite budgets, but yet, we're in an environment where we go, "Guess what guys, cost savings is not a dirty word, because if that forces us, as a corporation, to do smarter things, or to do better things, or to do more relevant things, or to just pare down the list to the quality versus the quantity, that's not a bad thing."

That's the corporate culture that's going on right now at SAP.

To help you learn the behind-the-scenes story of this impressive effort, Burns will be interviewed on the May 1 MarketingSherpa webinar Testing: A discussion about SAP's 27% lift in incremental sales leads. And, he'll present discoveries from SAP's Test Lab at the upcoming MarketingSherpa Optimization Summit 2013 in Boston.

Creative Samples

  1. Test control — graphic images

  2. Test treatment — photographs

Source

SAP

Related Resources

Part one of this case study — Testing and Optimization: SAP's Test Lab increases digital leads 27%, leads to 20% budget savings

Social Media Marketing: How SAP identifies and replicates successful tactics across a global company

Online Marketing: 4 sources of customer insight on your website

A/B Split Testing — How to use A/B Split Testing to Increase Conversion Rates, Challenge Assumptions and Solve Problems

Paid Search Marketing: A/B split test produces 144% increase in total leads



