by David Kirkpatrick, Senior Reporter
Four years ago, Shawn Burns, Vice President of Digital Marketing, SAP, was the executive sponsor tasked with creating enterprise Web analytics at SAP.
The company was running many major Web properties, but lacked a "single source of truth" on Web activity.
Before implementing enterprise Web analytics, different Internet properties used different analytic resources.
"Some are using free cloud-based systems like Google Analytics, others are using systems that are provided third party through their agencies of record," Burns explained.
He added that establishing a single source of truth is a very standard business requirement when putting Web analytics in place.
"You implement a system [with a single source of truth] and really very quickly overnight, you go from no clear visibility because of inconsistent reporting systems, to crystal clear and transparent visibility," Burns said.
He continued, "You know what your Web traffic is, where it's coming from, how many unique visitors and what they are doing on your sites."
This case study looks at how SAP created a Test Lab with a proven process and impressive results. In the steps presented, you will find both high-level strategic insight and practical examples of how SAP's Test Lab was executed.
Today, we offer part one of this extensive case study. Next week in the MarketingSherpa B2B Marketing newsletter, we'll cover how SAP continued to improve and optimize this program, examine more of the specific challenges the team faced, and drill down into the definitive insights learned so far in this testing and optimization journey.
Before creating the Test Lab, Burns said SAP had to develop a single source of truth across its international ecosystem of Web properties.
Step #1. Leverage the single source of truth and recognize the challenge it creates
Getting that single source of truth with data analytics was achieved, but it also presented a new challenge to the marketing team.
Burns explained the process, and the problem:
As a sponsor, I wanted to see the more interesting stuff, and for me and for the entire team that supported Web analytics at SAP, that always meant more efficient marketing, more productive marketing, more successful marketing, marketing that drove the business of SAP.
And, the assumption of course, is that Web analytics — [the magic] happens almost by default. Because, you know as the story goes, once everybody has transparent reporting, once everybody has access to a single source of the truth, once everybody can see what's working and what's not [working], by default we're going to start doing things.
We're going to do more of the things that work well, that are driving engagement and getting high traffic, and conversion and consumption rates, and we're going to do less of the things that are obviously and transparently irrelevant because our customers just aren't consuming them.
You're waiting for the change to happen in the way we go to market, the types of content that we offer on the website, the way that content is built and packaged.
You're waiting for the change to occur because the single source of the truth is the original business requirement, but guess what? The reports only get you so far.
Once you got your baselines in place and you know what's good and what's bad, great, but now we have to take it to the next level and do better marketing as a result.
What happens at SAP probably is not unfamiliar to what happens at a lot of places. The change in marketing didn't occur.
Now, what do I mean by that? Our implementation was wonderfully successful, great reports, great transparency, and we immediately went to a high number of users.
The problem was the high number of users at SAP. The adoption rate for data analytics was very high, but change in marketing strategy and tactics didn't happen.
Burns said, "You get into it, and you realize users begin to mean they have access to a system and they've been trained. Super user begins to mean they've been even more heavily trained. But, the ability to read reports and impact the way you do marketing is a lot like reading the newspaper and making the news. They just don't go together."
At this point, Burns identified the problem with the single source of truth with data analytics:
The challenge was, as we talked to a bunch of the marketing folks that were using the Web analytics platform, it really was an afterthought, and not in a bad way.
The reality is that marketing folks are busy, and at SAP, they all have defined jobs, and they're either making great ads or doing customer references, or hosting incredible events — on and on it goes.
If you make Web analytics everybody's job, you quickly realize it's nobody's job, because nobody sits around on Friday afternoon, at the end of the week, and runs Web analytics reports.
Step #2. Create a dedicated Web analytics team
Burns said a year and a half into enterprise-wide Web analytics, there was an "a-ha" moment.
He explained, "If we're going to get value from this platform to the extent that we think we can, it's not about reporting, it's not about single source of truth, it's not about having the system in place. It's about a process and a dedicated team around it to squeeze the value out of it."
That a-ha moment led to the creation of the Test Lab at SAP.
Step #3. Develop a three-legged stool strategy
SAP's Test Lab was created as a "three-legged stool" where all three elements were considered integral to the program's success.
Burns described the three legs:
- Platform — covered all of SAP's major Web properties across all of the languages and countries where SAP was operating
- Team — the idea went from everyone doing analytics to a dedicated team that did nothing but analytics testing in order to do better marketing
- Process — actually implementing the A/B split testing on SAP's online properties
The year is 2013 and the digital platforms are allowing us to test in real-time. Not only is the test itself driving more value, because for the percentage of your users that saw the winning version of the test they're converting higher, but you can deploy the results of the test in real-time as well.
A/B is short term until you have a winner, and then you just turn off "A" and you turn on "B," and it now becomes your de facto Web experience for everybody after the fact. So, deployment becomes instantaneous and you're getting better results. You know just from a time perspective, that changes almost everything traditional marketers know about testing.
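SAP's testing platform isn't described in code, but the mechanics Burns outlines — split incoming visitors between two variants, measure conversion, then deploy the winner to everyone — can be sketched in a few lines. This is a minimal illustration with hypothetical names, not SAP's implementation:

```python
import random

class ABTest:
    """Minimal A/B split test: assign visitors to variants, track
    conversions, then 'deploy' the winner so all later visitors see it."""

    def __init__(self, variants=("A", "B")):
        self.variants = list(variants)
        self.views = {v: 0 for v in self.variants}
        self.conversions = {v: 0 for v in self.variants}
        self.winner = None  # once set, the test is over

    def assign(self):
        if self.winner:  # test concluded: serve the winner to everybody
            return self.winner
        variant = random.choice(self.variants)
        self.views[variant] += 1
        return variant

    def record_conversion(self, variant):
        self.conversions[variant] += 1

    def conversion_rate(self, variant):
        views = self.views[variant]
        return self.conversions[variant] / views if views else 0.0

    def conclude(self):
        # A real test would check statistical significance first;
        # here we simply pick the higher-converting variant.
        self.winner = max(self.variants, key=self.conversion_rate)
        return self.winner
```

Once `conclude()` runs, `assign()` returns only the winning variant — the "turn off A, turn on B" step Burns describes as instantaneous deployment.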
Burns explained with the "three-legged stool" strategy in place, the team now had the capacity to conduct around 25 tests per quarter.
Step #4. Create a testing and optimization queue
SAP is a large company, but marketing testing and optimization with the Test Lab was a finite resource.
Creating a testing queue was paramount to making this program successful.
Why is that so important?
Because, it forces the rest of the company to queue and request to be part of the test, and it has to make a business case.
I'll give you an example. If somebody is in a queue and says, "I want to test to get people at a breakfast event in Brazil and I hope to fill up a room with 20 people that can listen to SAP for breakfast," and somebody else comes and says, "I want to test my new shopping cart for the Web analytics — the shrink-wrap software we sell over e-commerce — and I'm hoping this can generate an incremental $2 million in revenue."
You can immediately see where I'm going with our queuing process. We're going to put testing resources behind revenue generation e-commerce as opposed to the breakfast event that hopes to get 20 people to attend a physical event.
To further explain the process, Burns said everyone at SAP "goes through the door, raises their hand — any country in the world — and says, 'I want to conduct a test.'"
From there, there's a period of business validation and the team picks out the 25 tests that are going to have what he described as "the biggest bang for the buck in that [business] quarter."
Burns said, "[Those are] the ones that we actually put resources behind."
He added an executive steering group was put into place on the assumption there might be controversy about the tests that were chosen as part of the group of 25, but that group has never come into play.
Burns said, "It's never been an issue."
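The prioritization Burns describes — rank every request by its business case and fund only the top 25 per quarter — amounts to a simple selection over projected value. A hedged sketch, with invented request names and dollar figures echoing his Brazil-breakfast versus shopping-cart example:

```python
import heapq

def select_tests(requests, capacity=25):
    """Keep the `capacity` test requests with the highest projected
    business value for the quarter (illustrative only)."""
    return heapq.nlargest(capacity, requests,
                          key=lambda r: r["projected_value"])

# Hypothetical queue entries
requests = [
    {"name": "Brazil breakfast event signup", "projected_value": 5_000},
    {"name": "E-commerce shopping cart test", "projected_value": 2_000_000},
]

chosen = select_tests(requests, capacity=1)
```

With a capacity of one, the revenue-generating shopping cart test wins the slot, which is exactly the outcome the queuing process is designed to force.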
Step #5. Retain the learnings from previous tests
Once the Test Lab had been up and running for three years, with an additional year of pure Web analytics in place predating the Test Lab, the team acquired a great deal of insights from previous tests.
"I can tell you eight times out of ten, we already know the answer to your question," Burns said.
He continued, "We don't need to test the number of fields in a registration form for an event because we've done that test so many times it's a locked insight and we know exactly what drives value."
Burns explained team members would come to the testing queue with a question, and because the Test Lab conducts tests so frequently, that team member can often be provided immediately with an actionable result from previous related tests.
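The "locked insight" workflow — answer from prior tests when possible, queue a new test otherwise — is essentially a lookup before a fallback. A hypothetical sketch (the insight text and keys are invented, not SAP's actual findings):

```python
# Hypothetical repository of "locked insights" from previous tests
LOCKED_INSIGHTS = {
    "registration_form_fields":
        "Already tested repeatedly; we know what field count drives value.",
}

def handle_request(question_key):
    """Return a prior answer if one exists; otherwise queue a new test."""
    if question_key in LOCKED_INSIGHTS:
        return ("answered", LOCKED_INSIGHTS[question_key])
    return ("queued", None)
```

Eight times out of ten, per Burns, the first branch fires and no test slot is consumed.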
In part two next week, we'll provide Burns' full insights and takeaways from establishing Test Lab at SAP.
Here are the two key metrics on this marketing effort.
"We track the overall ROI of the SAP Test Lab quite closely, and on average, the tests result in a 27% lift in incremental sales leads from digital. Test Lab findings have also resulted in a digital marketing budget savings of 20% because we are able to avoid less effective online tactics. Continual ROI like that is what allows the 'test culture' to continually grow across all of SAP marketing," Burns explained.
The number one takeaway is very simple. To help you learn the behind-the-scenes story of this impressive effort, Burns will be interviewed on the May 1 MarketingSherpa webinar Testing: A discussion about SAP's 27% lift in incremental sales leads. And, he'll present discoveries from SAP's Test Lab at the upcoming MarketingSherpa Optimization Summit 2013 in Boston.
As hopeful and aspirational as we were, or idealistic as we were when we bought into the power of what we were setting up … OK, so we had some aspirations.
We said, we're going to take a leap of faith. We put the platform in place. Then, we went further and put people in place, and I personally went on the line to hire people in lieu of search marketers, email marketers, etc.
I hired test lab people.
We went out and put a new head count in place, and then put the whole process in place for marketing. So, we're out on a limb because we completely believed in what was possible, and we're now on year three and we still don't think we've scratched the full surface on how much value there is in this process.
Part two of this case study — Testing and Optimization: SAP's Test Lab increases digital leads 27% [Part II]

Related resources:
- Social Media Marketing: How SAP identifies and replicates successful tactics across a global company
- How SAP Took Local Approach to Global Level for SMB Lead Generation
- Marketing Metrics: Do your analytics capture the real reasons customers buy from you?
- Online Marketing: 4 sources of customer insight on your website
- A/B Split Testing — How to use A/B Split Testing to Increase Conversion Rates, Challenge Assumptions and Solve Problems
- Paid Search Marketing: A/B split test produces 144% increase in total leads