by David Kirkpatrick

CHALLENGE
Testing and optimization is a powerful marketing tool that can help you uncover and implement winning tactics and strategies. Even tests that don't find a "winner" provide valuable results that can be used to inform and refine further testing.
Effective testing can be performed on a wide variety of marketing pieces, from simple pay-per-click ads to single-purpose landing pages, on up to webpages with multiple page elements and visitor goals.
Even though testing and optimization can be useful, there are barriers to successfully implementing the program. Simply getting approval and the budget to begin testing can sometimes be a challenge, and once you begin testing there are potential roadblocks along the way.
Active Network, a technology and media company specializing in online registration and event management software, began an extensive testing project on the homepage of RegOnline, one of the company's stand-alone brands.
"It was a pure lead play," said Lauren Guinn, Director Online Marketing, Active Network, about the testing program. "We were not looking for small changes in conversion rates. We were looking for something that could really just shift the dynamic of the business."
The article looks at how this testing and optimization program was implemented at Active Network, and the barriers the marketing team faced at different stages. Find out how the team was able to secure executive buy-in, overcome discouragement, and even learn what metrics they needed to track through this effort.

CAMPAIGN
The RegOnline homepage was chosen for testing because that channel drove 90% of RegOnline's business. The page hosts a free trial sign-up for the product, and the average buying cycle for the brand is 45 days.

Obstacle #1. Get executive approval for the testing and optimization program
Robin Jones, Vice President Marketing, Active Network, explained, "The biggest obstacle, I believe, to get the process started was executive buy-in for the investment itself."
She said Active Network is a very data-driven organization that makes investments based on expected returns. The marketing team took a "really hard look" at the RegOnline property and identified it as a real opportunity for improvement through testing.
The team then asked, "How do we quantify that? How do we present it?"
Because every business area is looking for dollars in each budget cycle, Marketing presented the program based on the expected return on the investment. The pitch was successful, and late last year, the company began the testing and optimization process on the RegOnline homepage.
If you are looking for executive approval for a testing program, one approach is to create a document that "sells the test" and provides information such as:
o The problem to be solved or opportunity to be exploited
o The hypothesized solution to that problem or opportunity
o The plan of execution, with details including traffic and time estimates
This document also helps you refine the testing plan to account for risk, for limitations and for all the moving parts testing entails.
As Active Network found, often the best way to get approval for a testing program is to calculate all of these elements and go to the C-suite with, "Here is the potential for this testing plan."

Obstacle #2. Build a testing culture
Without buy-in at the top, a testing program may never even get started. Without buy-in at all levels within the company, a testing program may never succeed, and certainly will not reach its full potential. The way to get that total buy-in is to create a culture of testing and optimization.
"Fortunately, for us as a company, we had put a lot of cycles for optimization into a particular piece of our business in our history, so we weren't completely new to this process," stated Jones. "But, I think one of the biggest challenges is to get everybody invested in the process."
She added the importance of a testing culture went into the decision-making process that led to testing on the RegOnline brand.
"There was a part of our business where there was certainly a need and definitely some interest," Jones said. "But I knew that they did not have the support within the team to actually do what was required to move the process forward."
She understood the importance of turning optimization cycles through repeated testing and reiteration. This particular business area didn't have the infrastructure and overall team buy-in, so Jones, as she put it, "did not go there."
Guinn added this buy-in involved more than just the marketing team. She said cross-team support was a vital element to the process and involved design and development, sales, and business analyst buy-in.
Guinn said the way to achieve total buy-in was through educating and empowering -- get the entire team on board through understanding.
"Look at the analytics," Guinn stated. Show the team the change that testing created.
She said tell the team, "Not only can you see that the conversion rate is to this [level], but look at what that means to the business, and you are part of that."
Get your team excited about driving business results, and make sure they have access to the data so they can see what is happening during the testing program.
"There are lots of opportunities to get people involved," Guinn added, "but if you don't, it can become an unbelievable challenge to get something done."

Obstacle #3. Combat discouragement
Testing and optimization is a process. Not every test produces a "winning" result, but every well-designed test does produce a learning that improves the overall process. Expectations are often high when implementing a testing program. Early in the process, discouragement can set in when tests start to pile up with lots of learning but not so much winning.
Discouragement across the team can affect support and buy-in. Discouragement at the executive level can mean a budget cut, or even a total program shut-down.
Active Network faced this very problem. The first three tests on the RegOnline homepage created learnings, but no "winning" results. Tests #1 and #3 found no statistically significant difference between control and treatment, although in the first test the control marginally outperformed the tested treatments.
Test #2 did produce statistically significant results. Unfortunately, the control outperformed the tested treatment by 24.9% in clickthrough rate and 51.9% in conversion rate.
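Statistical significance in a split test like test #2 is typically checked with a two-proportion z-test on the control and treatment conversion counts. A minimal sketch follows; the visitor and conversion counts are hypothetical, since the article does not report sample sizes:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an A/B split.

    conv_*: number of conversions; n_*: number of visitors.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: control converts 300 of 5,000 visitors,
# treatment 240 of 5,000 -- the control "wins" and the gap is significant
z, p = two_proportion_z_test(300, 5000, 240, 5000)
print(z, p)
```

A p-value below the chosen threshold (commonly 0.05) is what separates a real difference, like test #2's, from the statistical wash seen in tests #1 and #3.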
The goal of this test was to increase qualified new accounts, and the tested element was a two-step registration (conversion) process compared to the three-step process on the control. The key performance indicators were start rate and conversion rate.
The result of the test found an important learning: RegOnline visitors experienced a "sticker shock" when faced with more form fields up front on the two-step registration process.
However, because this "losing" test was part of three initial tests that were either a wash or found a negative impact, "definite discouragement," as the phrase being tossed around Active Network put it, began to set in.
Jones was the functional leader for the testing program, but a general manager at Active Network was heavily engaged in the process as well. Jones said he "ran in different circles than I did," and wasn't involved in all the meetings that covered the learnings of the "losing" tests, but he did get the results of those tests.
So, when asked how the program was going by top-level management, his response was negative because he was receiving only the results without understanding the program’s big picture.
Jones worked to get the GM back on board with the entire program and explained how his informal internal reporting was creating discouragement both with the testing team, and more importantly with top management.
To help turn the effort around and ease some of the growing discouragement, the testing team decided to change tactics and begin testing on a new channel, an SEO landing page that was receiving both paid and organic search engine traffic.
The idea was to test in a more controlled environment because homepages have multiple variables and elements, such as navigation, information choices, etc., that can detract from the goal of the test (driving registrations in this case).
The SEO landing page offered more focused content, higher visitor motivation and a clearer appeal to work with for testing.
The first SEO landing page test was very successful (see the final obstacle for details on this test), and created learnings that could be taken back to the homepage tests.
As a result, test #4 on the homepage did create a "win" in clickthrough rate with a 90% increase. This test did not produce a winning result in conversion rate, but the positive clickthrough result combined with the strong SEO landing page test result helped ease the discouragement at Active Network about the optimization program and provided some light at the end of the tunnel.

Obstacle #4. Use the correct metrics
A barrier to testing success can be identifying the incorrect KPIs for business success. Interestingly enough, this is an area where testing can actually solve the problem of which key performance indicators to track.
When the wrong KPIs are in place, you may conduct tests on your chosen metrics and get win after win, but at the end of the day find out all those wins don't translate to increased revenue.
When you get this result, your tests can help uncover the metrics that actually impact your bottom line. Testing and optimization can not only improve individual marketing campaigns, but also improve the way you measure and analyze business success. It is much easier to improve the bottom line when you are tracking KPIs with the most impact on your company's success.
Guinn said, "Our original KPI going into testing on RegOnline was a sales-accepted lead. We did a great job, opened up the fire hose [and drove] 37% more leads."
A great result with one significant problem: over that same time period, transacting accounts were down year-over-year. Marketing was getting many more leads into the pipeline, but those leads were not actually producing income for Active Network.
The team went back and conducted additional discovery to determine what metrics to track to turn all those leads into paying customers.
Sales-accepted lead is still the primary metric for the testing campaign, but Marketing is also monitoring several other KPIs:
o The non-response rate -- leads that come in that Sales cannot contact in any way
o Conversion rate to booking (transacting) account
o Time frame of lead to booking account
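The supplemental KPIs above can be computed from a simple lead log. As a sketch, with entirely hypothetical field names and records:

```python
from datetime import date

# Hypothetical lead records: whether Sales could contact the lead,
# whether it became a booking (transacting) account, and the relevant dates
leads = [
    {"contacted": True,  "booked": True,  "lead_date": date(2011, 1, 3),  "booked_date": date(2011, 2, 14)},
    {"contacted": True,  "booked": False, "lead_date": date(2011, 1, 5),  "booked_date": None},
    {"contacted": False, "booked": False, "lead_date": date(2011, 1, 9),  "booked_date": None},
    {"contacted": True,  "booked": True,  "lead_date": date(2011, 1, 12), "booked_date": date(2011, 2, 1)},
]

# Non-response rate: leads that come in that Sales cannot contact in any way
non_response_rate = sum(not l["contacted"] for l in leads) / len(leads)

# Conversion rate from lead to booking (transacting) account
booking_rate = sum(l["booked"] for l in leads) / len(leads)

# Average time frame of lead to booking account, in days
booked = [l for l in leads if l["booked"]]
avg_days_to_booking = sum((l["booked_date"] - l["lead_date"]).days for l in booked) / len(booked)

print(non_response_rate, booking_rate, avg_days_to_booking)  # 0.25 0.5 31.0
```

Tracking these alongside the primary sales-accepted-lead metric is what lets a team catch the "more leads, fewer transacting accounts" pattern described above.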
Guinn stated, "One challenge is really identifying the right KPIs for your business. What are you going to be measuring?"
She added you need to continually reassess your metrics and make sure they provide relevant information about your efforts.

Obstacle #5. Overcome testing risks
Testing does present risks, particularly when testing a marketing channel that produces a high percentage of revenue, like the RegOnline homepage, which drives about 90% of the brand's total income.
When split testing with a control and treatment, and the treatment consistently loses to the control as in the first three tests on the RegOnline homepage, Web visitors sent to the treatment represent actual lost revenue. This can lead to obstacle #3 above -- discouragement. And more than just discouragement, it also creates real, and justifiable, concern that the testing program is creating a drain on the business.
The answer is to conduct testing on less "mission critical" areas within the campaign -- dedicated landing pages in Active Network's case -- to ease some of the lost revenue pressure and to gain learning and insight that can be taken back to the key homepage tests and inform new treatments that will hopefully beat the control.
"Because there is so much business risk testing on our homepage," Guinn said, "we shifted to smaller tests on other channels and then applied these learnings to big tests on our homepage."
The smaller channels were SEO and PPC landing pages, PPC ad copy and email messaging.
The first smaller channel test was on an SEO landing page and pitted the control against one treatment. The goal of the test was to find the combined effect of page presentation elements, and for the test the treatment differed from the control by removing navigation from the top of the page, adding key links to the bottom of the page, and removing multiple calls-to-action in favor of a single CTA.
The result of this test was impressive, and was the first "win" of the testing program. The treatment beat the control in start rate by 1,312% and conversion rate by 548%.
This success gave the team valuable visitor information, and the confidence to go back and begin testing on the RegOnline homepage, leading to the winning fourth homepage test described in obstacle #3.

RESULTS
Jones and Guinn both offered takeaways from the testing effort.
Jones said commitment was key: "There have been some dark moments in the process, but we got past them, and it is that commitment to the process that, I think, is the most critical."
Guinn added that patience is indeed a virtue in testing and optimization. She said, "Don't be impatient. You have to understand that even a failure is a learning, but that every failure has something inherent that you can learn and apply to the next test."
The testing and optimization cycle at Active Network is ongoing, but to date, tests performed on the RegOnline brand include:
o 8 tests on the homepage
o 4 tests on SEO and PPC landing pages
o 1 test on PPC ad copy
o 1 test on email messaging
And, the cumulative results of this effort are:
o 141% increase in new accounts in the homepage channel
o 638% increase in new accounts in the SEO/PPC landing pages channel
Guinn offered her final thoughts on testing: "The beauty of optimization is that you can get some learnings in one place and then start applying them to a lot of the different online strategies that you are using to drive leads to your business."

Useful links related to this article
1. Homepage test #2 - control
2. Homepage test #2 - treatment
3. Landing page test #1 - control
4. Landing page test #1 - treatment
5. Active Network
6. RegOnline
7. Website Testing: Research and testing leads to new IBM Software Group homepage, 23% increase in demos
8. Website Testing: IBM's navigation elements test leads to 128% increase in clickthroughs
9. Page Tests Lift Site Registrations and Conversions: 3 Examples that Stopped Site-Design Bickering
10. How to Plan Landing Page Tests: 6 Steps to Guide Your Process
11. Landing Page Optimization: Is it actually possible to optimize a landing page?
12. Test Plan: Build better marketing tests with the Metrics Pyramid