by David Kirkpatrick, Reporter

CHALLENGE
Sometimes, marketing can seem more art than science. After all, unlike in a discipline such as biology, subjective opinion plays a much bigger role than objective knowledge. Ask two biologists what type of chimpanzee you're looking at in the zoo, and you'll likely get one answer, complete with genus and species. Ask two marketers for the best email subject line for a specific marketing send, and you may get three answers.
But with testing and optimization, marketers can have an objective, quantitative answer to the question, "What really works for my audience right now?"
Testing allows marketers to learn what subject line, what landing page layout, or what call-to-action text will lead to improved results.
Testing can be done in a vacuum, where a single A/B split test is run, a result is found, and the marketer takes the winning result forward in the campaign. However, marketing testing is much more effective when it is part of a complete testing-and-optimization cycle. Learning what works should lead to new ideas to test against the previous winners. Continuing this cycle leads to fully optimizing the marketing effort.
Intuit is one company that seeks to improve its marketing through this virtuous circle. Its flagship product, QuickBooks, is a business and financial management solution for small- and mid-sized businesses, financial institutions and accountants. Testing and optimization is a continual process at Intuit.
This testing includes:
o Pay-per-click ads
o Banner ads
o Shopping cart
o Landing pages
Sunil Kaki, Senior Marketing Manager for QuickBooks, explained why Intuit's testing is ongoing:
"I found that even [testing] winners have a lifecycle of being impactful. I think they are good for maybe a quarter, but not more than that because the winning results decay over time. We want to have a fresh winner almost every three months or so."
Since this testing cycle was happening in real-time on a key Intuit landing page, outside elements created a "battlefield testing" atmosphere. Ideally, testing happens in a controlled environment where one variable can be changed while the rest remain static.
In real-world testing on active landing pages, many variables change. For example, in these tests, discounts were applied at the corporate level. When time is limited, microtests are added to the test plan (adding social elements in this case) for learning within the overall test. Adding to this battlefield atmosphere was the fact that the QuickBooks product changed from a 2010 version to a 2011 version in the middle of testing.

CAMPAIGN
One testing and optimization cycle conducted late last year involved a landing page that had not changed for some time, yet received about 250,000 unique visits each month and drove significant revenue selling QuickBooks Desktop products.
Kaki said, "The page we had was [very] old. We had it up there for about a year and we hadn't tested on it. Since it generates such big revenue, we knew that there was an opportunity to optimize and gain some incremental revenue out of the page."
The complete cycle of four tests looked at various aspects of the landing page, including ideas around social media, and comparing how the desktop version of the software coexisted with the software-as-a-service version on the same page.
Kaki pointed out that even though Intuit began seeing results from the first test, the control was kept consistent throughout the entire cycle. He explained how this allowed Intuit to compare each test with the others, and helped mitigate the uncontrollable variable of website traffic patterns over the course of the testing.
He said traffic to the tested landing page was affected by Intuit promotions, outside events happening at the industry level, and retail partners running promotions that drove website traffic.
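The article does not describe the mechanics of Intuit's test split, but keeping the control consistent across a cycle like this typically relies on deterministic, hash-based bucketing so that a returning visitor always sees the same variant. A minimal sketch under that assumption (the visitor IDs and variant names are invented):

```python
import hashlib

# Hypothetical variant assignment: hash the visitor ID so the same visitor
# lands in the same bucket on every visit. This is a common A/B-split
# technique, not Intuit's documented implementation.
def assign_variant(visitor_id: str, variants: list) -> str:
    """Map a visitor to one variant, deterministically."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

variants = ["control", "T1", "T2", "T3"]
# The same visitor ID always returns the same variant.
print(assign_variant("visitor-42", variants))
```

Deterministic assignment matters here because the cycle ran for months: a cookie-free, hash-based split keeps each visitor's experience stable without server-side state.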
QuickBooks is sold in several different versions. These tests involved Pro, Premier and Mac -- Premier being the most expensive. This testing cycle focused on marketing Pro without negatively affecting the other versions, particularly the upsell to Premier.
First we’ll provide all of the metrics from this specific series of tests, and then look at what Kaki and his team learned from the tests (there are links to creative samples of all the treatments in the useful links section at the bottom of this article).
The overall objective for the entire testing and optimization cycle was to determine the ideal landing page to increase revenue per visitor through increased conversion rate and/or increased revenue per order.

Test #1. In search of an effective "broad stroke" approach

What was tested
- For Test #1, the goal was to focus on product and information presentation to help determine the ideal layout
- Three different layout treatments, along with the control, were used to determine how much information was needed for a user to make a decision to purchase
o T1 - Short form with bullets and a dropdown selection menu
o T2 - Long form FAQs
o T3 - Product matrix comparing Pro, Premier and Mac visually
- Product: 2010 (note: the software version changes to 2011 in later tests)
- Ad group Targeting: No (note: ad group targeting dynamically changes landing page layout, such as a headline, based on keyword searches that reach the page)
- 20% Discount: Yes (note: this is an element imposed on QuickBooks' marketing team by Intuit corporate marketing and was an uncontrollable variable in all tests)
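The ad group targeting noted above -- dynamically changing landing page elements such as the headline based on the keyword search that reached the page -- reduces to a keyword-to-copy lookup. A hypothetical sketch; the keyword groups and headline copy below are invented for illustration:

```python
# Hypothetical ad group targeting: serve a headline matched to the search
# keyword that brought the visitor to the page. Keywords and copy are
# invented; the article does not publish Intuit's actual mappings.
HEADLINES = {
    "quickbooks pro": "QuickBooks Pro: Simple Accounting for Small Business",
    "quickbooks premier": "QuickBooks Premier: Industry-Specific Tools",
    "quickbooks mac": "QuickBooks for Mac: Accounting Built for Your Mac",
}
DEFAULT_HEADLINE = "QuickBooks: Choose the Version That Fits Your Business"

def headline_for(search_keyword: str) -> str:
    """Return the landing page headline for a visitor's search keyword."""
    return HEADLINES.get(search_keyword.strip().lower(), DEFAULT_HEADLINE)

print(headline_for("QuickBooks Premier"))  # QuickBooks Premier: Industry-Specific Tools
```

In practice the lookup key would come from the ad platform's referral parameters rather than a raw string, but the targeting logic is the same table-driven idea.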
Kaki stated the main goal of the first test was to find out if there were "broad strokes" that could be taken to improve the page, and uncover layout concepts that worked.
Each treatment tested different individual ideas, but the big picture goal was to find an overall winning version of the landing page and begin to fine-tune layout and copy changes in subsequent tests.
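Revenue per visitor (RPV), the deciding metric for the cycle, and the lift percentages it produces reduce to simple arithmetic. A sketch with invented visitor and revenue figures (the article reports only the resulting lift percentages, not the underlying traffic or revenue):

```python
# Revenue per visitor (RPV) and percentage lift versus control.
# The dollar and visitor figures below are invented for illustration.
def revenue_per_visitor(revenue: float, visitors: int) -> float:
    """Total revenue divided by unique visitors."""
    return revenue / visitors

def lift_vs_control(rpv_treatment: float, rpv_control: float) -> float:
    """Percentage difference of a treatment's RPV versus the control's."""
    return (rpv_treatment - rpv_control) / rpv_control * 100

control_rpv = revenue_per_visitor(50_000.0, 10_000)  # $5.00 per visitor
t1_rpv = revenue_per_visitor(53_620.0, 10_000)       # $5.362 per visitor
print(f"{lift_vs_control(t1_rpv, control_rpv):+.2f}%")  # +7.24%
```

RPV is useful as the deciding metric because it captures both levers named in the objective: a treatment wins by converting more visitors, by raising the average order value, or both.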
Winning pages were determined by revenue per visitor.

Results for Test #1
For this testing and optimization cycle the control remained the same landing page. All results are compared to the control.

Treatment 1
+7.24% difference (versus control)

Treatment 2

-5.26% difference (versus control)

Treatment 3

+5.26% difference (versus control)

Test #2. Adding a social element

What was tested
- Although the short form Treatment 1 "won" the first test, the goal for Test #2 was to build on the conversion performance of the FAQs treatment by increasing overall revenue through a more visible option to purchase the higher-priced "Premier" product
- Treatment 2 added radio buttons as an alternate method for selecting a product
- Treatment 3, a tabbed layout, was included in the test as another variation on layout
- In addition, a Facebook "Like" button was added to determine its impact, if any, on the landing pages
- Product: 2010
- Ad group targeting: No
- 20% Discount: Yes
This test again focused on layout, but added further elements, such as the "Like" button. Part of battlefield testing was maximizing learnings and increasing revenue with somewhat drastic changes. In this case, the team decided the social elements were unlikely to dramatically influence the overall goal of the test, and the learnings from adding them were worth the tradeoff.
The second test found the "big win" for the total testing and optimization cycle. Kaki said adding the social element was very interesting because making social media part of the marketing effort was new for Intuit.
He added that the actual social usage was not particularly high, with fewer than 100 website visitors. But, testing found that those visitors re-engaged with Intuit after the initial visit to the landing page.
"We got very key learnings from that to incorporate in future pages," Kaki stated. "The key takeaway is the (social media) concept is good, but maybe the implementation could be different."
Trying out and testing different implementations of social media on the landing page is in Intuit's future plans.

Results for Test #2

Treatment 1

+22.52% difference (versus control)

Treatment 2

+13.73% difference (versus control)

Treatment 3

-6.05% difference (versus control)
Both Treatment 1 and Treatment 2 outperformed the control, but Treatment 1 turned out to be the overall winning page in this testing and optimization cycle. Both treatments involved simplifying page objectives and providing the visitor with enough information to make a purchase decision.

Test #3. Refining the best performing landing page treatments

What was tested
- Test #3 involved testing the top-performing treatments from Test #2, but updating the box shots because the product version changed from 2010 to 2011 between Test #2 and Test #3
- The test also included a Product Tour treatment
- Product: 2011
- Ad group targeting: No
- 20% Discount: No
Test #3 included only two treatments, refining the two high-performing pages from Test #2. It also produced the weakest "winning" result of the entire cycle. One possible explanation is that this was the first test that did not include the 20% product discount. Once again, battlefield testing came into play, with two changes for this test -- the product version changed, and Intuit, at the corporate level, ended the 20% discount.

Results for Test #3

Treatment 1
+2.02% difference (versus control)

Treatment 2

-7.80% difference (versus control)

Test #4. Refining the user experience with keyword targeting and OS detection

What was tested
- Test #4 sought to target the user experience on the landing page by taking a deeper look at keyword targeting and operating system (OS) detection
- In addition, a treatment similar to the original "Choose Your Version" layout was tested with QuickBooks Online and QuickBooks Premier as secondary objectives on the page. QuickBooks Online was a free trial not included in the product mix, but was included next to QuickBooks Premier at the bottom of the page
- Product: 2011
- Ad group targeting: Yes (Treatment 1 (T1), Treatment 3 (T3))
- OS Targeting: Yes (T1, T3) (note: OS targeting looks at the visitor's operating system and dynamically reacts to that information. For example, a visitor to Intuit's landing page using a Mac OS would be served with a default QuickBooks Mac experience.)
- 20% Discount: No
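The OS targeting described in the note above -- detecting a Mac visitor and defaulting them to the QuickBooks Mac experience -- is typically done by inspecting the browser's User-Agent header. A hypothetical sketch; the header patterns and fallback behavior are assumptions, not Intuit's documented implementation:

```python
# Hypothetical OS detection: route Mac visitors to the Mac product
# experience by default, everyone else to Pro. User-Agent substrings
# are a common but imperfect detection heuristic.
def default_product(user_agent: str) -> str:
    """Pick the default product experience from the User-Agent header."""
    ua = user_agent.lower()
    if "macintosh" in ua or "mac os x" in ua:
        return "QuickBooks for Mac"
    return "QuickBooks Pro"  # Windows and unrecognized visitors see Pro

mac_ua = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) Safari/533.21.1"
print(default_product(mac_ua))  # QuickBooks for Mac
```

On a real page this check would run server-side (or via a tag manager) before the page renders, so the visitor never sees the wrong default flash by.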
The final round of testing and optimization included the control and five treatments, with the fifth treatment added halfway through the test; it brought back the original first treatment from the first three tests by removing two elements that had been added to that page for the final test. This test also included a social element in the form of a tool where visitors could ask Facebook friends what they thought about QuickBooks.
Similar to Test #3, this test refined earlier results by looking for incremental increases in revenue-per-visitor. This test also did not include the 20% product discount.

Results for Test #4

Treatment 1
-5.05% difference (versus control)

Treatment 2

-10.44% difference (versus control)

Treatment 3

-8.75% difference (versus control)

Treatment 4

-7.41% difference (versus control)

Treatment 5

+3.37% difference (versus control)

RESULTS
Kaki said one major result Intuit found in the testing and optimization cycle was, "how big of an influence a landing page could have for our product mix. This was very critical, because our products are priced differently. If we are able to influence a higher-priced product in a positive direction and take away units from our lower (priced) products, thinking about it in that way led to a lot more learning rather than just thinking at a very high level of revenue-per-visitor."
He added that Intuit is looking to implement this learning with other company products, and that this testing cycle was just a first step to see if it could produce a lift in revenue-per-visitor.
- Treatment 1, with minimal changes, was the only treatment included in each test along with the control.
Because the control did not change, there were no incremental increases in winning results, but Treatment 1 in Test #2 achieved the best result of the entire cycle with a 22.52% increase over control in revenue-per-visitor.
- Overall, treatments performed better than the control with two types of strategies:
o Simplifying the page objectives (less navigation and buttons, one primary call-to-action)
o More effectively promoting product details and guarantee
- It’s imperative to determine the appropriate amount of information to present to the visitors:
o Treatments that overwhelmed the visitor with information (FAQ, tabs) did not perform as well as hypothesized
o Treatment 1 -- the winning treatment throughout the testing cycle -- only included a brief intro paragraph
- Both attempts to increase sales of non-Pro products (Mac and Premier) through ad group and OS targeting failed to yield an increase in revenue overall or for those products. For example, visitors who arrived at the page through a "QuickBooks Premier" search term were shown a box shot of the Premier product instead of Pro, but this did not result in an increase in Premier revenue over the control.

Useful links related to this article
2. Test #1 Treatment 1
3. Test #1 Treatment 2
4. Test #1 Treatment 3
5. Test #2 Treatment 1
6. Test #2 Treatment 2
7. Test #2 Treatment 3
8. Test #3 Treatment 1 and 2
9. Test #4 Treatment 1
10. Test #4 Treatment 2
11. Test #4 Treatment 3
12. Test #4 Treatment 4
13. Test #4 Treatment 5

Intuit

MECLABS
- A science lab that conducts R&D in sales and marketing (and parent company of MarketingSherpa)
Members Library -- Page Tests Lift Site Registrations and Conversions: 3 Examples that Stopped Site-Design Bickering
Members Library -- How to Plan Landing Page Tests: 6 Steps to Guide Your Process

Landing Page Design: Eye path vs. Thought sequence

Landing Page Optimization: Minimizing bounce rate with clarity

Optimization Summit 2011