August 23, 2006
If you run paid search ads, but actual purchases happen on a third-party site -- or even offline -- how on earth can you measure campaign success?
For a new phone launch this spring, Motorola's team invented a predictive scoring system to determine which search ads really work ... and which just get a lot of worthless clicks.
Includes four results that may surprise you:
"Not everything we do has a sale online," says Peter DeLegge, Motorola’s Internet Marketing Communications Manager. "Often, our sales happen through a partner, so there's a handoff."
This spring, Motorola launched a major new model, the SLVR wireless phone, supported by a significant search marketing campaign.
Paid search is very expensive, especially for big brand names like Motorola. And as most search marketers know, a search ad that gets a lot of clicks doesn't necessarily produce conversions. DeLegge's team wanted to track the ROI of the campaign even though they had no way to tell how many of those clicking consumers would convert into buyers.
DeLegge created a search engine marketing scoring model that would take Motorola's objectives and put numbers to a visitor's actions to better predict the ultimate success or failure of each particular keyword in a campaign.
-> The scoring model
“With the SLVR, we cared a lot about building awareness, deepening awareness, creating consideration for the product and creating the sale," DeLegge says. Keeping those goals in mind, his team assigned a score to each action that a customer could take on the SLVR landing page.
The "Buy" button -- the link that sends visitors to partner sites -- was assigned the highest score, "not because we believe that everybody is going to go buy it just because they clicked the link, but it was a very strong sign of purchase consideration."
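The article doesn't publish Motorola's actual point values, so the actions and weights below are invented for illustration. But the mechanics of such an engagement-scoring model -- a weight per tracked action, summed per visit -- can be sketched in a few lines:

```python
# Hypothetical scoring model. The action names and point values are
# made up; the only detail from the article is that the "Buy" click
# carries the highest weight as the strongest purchase-consideration signal.
ACTION_SCORES = {
    "view_landing_page": 1,   # basic awareness
    "view_specs": 3,          # deepening awareness
    "watch_demo": 5,          # consideration
    "click_buy": 10,          # strongest sign of purchase consideration
}

def engagement_score(actions):
    """Sum the points for every tracked action a visitor took."""
    return sum(ACTION_SCORES.get(a, 0) for a in actions)

# One visitor who viewed the page, checked the specs, and clicked Buy:
print(engagement_score(["view_landing_page", "view_specs", "click_buy"]))  # 14
```

Averaging these per-visit scores across all visitors arriving on a given keyword yields that keyword's engagement score.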
-> Coding the links
Once the scoring model was finished, DeLegge’s team coded all the links that had a score so each relevant action taken on the site could be tracked.
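The article doesn't say how the links were coded; one common approach, shown here purely as an assumption, is to append tracking parameters naming the action and the originating keyword so each click can be attributed in the logs or analytics tool:

```python
from urllib.parse import urlencode

# Hypothetical link-coding scheme: tag every scored link with the action
# name and the search keyword, so each action can be attributed to a term.
# The URL, parameter names, and keyword are illustrative only.
def code_link(base_url, action, keyword):
    params = urlencode({"act": action, "kw": keyword})
    sep = "&" if "?" in base_url else "?"
    return f"{base_url}{sep}{params}"

print(code_link("http://example.com/slvr/buy", "click_buy", "mp3 phone"))
# http://example.com/slvr/buy?act=click_buy&kw=mp3+phone
```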
They then launched the campaign with about 3,000 keywords/phrases across multiple search engines and began hunting for the terms with the highest engagement scores, using two key tactics:
o Tactic #1. Dump under-performing keywords
They ran daily and weekly reports to watch how the keywords performed, using the engagement score as the ultimate measure of success.
Sometimes, the team thought they could improve the performance of certain keywords and phrases, in which case they tweaked the phrase and/or the copy of the text ad. Next, they ran A/B split tests to see which version worked better. If the engagement score improved, they allowed the word to remain. If not, they dropped it.
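The keep-or-drop decision described above can be sketched as a simple rule; the keyword names, scores, and cutoff here are all invented for the example, since the article gives no thresholds:

```python
# Illustrative keep/drop rule for the A/B split test described above.
def review_keyword(keyword, score_original, score_tweaked, cutoff=5.0):
    """Keep the better-performing ad version for this keyword, or drop
    the keyword entirely if even the better version misses the cutoff."""
    best = max(score_original, score_tweaked)
    if best < cutoff:
        return (keyword, "drop")
    version = "tweaked" if score_tweaked > score_original else "original"
    return (keyword, f"keep {version}")

print(review_keyword("music cell phone", 2.1, 3.4))  # ('music cell phone', 'drop')
print(review_keyword("mp3 phone", 4.0, 7.2))         # ('mp3 phone', 'keep tweaked')
```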
o Tactic #2. Change focus as the campaign continued
With a product like the SLVR, early adopters were an important group to reach. The text ads at the beginning of the campaign had to be geared toward those early adopters, so the copy read something like: "Be the first to check out Motorola's exciting new phone ..."
However, as the two-month campaign progressed, the text focused less on the excitement and newness of the phone and more on its benefits. Copy was more like, "Super thin phone, 1.3 mega pixel camera.”
By the end of the campaign, the team had narrowed the group of keywords and phrases by two-thirds to just over 1,000. "There were no sacred cows,” DeLegge says. “Either it performs in hitting our objectives or we discard it."
Overall, DeLegge expected a 30% conversion rate for the campaign.
DeLegge was completely wrong … and on more than one count.
In fact, the campaign achieved an 82% conversion rate (consumers clicking through to the "Buy" link on the landing page, although they didn't necessarily buy a SLVR). "I was exceptionally pleased," DeLegge boasts.
His team also discovered that phrases they thought would be the biggest winners actually weren’t. "Marketers have a bad habit of going with gut feeling. We found a lot of things that were completely counterintuitive."
Here are four search lessons they learned:
Lesson #1. Engagement doesn’t always translate into sales.
Some phrases worked well in terms of engagement score, but the score dropped when the word "buy" was added.
(MarketingSherpa highly recommends that online marketers whose offline sales channels can't be measured use a downloadable coupon or another guerrilla tactic to better understand consumers' buying behavior.)
Lesson #2. Brand name search terms don’t always translate into sales.
Some phrases, particularly those that had the word "Motorola" or other terms very specific to the product, had clickthrough rates exceeding 30%. "They performed phenomenally from a clickthrough rate but poorly for our objective," DeLegge says. Because those words underperformed in terms of the engagement score, they were dumped.
Note: This is not the case for all marketers. You need to measure each campaign you do for yourself.
Lesson #3. You never can guess what keywords consumers will click on.
Who would have thought that “MP3 phone” would do far better than “music cell phone”? DeLegge didn’t.
Lesson #4. The social networking aspect of the Web has changed launch campaigns.
By the time the SLVR’s advertising launch happened, it was too late to reach the early adopters. "With an article like [the SLVR], people are blogging about it, all excited about it. We needed to be synchronized more with PR long before the advertising campaign kicks in."
Note: This is a lesson for anyone doing a marketing launch, whether it’s search related or not.
As you can guess, other departments at Motorola are using similar scoring models for other products.
Useful links related to this article
Creative sample of Motorola's SLVR campaign:
Peter DeLegge spoke at July's AD:TECH in Chicago. For information about upcoming shows, visit http://www.ad-tech.com/