June 01, 2004
Case Study

Nissan Tests TV Commercial Creative with Both Online & Offline Study Panels -- Will Results Match?

SUMMARY: Wouldn't it be lovely if you could use the efficiencies of the Internet to test your TV commercials before they run on-air?
Online research is far quicker and easier than traditional research studies... but are the results comparable?

Nissan Canada just ran duplicate research studies -- one with a traditional offline panel and one with a new-style online panel -- to find out. And, we've got the results for you.

Includes lots of details and screenshots of exactly how the online panel worked. (Getting the virtual host right was the hardest part):
CHALLENGE
"We receive a lot of television commercials from the US in order to save on costs," explains Jeannie Lam, Brand Analyst for Nissan Canada.

However, the Canadian advertising team don't assume that a TV ad that tested well in the US will work equally well in Canada. So, they test most new ads prior to investing in media buys.

And it's a good thing they do, because results show about 20% of Nissan's US ads fail to resonate with the Canadian marketplace.

However, timing can be tough. A traditional creative test takes at least two to three weeks to conduct. You have to cold call hundreds or even thousands of consumers to find enough who not only fit the right profile, but also agree to drive in and spend an hour at a research facility viewing commercials.

Generally the facility only takes 10 consumers at a time, so to get full results from 100 consumers, it could take two weeks.

In the meantime, the media buying department is impatiently waiting to hear if they should buy time for the commercial. And the Canadian creative department is also on hold, wondering if they'll have to create an entirely new ad to fill the slot.

Lam yearned to use the Internet as a TV creative research platform because she figured it could save weeks of time to generate responses. But, the big question was, would the results be the same as if the research were conducted offline?

Can you test creative intended for one media via a different media, and get reliable results?

CAMPAIGN
This April Nissan US sent the Canadian advertising team a new commercial featuring the Murano SUV in New Orleans. "I didn't think that one would jibe very well in Canada," says Lam.

The Canadian team decided this was a great chance to test an ad via an online research panel, and to run a control group using traditional means at the same time. They figured the key to winding up with results as similar as possible was in making the user experience as similar as possible.

The traditionally run panel members were recruited via phone. These members drove to a central Toronto location, where they were seated in a room with nine other consumers along with a host. Each member could view the commercial full-size on their own monitor and was asked to record their reactions using a dial. A host in the room with them gave scripted instructions and answered questions.

At the same time, the Nissan team had an email invitation sent to a Canada-wide panel of more than 100,000 consumers who'd proactively double opted in to participate in research. Respondents had to meet the same qualifications as the offline panel -- plus one more hurdle: they had to be on a broadband connection so they could view the commercial properly.

(According to Millward Brown, just under 70% of Canadians are on broadband. According to the Pew Internet and American Life Project, 25% of all US adults have broadband at home; and among college-educated adults age 35 and younger, penetration has reached 52%.)

Qualified consumers clicked to a Flash-based microsite that was carefully designed to mimic the offline research experience as much as possible, and to be highly usable. For example:

o The microsite fit on a typical computer monitor set at 800x600 resolution without scrolling.

o The TV commercial itself played in a box in the middle of the screen at a size (352x240 pixels) large enough to view clearly.

o There were no confusing or distracting external links that might lead you away from the microsite. Everything was focused on the research process.

o An indicator at the lower left corner showed the consumer what percent of the research they'd done so far. So they always knew how much more there was to go.

o A giant orange "Continue" button dominated the lower right corner of the screen, plus it lit up when it was time to move on to a new section.

o A "mood bar" ran horizontally under the TV commercial. This was meant to replicate the dial consumers use in offline panels. To view the commercial, a consumer had to click the mood bar and keep it clicked. If they took their finger off the clicker, the commercial would stop.

o Consumers also had a volume control button to make the experience as pleasant as possible.
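The press-and-hold mood bar described above can be sketched as a small state machine: reactions (and playback) accumulate only while the bar is held down, and releasing it pauses the commercial. The class and method names below are hypothetical for illustration -- the actual implementation was Delvinia's Flash-based AskingMedia tool, whose internals were not published:

```python
# Minimal sketch of the press-and-hold "mood bar" logic (hypothetical names).
# The commercial plays, and mood samples are recorded, only while the bar is held.

class MoodBar:
    def __init__(self):
        self.pressed = False
        self.samples = []    # list of (playback_second, mood value 0-100)
        self.playhead = 0.0  # seconds of the commercial played so far

    def press(self, value):
        """Consumer clicks and holds the bar; playback starts or resumes."""
        self.pressed = True
        self.samples.append((self.playhead, value))

    def tick(self, dt, value):
        """Called once per sampling interval by the player loop."""
        if not self.pressed:
            return  # finger is off the button: commercial is paused, record nothing
        self.playhead += dt
        self.samples.append((self.playhead, value))

    def release(self):
        """Consumer lets go; the commercial stops."""
        self.pressed = False

bar = MoodBar()
bar.press(50)        # neutral mood at the start
bar.tick(1.0, 65)    # reaction rises during the first second
bar.tick(1.0, 70)
bar.release()
bar.tick(1.0, 90)    # ignored -- playback is paused
print(bar.playhead, len(bar.samples))  # -> 2.0 3
```

The design property this models is the one the article emphasizes: a paused commercial generates no data, so every recorded sample corresponds to a moment the consumer was actively watching.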

Last but not least, the real-world host was mimicked by an on-screen host. This was perhaps toughest creatively because the team didn't want a virtual host to either be so unusual or so entertaining that the host took attention away from the research itself.

How do you make a virtual host appear to be so natural that consumers will treat them as they would a real-world host?

The answer was to stream a headshot video of a real person who looked the way an offline host might. The team chose an actress in her mid-20s. She was blonde and professional-looking (not gussied up as a super-pretty model). She wore a visible telephone headset -- just like you see operators portrayed wearing in familiar TV commercials.

She didn't intrude into the experience with blinking or talking, but was completely still "on hold" until a consumer clicked to get a question answered. Then she appeared to be answering your question personally via videophone.

Out of all factors, Lam and the rest of the research team were perhaps the most worried about this video host. Would the novelty skew research results?

RESULTS
Both research panels, online and off, reacted almost exactly the same way to the New Orleans commercial. "Canadians thought it was entertaining, but it really wasn't relevant to them," says Lam.

Nissan Canada's creative team were naturally delighted because this gave them the chance to shoot their own commercial in Toronto. (It's currently airing.)

The research team were also delighted because study results meant online might be a viable sole research vehicle in future. Results were not exactly the same, but rather exhibited similar-enough patterns to echo each other in a predictable way.

Millward Brown Senior Vice President Stephen Popiel explains, "It's a pattern of response that online numbers tend to be a little lower. There's a social dynamic offline, you have human contact. That greater social bond tends to increase responses; people are slightly more positive. This factor is already very well known in phone versus mail-in surveys."

He continues, "Across both studies, the big numbers and the little numbers followed the same pattern of response. It's just that offline the big numbers tend to be a little bigger, and online the little numbers tend to be a little more little. It's not a perfect one-to-one slope."
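Popiel's "not a perfect one-to-one slope" observation can be illustrated with a simple least-squares fit of offline scores against online scores. The score values below are invented purely for the sketch -- Nissan's actual study numbers were not published -- but they follow the quoted pattern: offline responses run slightly higher, so the fitted line has a slope above 1.0 rather than being a one-to-one match:

```python
# Hypothetical illustration of the offline-vs-online score relationship.
# The data points are invented; only the *shape* (offline slightly higher,
# slope != 1) reflects the pattern Popiel describes.

def least_squares(x, y):
    """Ordinary least-squares slope and intercept of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

online  = [20, 40, 60, 80]   # invented online panel scores
offline = [26, 47, 68, 89]   # invented offline panel scores, a touch higher

slope, intercept = least_squares(online, offline)
print(round(slope, 2), round(intercept, 1))  # -> 1.05 5.0
```

A slope slightly above 1.0 means the "big numbers" stretch bigger offline while the pattern of highs and lows stays the same -- which is why the two panels could still echo each other predictably.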

The cost for online was a bit lower, but Lam says the biggest advantage for switching to online someday is the time savings. Conducting a study online is just so much quicker.

Plus, if you have tough panel requirements (only 0.7% of the Canadian population was qualified to participate in this particular Nissan study), it's easier to get enough participants online because you're not limited to people in a small geographic area.

You'll also get a higher participant show-up rate. Only 2% of online participants dropped off when they clicked through from the initial online qualification survey to the actual study screen. Then, of those that started the study, just 0.5% dropped off prior to finishing.

On average the study took about 18 minutes per online participant; however, one consumer took more than 90 minutes (presumably because he or she took a break and returned).

Only 3% of participants said they'd strongly prefer a real-life host to be in the room with them. 60% of participants rated their experience with the virtual host as "very satisfying."

Interestingly, only about 20% of participants clicked on the host to get questions answered during the study. So, most satisfied users were happy with the host's virtual presence even though they didn't actually use her. The still image of her waiting, and the idea that she was available, was reassuring in and of itself.

Most questions were about how to use the mood bar. The team is working on clarifying the instructions a bit more. Plus, of course, as consumers get used to the experience, it will someday become second nature.

Useful links related to this story:

Screenshots of what online participants saw as they took the Nissan study:
http://www.marketingsherpa.com/nissan/ad.html

Delvinia - the interactive agency that created the AskingMedia (TM) rich media tool used by Nissan for the study. (Note: this tool is also used by other companies.)
http://www.delvinia.com

Millward Brown - the research firm Nissan Canada uses to run creative tests
http://www.millwardbrown.com/

Global Market Insite (GMI) - the online research firm that provided the opt-in consumer panel for the Nissan project:
http://www.gmi-mr.com/pages/panels.html

Nissan Canada
http://www.nissan.ca
