July 19, 2021
Case Study

14 Market Research Examples

SUMMARY:

Curiosity.

At the heart of every successful marketing campaign is a curious marketer who learned how to better serve a customer.

In this industry, we scratch that curiosity itch with market research.

To help give you ideas to learn about your customer, in this article we bring you examples from Consumer Reports, Intel, Visa USA, Hallmark, Levi Strauss, John Deere, LeapFrog, Spiceworks Ziff Davis and more.

by Daniel Burstein, Senior Director, Content & Marketing, MarketingSherpa and MECLABS Institute

14 Market Research Examples

This article was originally published in the MarketingSherpa email newsletter.

Example #1: National bank’s A/B testing

You can learn what customers want by conducting experiments on real-life customer decisions using A/B testing. When you ensure your tests do not have any validity threats, the information you garner can offer very reliable insights into customer behavior.

Here’s an example from Flint McGlaughlin, CEO of MarketingSherpa and MECLABS Institute, and the creator of its online marketing course.

A national bank was working with MECLABS to discover how to increase the number of sign-ups for new checking accounts.

Customers who were interested in checking accounts could click on an “Open in Minutes” link on the bank’s homepage.

Creative Sample #1: Anonymized bank homepage

After clicking on the homepage link, visitors were taken to a four-question checking account selector tool.

Creative Sample #2: Original checking account landing page — account recommendation selector tool

After filling out the selector tool, visitors were taken to a results page that included a suggested package (“Best Choice”) along with a secondary option (“Second Choice”). The results page had several calls to action (CTAs). Website visitors could select an account and begin pre-registration (“Open Now”), find out more information about the account (“Learn More”), go back and change their answers (“Go back and change answers”), or manually browse other checking options (“Other Checking Options”).

Creative Sample #3: Original checking account landing page — account recommendation selector tool results page

After going through the experience, the MECLABS team hypothesized that the selector tool wasn’t really delivering on the expectation the customer had after clicking on the “Open in Minutes” CTA. They created two treatments (new versions) and tested them against the control experience.

In the first treatment, the checking selector tool was removed; instead, visitors were presented directly with three account options in tabs from which they could select.

Creative Sample #4: Checking account landing page Treatment #1

The second treatment’s landing page focused on a single product and had only one CTA, “Open Now,” similar to the CTA customers clicked on the homepage to get to this page.

Creative Sample #5: Checking account landing page Treatment #2

Both treatments increased account applications compared to the control landing page experience, with Treatment #2 generating 65% more applicants at a 98% level of confidence.

Creative Sample #6: Results of bank experiment that used A/B testing

You’ll note the Level of Confidence in the results. With any research tactic or tool you use to learn about customers, you have to consider whether the information you’re getting really represents most customers, or if you’re just seeing outliers or random chance.

With a high Level of Confidence like this, it is more likely the results actually represent a true difference between the control and treatment landing pages and that the results aren’t just a random event.
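
If you are curious how a level of confidence like the 98% above is typically computed, the sketch below shows one common approach, a two-proportion z-test, in Python. The visitor and conversion counts are hypothetical, and this illustrates only the underlying arithmetic, not the specific methodology MECLABS or any testing platform uses.

# Minimal sketch: estimating the level of confidence for an A/B test with a
# two-proportion z-test. All counts below are hypothetical, not the bank's data.
from statistics import NormalDist

def confidence_level(visitors_a, conversions_a, visitors_b, conversions_b):
    """Return the two-sided confidence that the two conversion rates truly differ."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = (pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = abs(rate_a - rate_b) / std_err
    return 2 * NormalDist().cdf(z) - 1  # 0.98 would read as a "98% level of confidence"

# Hypothetical control vs. treatment landing page
print(f"{confidence_level(5000, 100, 5000, 165):.0%}")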

The other factor to consider is that testing in and of itself will not produce results. You have to use testing as research to actually learn about the customer and then make changes to better serve them.

In the video How to Discover Exactly What the Customer Wants to See on the Next Click: 3 critical skills every marketer must master, McGlaughlin discussed this national bank experiment and explained how to use prioritization, identification and deduction to discover what your customers want.

This example was originally published in Marketing Research: 5 examples of discovering what customers want.

Example #2: Consumer Reports’ market intelligence research from third-party sources

The first example covers A/B testing. But keep in mind, ill-informed A/B testing isn’t market research; it’s just hoping for insights from random guesses.

In other words, A/B testing in a vacuum does not provide valuable information about customers. What you test is crucial; A/B testing is then a means of checking whether the insights you have about the customer are validated or refuted by actual customer behavior. So it’s important to start with some research into potential customers and competitors to inform your A/B tests.

For example, when MECLABS and MarketingExperiments (sister publisher to MarketingSherpa) worked with Consumer Reports on a public, crowdsourced A/B test, we provided a market intelligence report to our audience to help inform their test suggestions.

Every successful marketing test should confirm or deny an assumption about the customer. You need enough knowledge about the customer to create marketing messages you think will be effective.

For this public experiment to help marketers improve their split testing abilities, we had a real customer to work with — donors to Consumer Reports.

To help our audience better understand the customer, the MECLABS Marketing Intelligence team created the 26-page Consumer Reports Market Intelligence Research document (which you can see for yourself at that link).

This example was originally published in Calling All Writers and Marketers: Write the most effective copy for this Consumer Reports email and win a MarketingSherpa Summit package and Consumer Reports Value Proposition Test: What you can learn from a 29% drop in clickthrough.

Example #3: Virtual event company’s conversation

What if you don’t have the budget for A/B testing? Or any of the other tactics in this article?

Well, if you’re like most people you likely have some relationships with other human beings. A significant other, friends, family, neighbors, co-workers, customers, a nemesis (“Newman!”). While conducting market research by talking to these people has several validity threats, it at least helps you get out of your own head and identify some of your blind spots.

WebBabyShower.com’s lead magnet is a PDF download of a baby shower thank you card ‘swipe file’ plus some extras. “Women want to print it out and have it where they are writing cards, not have a laptop open constantly,” said Kurt Perschke, owner, WebBabyShower.com.

That is not a throwaway quote from Perschke. That is a brilliant insight, so I want to make sure we don’t overlook it. By better understanding customer behavior, you can better serve customers and increase results.

However, you are not your customer. So you must bridge the gap between you and them.

Often you hear marketers or business leaders review an ad or discuss a marketing campaign and say, “Well, I would never read that entire ad” or “I would not be interested in that promotion.” To which I say … who cares? Who cares what you would do? If you are not in the ideal customer set, sorry to dent your ego, but you really don’t matter. Only the customer does.

Perschke is one step ahead of many marketers and business leaders because he readily understands this. “Owning a business whose customers are 95% women has been a great education for me,” he said.

So I had to ask him, how did he get this insight into his customers’ behavior? Frankly, it didn’t take complex market research. He was just aware of this disconnect he had with the customer, and he was alert for ways to bridge the gap. “To be honest, I first saw that with my wife. Then we asked a few customers, and they confirmed it’s what they did also. Writing notes by hand is viewed as a ‘non-digital’ activity and reading from a laptop kinda spoils the mood apparently,” he said.

Back to WebBabyShower. “We've seen a [more than] 100% increase in email signups using this method, which was both inexpensive and evergreen,” Perschke said.

This example was originally published in Digital Marketing: Six specific examples of incentives that worked.

Example #4: Spiceworks Ziff Davis’ research-informed content marketing

Marketing research isn’t just to inform products and advertising messages. Market research can also give your brand a leg up in another highly competitive space – content marketing.

Don’t just jump in and create content expecting it to be successful just because it’s “free.” Conducting research beforehand can help you understand what your potential audience already receives and where they might need help but are currently underserved.

When Spiceworks Ziff Davis (SWZD) published its annual State of IT report, it invested months in conducting primary market research, analyzing year-over-year trends, and finally producing the actual report.

“Before getting into the nuts and bolts of writing an asset, look at market shifts and gaps that complement your business and marketing objectives. Then, you can begin to plan, research, write, review and finalize an asset,” said Priscilla Meisel, Content Marketing Director, SWZD.

This example was originally published in Marketing Writing: 3 simple tips that can help any marketer improve results (even if you’re not a copywriter).

Example #5: Business travel company’s guerrilla research

There are many established, expensive tactics you can use to better understand customers.

But if you don’t have the budget for those tactics, and don’t know any potential customers, you might want to brainstorm creative ways you can get valuable information from the right customer target set.

Here’s an example from a former client of Mitch McCasland, Founding Partner and Director, Brand Inquiry Partners. The company sold a product aimed at frequent business flyers and wanted to learn more about people who travel for a living. They needed consumer feedback right away.

“I suggested that they go out to the airport with a bunch of 20-dollar bills and wait outside a gate for passengers to come off their flight,” McCasland said. When people came off the flight, they were politely asked if they would answer a few questions in exchange for the incentive (the $20). By targeting the first people off the flight they had a high likelihood of reaching the first-class passengers.

This example was originally published in Guerrilla Market Research Expert Mitch McCasland Tells How You Can Conduct Quick (and Cheap) Research.

Example #6: Intel’s market research database

When conducting market research, it is crucial to organize your data in a way that allows you to report on it quickly and easily. This is especially important for qualitative studies, where you are doing more than quantifying the data; you also need to manage it so it is easier to analyze.

Anne McClard, Senior Researcher, Doxus, worked with Shauna Pettit-Brown of Intel on a research project to understand the needs of mobile application developers throughout the world.

Intel needed to be able to analyze the data from several different angles, including segment and geography, a daunting task complicated by the number of interviews, interviewers, and world languages.

“The interviews were about an hour long, and pretty substantial,” McClard says. So, she needed to build a database to organize the transcripts in a way that made sense.

Different types of data are useful for different departments within a company; once your database is organized you can sort it by various threads.

The Intel study had three different internal sponsors. "When it came to doing the analysis, we ended up creating multiple versions of the presentation targeted to individual audiences," Pettit-Brown says.

The organized database enabled her to go back into the data set to answer questions specific to the interests of the three different groups.
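
As a concrete illustration of what organizing a qualitative database so it can be sorted “by various threads” might look like, here is a minimal sketch in Python. The field names, tags, and sample excerpts are hypothetical, not Intel’s or Doxus’s actual schema.

# Minimal sketch: organizing qualitative interview excerpts so they can be
# sliced by segment, geography, or topic for different internal audiences.
# Field names, tags, and sample records are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Excerpt:
    interview_id: str
    segment: str            # e.g., "independent developer", "enterprise ISV"
    geography: str          # e.g., "APAC", "EMEA", "North America"
    topics: set[str] = field(default_factory=set)
    quote: str = ""

def filter_excerpts(excerpts, segment=None, geography=None, topic=None):
    """Return excerpts matching whichever filters are supplied."""
    return [
        e for e in excerpts
        if (segment is None or e.segment == segment)
        and (geography is None or e.geography == geography)
        and (topic is None or topic in e.topics)
    ]

excerpts = [
    Excerpt("INT-014", "independent developer", "EMEA",
            {"tooling", "distribution"}, "Porting is the hard part..."),
    Excerpt("INT-022", "enterprise ISV", "APAC",
            {"distribution"}, "Our customers ask for..."),
]

# One internal sponsor's question: what do EMEA developers say about tooling?
for e in filter_excerpts(excerpts, geography="EMEA", topic="tooling"):
    print(e.interview_id, e.quote)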

This example was originally published in 4 Steps to Building a Qualitative Market Research Database That Works Better.

Example #7: National security survey’s priming

When conducting market research surveys, the way you word your questions can affect customers’ responses. Even the way you word previous questions can put customers in a certain mindset that will skew their answers.

For example, when people were asked if they thought the U.S. government should spend money on an anti-missile shield, the results appeared fairly conclusive. Sixty-four percent of those surveyed thought the country should and only six percent were unsure, according to Opinion Makers: An Insider Exposes the Truth Behind the Polls.

But when pollsters added the option, "...or are you unsure?" the level of uncertainty leaped from six percent to 33 percent. When they asked whether respondents would be upset if the government took the opposite course of action from their selection, 59 percent either didn’t have an opinion or didn’t mind if the government did something differently.

This is an example of how the way you word questions can change a survey’s results. You want survey answers to reflect customers’ actual sentiments, as free of your company’s previously held biases as possible.

This example was originally published in Are Surveys Misleading? 7 Questions for Better Market Research.

Example #8: Visa USA’s approach to getting an accurate answer

As mentioned in the previous example, the way you ask customers questions can skew their responses with your own biases.

However, the way you ask questions of potential customers can also illuminate your understanding of them, which is why companies field surveys to begin with.

“One thing you learn over time is how to structure questions so you have a greater likelihood of getting an accurate answer. For example, when we want to find out if people are paying off their bills, we'll ask them to think about the card they use most often. We then ask what the balance was on their last bill after they paid it,” said Michael Marx, VP Research Services, Visa USA.

This example was originally published in Tips from Visa USA's Market Research Expert Michael Marx.

Example #9: Hallmark’s private members-only community

Online communities are a way to interact with and learn from customers. Hallmark created a private members-only community called Idea Exchange (an idea you could replicate with a Facebook or LinkedIn Group).

The community helped the greeting cards company learn the customer’s language.

“Communities…let consumers describe issues in their own terms,” explained Tom Brailsford, Manager of Advancing Capabilities, Hallmark Cards. “Lots of times companies use jargon internally.”

At Hallmark they used to talk internally about “channels” of distribution. But consumers talk about stores, not channels. It is much clearer to ask consumers about the stores they shop in than about the channels they shop in.

For example, Brailsford clarified, “We say we want to nurture, inspire, and lift one’s spirits. We use those terms, and the communities have defined those terms for us. So we have learned how those things play out in their lives. It gives us a much richer vocabulary to talk about these things.”

This example was originally published in Third Year Results from Hallmark's Online Market Research Experiment.

Example #10: L'Oréal’s social media listening

If you don’t want the long-term responsibility that comes with creating an online community, you can use social media listening to understand how customers talk about your products and industry in their own language.

In 2019, L'Oréal felt the need to upgrade one of its top makeup products – L'Oréal Paris Alliance Perfect foundation. Both the formula and the product communication were outdated – multiple ingredients had emerged on the market along with competitive products made from those ingredients.

These new ingredients and products were overwhelming consumers. After implementing new formulas, the competitor brands would advertise their ingredients as the best on the market, providing almost magical results.

So the team at L'Oréal decided to research their consumers’ expectations instead of simply crafting a new formula on their own. The idea was to understand not only which active ingredients are credible among the audience, but also which particular words they use while speaking about foundations in general.

The marketing team decided to combine two research methods: social media listening and traditional questionnaires.

“For the most part, we conduct social media listening research when we need to find out what our customers say about our brand/product/topic and which words they use to do it. We do conduct traditional research as well and ask questions directly. These surveys are different because we provide a variety of readymade answers that respondents choose from. Thus, we limit them in terms of statements and their wording,” says Marina Tarandiuk, marketing research specialist, L'Oréal Ukraine.

“The key value of social media listening (SML) for us is the opportunity to collect people’s opinions that are as ‘natural’ as possible. When someone leaves a review online, they are in a comfortable environment, they use their ‘own’ language to express themselves, there is no interviewer standing next to them and potentially causing shame for their answer. The analytics of ‘natural’ and honest opinions of our customers enables us to implement the results in our communication and use the same language as them,” Tarandiuk said.

The team worked with a social media listening tool vendor to identify the most popular, in-demand ingredients discussed online and detect the most commonly used words and phrases to create a “consumer glossary.”

Questionnaires were then used to confirm the hypotheses and insights found while monitoring social media. This part was performed in-house by a dedicated team, which created custom questionnaires aiming to narrow all the data down to a maximum of three variants that could become the base for the whole product line.

“One of our recent studies had a goal to find out which words our clients used to describe positive and negative qualities of [the] foundation. Due to a change in [the] product’s formula, we also decided to change its communication. Based on the opinions of our customers, we can consolidate the existing positive ideas that our clients have about the product,” Tarandiuk said.

To find the related mentions, the team monitored not only the products made by L'Oréal but also the overall category. “The search query contained both brand names and general words like foundation, texture, smell, skin, pores, etc. The problem was that this approach ended up collecting thousands of mentions, not all of which were relevant to the topic,” said Elena Teselko, content marketing manager, YouScan (L'Oréal’s social media listening tool).

So the team used artificial intelligence-based tagging that divided mentions according to the category, features, or product type.
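
The AI-based tagging itself belongs to the vendor’s tool, but the underlying idea of sorting mentions into topics can be illustrated with a simple keyword-rule tagger. The tags, keywords, and sample mentions below are hypothetical.

# Simplified sketch of tagging social mentions by topic. The real project used
# a vendor's AI-based tagging (YouScan); this keyword-rule version only
# illustrates the idea, and the tags, keywords, and mentions are hypothetical.
TAG_KEYWORDS = {
    "ingredients": ["hyaluronic", "acid", "vitamin"],
    "texture": ["light texture", "lightweight", "heavy"],
    "pores": ["clog", "pores"],
}

def tag_mention(text: str) -> set[str]:
    """Return the set of topic tags whose keywords appear in the mention."""
    lowered = text.lower()
    return {
        tag for tag, keywords in TAG_KEYWORDS.items()
        if any(word in lowered for word in keywords)
    }

mentions = [
    "Love that this foundation has hyaluronic acid and doesn't clog my pores",
    "Such a light texture, barely feel it on my skin",
]
for m in mentions:
    print(tag_mention(m), "-", m)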

This approach helped the team discover that customers valued such foundation features as not clogging pores, a light texture, and not spreading. Meanwhile, the most discussed and appreciated cosmetics component was hyaluronic acid.

These exact phrases, found with the help of social media monitoring, were later used for marketing communication.

Creative Sample #7: Marketing communication for personal care company with messaging based on discoveries from market research

“Doing research and detecting audience’s interests BEFORE starting a campaign is an approach that dramatically lowers any risks and increases chances that the campaign would be appreciated by customers,” Teselko said.

This example was originally published in B2C Branding: 3 quick case studies of enhancing the brand with a better customer experience.

Example #11: Levi’s ethnographic research

In a focus group or survey, you are asking customers to explain something they may not even truly understand. Could be why they bought a product. Or what they think of your competitor.

Ethnographic research is a type of anthropology in which you go into customers’ homes or places of business and observe their actual behavior, behavior they may not understand well enough to explain to you.

While cost-prohibitive for many brands, and simply unfeasible for others, it can elicit new insights into your customers.

Michael Perman, Senior Director Cultural Insights, Levi Strauss & Co. uses both quantitative and qualitative research on a broad spectrum, but when it comes to gathering consumer insight, he focuses on in-depth ethnographic research provided by partners who specialize in getting deep into the “nooks and crannies of consumer life in America and around the world.” For example, his team spends time in consumers’ homes and in their closets. They shop with consumers, looking for the reality of a consumer’s life and identifying themes that will enable designers and merchandisers to better understand and anticipate consumer needs.

Perman then puts together multi-sensory presentations that illustrate the findings of research. For example, “we might recreate a teenager’s bedroom and show what a teenage girl might have on her dresser.”

This example was originally published in How to Get Your Company to Pay Attention to Market Research Results: Tips from Levi Strauss.

Example #12: eBags’ ethnographic research

Ethnographic research isn’t confined to a physical goods brand like Levi’s. Digital brands can engage in this form of anthropology as well.

While usability testing in a lab is useful, it does miss some of the real-world environmental factors that play a part in the success of a website. Usability testing alone didn’t create a clear enough picture for Gregory Casey, User Experience Designer and Architect, eBags.

“After we had designed our mobile and tablet experience, I wanted to run some contextual user research, which basically meant seeing how people used it in the wild, seeing how people are using it in their homes. So that’s exactly what I did,” Gregory said.

He found consumers willing to open their homes to him and be tested in their normal environment. This meant factors like the television, phone calls and other family members played a part in how they experienced the eBags mobile site.

“During these interview sessions, a lot of times we were interrupted by, say, a child coming over and the mother having to do something for the kid … The experience isn’t sovereign. It’s not something where they just sit down, work through a particular user flow and complete their interaction,” Gregory said.

By watching users work through the site as they would in their everyday life, Gregory got to see what parts of the site they actually use.

This example was originally published in Mobile Marketing: 4 takeaways on how to improve your mobile shopping experience beyond just responsive design.

Example #13: John Deere’s shift from product-centric market research to consumer-centric research

One of the major benefits of market research is to overcome company blind spots. However, if you start with your blind spots – i.e., a product focus – you will blunt the effectiveness of your market research.

In the past, “they’d say, ‘Here’s the product, find out how people feel about it,’” explained David van Nostrand, Manager, John Deere's Global Market Research. “A lot of companies do that.” Instead, they should be saying, “Let's start with the customers: what do they want, what do they need?”

The solution? A new in-house program called “Category Experts” brings the product-group employees over as full team members working on specific research projects with van Nostrand’s team.

These staffers handle items that don’t require a research background: scheduling, meetings, logistics, communication and vendor management. The actual task they handle is less important than the fact that they serve as human cross-pollinators, bringing consumer-centric sensibility back to their product-focused groups.

For example, if van Nostrand’s team is doing research about a vehicle, they bring in staffers from the Vehicles product groups. “The information about vehicle consumers needs to be out there in the vehicle marketing groups, not locked in here in the heads of the researchers.”

This example was originally published in How John Deere Increased Mass Consumer Market Share by Revamping its Market Research Tactics.

Example #14: LeapFrog’s market research involvement throughout product development (not just at the beginning and the end)

Market research is sometimes thought of as a practice that can either inform the development of a product, or research consumer attitudes about developed products. But what about the middle?

Once the creative people begin working on product designs, the LeapFrog research department stays involved.

They have a lab onsite where they bring moms and kids from the San Francisco Bay area to test preliminary versions of the products. “We do a lot of hands-on, informal qualitative work with kids,” said Craig Spitzer, VP Marketing Research, LeapFrog. “Can they do what they need to do to work the product? Do they go from step A to B to C, or do they go from A to C to B?”

When designing the LeapPad Learning System, for example, the prototype went through the lab “a dozen times or so,” he says.

A key challenge for the research department is keeping and building the list of thousands of families who have agreed to be on call for testing. “We've done everything from recruiting on the Internet to putting out fliers in local schools, working through employees whose kids are in schools, and milking every connection we have,” Spitzer says.

Kids who test products at the lab are compensated with a free, existing product rather than a promise of getting the product they're testing when it is released in the future.

This example was originally published in How LeapFrog Uses Marketing Research to Launch New Products.

Related resources

The Marketer’s Blind Spot: 3 ways to overcome the marketer’s greatest obstacle to effective messaging

Get Your Free Test Discovery Tool to Help Log all the Results and Discoveries from Your Company’s Marketing Tests

Marketing Research: 5 examples of discovering what customers want

Online Marketing Tests: How do you know you’re really learning anything?

