Jeff Brunings admits he's a very lucky Marketing Director in that he doesn't have to focus 100% on short-term sales lead generation campaigns the way some marketers do.
He's already developed a strong in-house prospect list, with email permission, in the tens of thousands; plus, his company, Miller Heiman, is one of the leaders in its field of sales-team development and consulting for the Fortune 1000.
But its average account size is large enough that sales cycles can range from 90 days to a year or more. Brunings' campaigns have to keep those leads as warm as possible the whole time so the company's reps can close them.
Plus, the field is crowded with competitors and in danger of becoming commoditized, which raises the threat of pricing wars. To fight back, Brunings needed to promote Miller Heiman as the thought leader in the marketplace.
In the past, he'd impressed prospects by presenting them with white papers and in-person summits featuring name-brand analysts such as Gartner Group. Now it was time for Miller Heiman to step up to the plate and become the promoted authority itself.

CAMPAIGN
For the past three years, Brunings had conducted an annual customer survey to learn more about clients' needs and desires. For 2004, he decided to expand the effort to create a statistically valid research report he could then use as marketing tool in and of itself.
-> Step one: Devising the survey
The single biggest mistake most marketers make when writing surveys is to start by brainstorming a list of questions.
Instead, Brunings invited Miller Heiman's senior management team to a kick-off meeting where they, with help from a research methodology firm, "brainstormed what we wanted to understand, what hypotheses we wanted to validate, and based on that understanding how we wanted to apply results."
Only then were the actual questions written, each specifically corresponding to a data point everyone agreed they really needed.
They ended up with 45 questions related to sales effectiveness, plus 25 demographic questions to use when analyzing results.
It's very, very hard to get executives to fill out long surveys, especially salespeople, who typically have short attention spans.
So the survey was designed in three ways to make it as easy as possible to get through:
a. Only one freeform "write-in" box was included, and it was at the very end. Freeform boxes are both hard to respond to and difficult to analyze, so you want to avoid them whenever possible.
b. No drop-downs and no grids were included. All questions (aside from that one freeform box) offered easy-to-click radio button answers.
c. The rating system was super-simple: respondents chose from 1 (strongly agree) to 5 (strongly disagree). "You lose a lot of impact when your scale is 1-10," notes Brunings. "How big a difference really is there between six and seven? We're talking shades of grey that create more challenges on the back-end when you're comparing data."
-> Step two: Getting hundreds of senior execs to take it
Brunings sent out survey invites in early November 2003 via three channels -- a dedicated issue of his monthly email newsletter sent to prospects and clients, a couple of small pop-ups on the company Web site, and text-style ads on Hoovers.com where he'd already been building brand recognition with banners for the past year. (Link to samples of all three below.)
He used a chance to win one of five $100 Amazon gift certificates and a promise to share survey results as incentives. "Executives don't care about silly incentives like golf bags. They want the survey results."
The survey stayed open for 14 days online.
-> Step three: Using the results as a sales and marketing tool
The minute the survey closed, the team worked frantically to tabulate results. Brunings worried that if he didn't get back to respondents with the promised report in a timely manner they would figure he'd broken his promise -- as happens all too frequently in the survey world.
"You have to get it out to them in the timeframe you promised. It's one of the biggest frustrations and turn-offs people have about participating. The deadline took precedent for us."
Sales execs are not usually big readers, so rather than send out the entire report, the team created a five-page executive summary featuring the data respondents would find most juicy and useful. Notably, this summary did not include marketing copy for Miller Heiman.
"People participated because they wanted to be part of an authentic, genuine research effort. They wanted the results to be the research, not a product pitch. It will ultimately build more credibility for us as a company by not doing that. There's plenty of other ways to draw the connection between the results and the capabilities of our company. This is not the vehicle to make those connections."
Brunings got the word out that the results were ready in stages: first, respondents got an email telling them to pick up their copy. Then, a week later, Brunings used his monthly email newsletter to let the rest of the prospect and client list know.
In both these cases, there was no barrier on the landing page to picking up the PDF (although there would be for newcomers later). Why annoy people by making them register to see something when you already have their contact info in your database? It's just rude. (Link to sample of this landing page below.)
Although his sales team all got the link as well, Brunings did not make the mistake of assuming they'd read and digested the report and were ready to use it.
He explains, "In marketing we have a tendency to assume salespeople know how to use the tools we create. Their best tool is their salesmanship [not reading marketing materials]. You have to work with your sales force to really simplify information down to the lowest common denominator. Make it easy."
How? He's sent every rep printed copies, plus CD-ROMs, of the report. Plus, he's given them a list of "sound bites" -- interesting factoids and results nuggets they can drop into their emails and conversations. And this week he's conducting a webinar for all of the company's far-flung reps so they can learn more and ask questions.
Brunings will also present results at the ongoing summit roadshow the company puts on for clients and prospects. And, he's posted links on the company site, and issued a press release on the survey results.
Last, but not least, the extended survey results are being used by Miller Heiman to develop and tweak both products and marketing copy for 2004.
RESULTS
5.5% of Miller Heiman's email newsletter recipients clicked through and completed the entire survey. That meant that although Brunings had hoped for 750 responses, he got thousands, roughly 50% of which came in within 24 hours.
If Brunings had made the survey request one of several content items in the newsletter, the response rate would have been far lower. Focusing attention on one thing only makes a big difference.
Proving this point, the Hoovers ad split attention between two response options (the survey and a separate tools offer). Although the click rate was 0.07%, just 8% of those clicks converted to survey takers.
Focusing on usability makes a huge difference. A tiny 2.3% of people who started the survey didn't finish it. That's stunning for such a long survey to this marketplace.
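As a rough sketch, the funnel math behind these percentages can be expressed in a few lines of Python. The raw counts below are hypothetical (the article reports only the rates), assuming a newsletter list of 40,000 names:

```python
def survey_funnel(list_size, started, completed):
    """Return (response rate, abandonment rate) as percentages.

    response rate   = completions as a share of the whole list
    abandonment rate = starters who never finished, as a share of starters
    """
    response_rate = completed / list_size * 100
    abandonment_rate = (started - completed) / started * 100
    return round(response_rate, 1), round(abandonment_rate, 1)

# Hypothetical counts consistent with the article's figures:
# ~5.5% of the list completing, ~2.3% of starters dropping out.
resp, aband = survey_funnel(list_size=40_000, started=2_252, completed=2_200)
print(resp, aband)  # → 5.5 2.3
```

The same arithmetic explains why the single-focus newsletter outperformed the split-attention Hoovers ad: every extra step or competing offer shrinks the pool that reaches the final "completed" count.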
Roughly 10% of respondents wrote something in the freeform box at the end. Of these comments, Brunings was surprised to see that most simply said they were excited about getting their copy of the results report. "This market is so hungry for research. They can't afford to buy it, and if you create research that really resonates with them, it's gold."
He learned a tough lesson about timing, though. Brunings had promised results in 60 days, fully expecting this was a reasonable deadline even with the holiday season included. It wasn't. The team cranked hard, but it took about 75 days in total.
9.6% of participants responded to the first note Brunings sent telling them the results report was ready. 0.79% of email newsletter readers responded to the note he sent them offering the results the following week. (This would have been higher had the two lists not overlapped so heavily.)
Plus, four different reporters who'd heard about the report from word of mouth called Brunings to request interviews *before* he sent out the formal press release.
By the way, Brunings said the top survey result marketers might want to pay attention to is the fact that "46% of sales leaders believe their company does not have a well-defined methodology to identify qualified sales prospects. Senior sales management are concerned their reps are chasing down the wrong opportunities. Marketing is certainly contributing to this obstacle."
He advises companies to start judging marketers' performance by the quality of lead not the quantity. "I get evaluated not only on the average revenue per lead produced, but also on the cost of the lead, and the length of the sales cycle."
Useful links related to this article:
Samples of survey invites, ads, and response
Download link to exec summary of results (requires registration):
M3 Planning - the research firm who helped create and analyze the survey