HR technology company AppVault has taken its testing strategy from infancy to advanced behavioral testing. In this article, the Marketing Manager takes us through that evolution and also through the A/B and automated testing his team has been building upon for the past year with their email marketing and landing pages.
Read how marketing automation and a testing strategy can build upon open rate and clickthrough rate wins to have an impact on your bottom line.
Prior to their current email service provider (ESP), AppVault had one in place that Geoffrey Bell, Marketing Manager, AppVault, described as “simplistic.”
“It clearly had no difficulty creating email and getting email out but really lacked, obviously, any type of testing capabilities. You certainly could do it, but it would have to be separate campaigns and a lot of manual work done,” he said.
He wanted to begin testing email to work on conversions instead of just creating an email blast and “hoping for the best.”
“I thought if I could work on getting our emails opened more frequently and more often by our leads and contacts within the marketing automation system, it would ultimately lead to more clicks, more asset downloads, and ultimately more meetings,” he said.
He added that it was critical to move beyond batch and blast in email marketing because, “If we're not testing and measuring to improve, I'm not sure why we're doing it in the first place.”
Based in Atlanta, Georgia, recruitment marketing platform AppVault started in 2002 and typically sells into HR departments for primarily enterprise-level businesses.
“Typically, those departments are from organizations that have a minimum of 5,000 employees and usually at or around 50 to 75 open jobs per month on their career site and available through the organization,” Bell said.
Bell implemented the new ESP on the first of the year in 2016 and was “focused on getting content out there, pushing assets,” and then began testing in mid-year.
He focused on testing email marketing, including send days and times, personalization tactics, as well as testing the AppVault landing page, improving each through wins and losses.
“I don't think you can improve without making mistakes. But, of course, we brush it off and get up,” he continued.
Step #1. Use early testing to set a benchmark
“This is really where the fun stuff is,” Bell said. “At first, we started testing simplistic things like dates and times during the day — Did we get a better open rate when we sent Tuesday morning? Did we get a better open rate when we sent Thursday afternoon? — You could probably have a conversation around email-sending best times.”
Those early tests were to set a benchmark for understanding what the average open and clickthrough rates were and get the simple best practices for the company — like best day and time — settled.
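The benchmark metrics Bell describes can be sketched in a few lines. The send names and all counts below are hypothetical illustrations, not figures from the article:

```python
# Minimal sketch of benchmarking unique open rate and clickthrough rate
# across sends. All numbers and send labels here are made up for illustration.

def open_rate(unique_opens: int, delivered: int) -> float:
    """Unique open rate: unique opens / emails delivered."""
    return unique_opens / delivered

def clickthrough_rate(unique_clicks: int, delivered: int) -> float:
    """Unique clickthrough rate: unique clicks / emails delivered."""
    return unique_clicks / delivered

# Hypothetical results comparing a Tuesday-morning and a Thursday-afternoon send
sends = {
    "Tuesday 9am":  {"delivered": 2000, "opens": 340, "clicks": 58},
    "Thursday 2pm": {"delivered": 2000, "opens": 290, "clicks": 41},
}
for name, s in sends.items():
    print(f"{name}: open {open_rate(s['opens'], s['delivered']):.1%}, "
          f"CTR {clickthrough_rate(s['clicks'], s['delivered']):.1%}")
```

Averaging these rates over several sends gives the baseline that later, more advanced tests are measured against.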
From there, he said that the team graduated to understanding whether or not it would be more beneficial for the email to come from different addresses.
“Who is this coming from? Is this coming from Marketing@AppVault? Is this coming from Info@AppVault,” he asked, “or is this even coming from me, per se, as a marketer?”
He added that they began to understand, with the volume of email that goes out today, “The notion of it coming from an individual certainly helps increase open rate.”
He began moving away from the more generic email addresses and began incorporating himself as the sender — or making it from sales reps — when, “In fact, it was being created and scheduled and executed by me.”
Step #2. Hone testing in on consumer challenges
Bell said that ultimately, the more advanced testing is where he has seen some significant wins.
One example he gave was “using merge fields that are ultimately brought down from the marketing automation system, where you can use things like first name, company name, maybe geographic location.”
One field in particular that he has focused on is installed tech: the piece of technology being used within a prospect’s career site, usually an applicant tracking system (ATS).
“I started incorporating that into the subject lines to see if [it] would increase open rates. Long story short, no question, when a lead saw their install tech . . . open rates really went through the roof,” he said.
The reason, he believes, is because they were self-identifying with the subject line — not because it had their first name or company in it — but because it connected to a challenge they had.
“There are probably 20 [applicant tracking systems] out there, but really like five to eight major players in the ATS industry. So, as a result, those open rates really increased when we started including the installed tech in the subject line,” he said.
Bell ran a test where the control was a subject line that included the person’s name, and the treatment had the installed tech in the subject line. The control had a 13% unique open rate, whereas the treatment, featuring the installed tech, had a unique open rate of 66%.
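A lift that large is hard to produce by chance, but it is worth checking before acting on any test result. Here is a sketch of a standard two-proportion z-test applied to open rates like these; the per-arm list size of 500 is an assumption, since the article does not report sample sizes:

```python
# Hedged sketch: is an open-rate difference like 13% vs. 66% statistically
# meaningful? List sizes below are assumptions, not figures from the article.
import math

def two_proportion_z(opens_a: int, n_a: int, opens_b: int, n_b: int):
    """Two-proportion z-test on unique open rates; returns (z, two-sided p)."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    p_pool = (opens_a + opens_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Assumed 500 recipients per arm: 13% vs. 66% unique open rate
z, p = two_proportion_z(opens_a=65, n_a=500, opens_b=330, n_b=500)
print(f"z = {z:.1f}, p = {p:.6f}")
```

At these assumed volumes the z-score is in the double digits, so the result would be significant at any conventional threshold.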
He noted that it’s important that data is accurate and current because, “If we send an email with the wrong ATS in the subject line, we're probably going to get unsubscribed pretty quickly. I can't say enough about data integrity, data cleanliness, and ultimately, the ability to segment, based on those fields within the marketing automation system.”
Bell also said that testing is about building upon what you’ve learned about customers from the earlier tests.
“By and large, the more personalized you can get it, the higher the response rate is. So, in some campaigns, I've used not only install tech, but company merge fields. I've used first name merge fields, geographic merge fields to try to bring that personalization into the email marketing system,” he added.
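The merge-field mechanics are simple enough to sketch. The field names below ("first_name", "install_tech", and so on) are hypothetical; the real names come from whatever the marketing automation system exposes:

```python
# Sketch of merge-field personalization in a subject line.
# The lead record and field names are hypothetical illustrations.

lead = {
    "first_name": "Dana",
    "company": "Acme Health",
    "install_tech": "Taleo",  # ATS detected on the prospect's career site
    "city": "Atlanta",
}

# Subject-line template with merge fields, filled in per lead
template = "Getting more applicants out of {install_tech} at {company}"
subject = template.format(**lead)
print(subject)  # → Getting more applicants out of Taleo at Acme Health
```

The same template renders differently for every lead, which is what lets a single campaign feel individually targeted.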
Step #3. Let testing take the lead with landing page decisions
“I like making decisions that are based off of data and analytics and not intuition,” Bell said.
That instinct is what led him into landing page testing as well, which is “another fun one” in the testing realm.
“If you think about a landing page in general, the call-to-action is obviously like metric number one — did they convert?” he said. “So, what I've tried to play around with, obviously, are things like buttons, colors of buttons, the words that are used inside of the button, even punctuation within the buttons.”
Bell added that all of these changes are to “see what it takes to get one more person to click on that button.”
He also mentioned that the best way to test a simple landing page is through LinkedIn. For example, “We had a nice piece of content. I believe it was a webinar,” he said. “I wanted to go and do a big spend on LinkedIn to try to generate some new leads.”
He took the landing page for this content and duplicated it for testing, making one change — one had the call-to-action on the left, and the other had it on the right.
“I spent $300 on each landing page within LinkedIn — two separate campaigns to understand which one was converting better. As a result, the left side converted significantly higher,” he said.
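With equal spend on each variant, the comparison reduces to conversions per dollar. The conversion counts below are assumptions; the article reports only that the left-side call-to-action converted significantly higher:

```python
# Hedged sketch: comparing two landing-page variants on cost per conversion.
# Conversion counts are hypothetical; spend matches the article's $300 per variant.

spend = 300  # dollars spent on LinkedIn per variant
conversions = {"cta_left": 24, "cta_right": 11}  # assumed counts

for variant, conv in conversions.items():
    print(f"{variant}: {conv} conversions, ${spend / conv:.2f} per conversion")

winner = max(conversions, key=conversions.get)
print(f"Promote: {winner}")
```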
After that, he spent more money on LinkedIn but promoted the page that had been more successful, thereby optimizing his investment — not only on that one platform, either.
“Just based on that, we can then take that landing page and do third-party advertising on websites. We can run internal email to get folks to that landing page knowing that it was converting better,” he said.
He continued by stating that the ultimate goal of pulling people in with design testing is to see if the fields being asked of prospects — first name, last name, email, company, employee size, open job requisitions — are actually good qualifiers for the team to know if the person is a good prospect.
In this pursuit, Bell stresses again the importance of good data and being able to trust the information in those fields.
“People move up. People move on. People move back. What you thought was a Director of Talent Acquisition at a Fortune 500, in fact, they are now a VP of HR,” he said. “Really trying to stay on top of your data is incredibly important for personalization and from an email marketing perspective.”
Step #4. Focus on nurture-based testing
“Where I'm starting to see some wins and where I want to spend more time is actually in nurture-based email marketing,” he said.
Bell added that within many marketing automation systems, there are functionalities that help design nurture paths.
“I look at things like an initial email from a rep — [with] pretty HTML, ‘We've got an upcoming webinar,’ and try and get the best subject line that we can in there,” he said. “But the ability to pause seven days and take conditional steps on … did the contact open this email? Did the contact click any link within the body of this email? And then ultimately making steps off of that.”
Bell gave an example: The team started on a small HTML-based email send for one of their healthcare reps, which was pushing a webinar.
If, after seven days, the contact hadn’t opened the email, they did a simple subject line swap. If the contact opened the email, but didn’t click a link within the email, they sent a text-based email that was personalized from a sales rep, differing from the first HTML send.
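The branching logic Bell describes can be sketched as a simple decision function. Real marketing automation platforms express this as a visual workflow rather than code, and the function and field names below are hypothetical:

```python
# Minimal sketch of the conditional nurture path: pause seven days after
# the first send, then branch on open/click behavior. Names are hypothetical.

from dataclasses import dataclass

@dataclass
class Contact:
    email: str
    opened_first_send: bool
    clicked_first_send: bool

def next_step(contact: Contact) -> str:
    """Decide the follow-up action after the 7-day pause."""
    if not contact.opened_first_send:
        # Never opened: resend the same email with a new subject line
        return "resend_with_new_subject"
    if not contact.clicked_first_send:
        # Opened but didn't click: personal, text-based note from a sales rep
        return "send_personal_text_email"
    # Opened and clicked: goal reached, exit the nurture path
    return "exit_nurture"

print(next_step(Contact("lead@example.com", opened_first_send=True,
                        clicked_first_send=False)))
```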
That send has seen a unique open rate of 31% and a unique click rate of 68%. Bell’s theory is that people who opened the first email thought, “Maybe I’ll come back around to it.”
When receiving the second email with the personal text-based message from a sales rep, the person feels the sense of urgency in why they need to register for the webinar.
“[Of] the 31% of them that opened, well over half went ahead and clicked through — and to me, that is a marketing win,” he said.
“At the end of the day … my team is on the hook for meetings. If we don't hit the number of meetings in a given month, we're not doing our job,” Bell emphasized. “So I want to always think about working smarter, not harder.”
By working smarter, he means sending “less email and getting the same amount of meetings.” He added that sending more email tends to mean sending less personalized blasts.
“If we're not getting our meetings, you can rest assured I, my team and anybody else will be staying until 8:00, 9:00 at night trying to get meetings the old-fashioned way over the phone to the West Coast. I have no intention of staying in my office on a Friday until 8:30 or 9:00 at night.”
Bell said that if technology can prevent that and “enable us to make better decisions, get better open rates and better click rates and conversions, then that's clearly where I'm going to spend my time.”
As for where he’s headed in the future, Bell said, “This nurture stuff is really, really powerful. This is where I'm going, where I'd like to spend more time.”
SalesFusion — AppVault’s sales and marketing automation platform