At the MarketingSherpa MarketingExperiments Web Optimization Summit 2014, marketers gathered from around the world to discover what works in conversion rate optimization. Covering landing page optimization, A/B split testing, value proposition development and data analysis, brand-side marketers and industry experts took the stage to share their successes, insights and actionable takeaways.
In this webinar, Daniel Burstein, Director of Editorial Content, and Allison Banko, Reporter, both of MECLABS, distilled two days of Summit sessions and featured speakers into 30 minutes. Watch this webinar for the best of Web Optimization Summit 2014 with ideas to improve your marketing efforts.
Burstein and Banko covered sessions including:
Download the slides to this presentation
- Michael Norton, Associate Professor, Harvard Business School, on gaining trust through transparency
- Julia Babiarz and Emily Titcomb of Ancestry.com, revealing how the company transformed to provide a customized online experience
- Peter Doucette, Vice President, Consumer Sales and Marketing, The Boston Globe, sharing how the newspaper evolved to a culture of testing and optimization
- And much more
Related Resources
- MarketingSherpa Email Summit 2015 — Call for speakers now open
- Web Optimization: Can you repeat your test results?
- Web Optimization: Ancestry.com improves conversion 20% by reducing choice barriers
- Web Optimization Summit 2014 Wrap-up: Top 5 takeaways to improve your testing and optimization
- Web Optimization: How The Boston Globe used customer insight to test value proposition
Hello and welcome to another MarketingSherpa webinar. Today we are going to be talking about the top takeaways from Web Optimization Summit 2014. You are likely driving a lot of traffic to your websites right now with print ads, with paid search ads, with everything you're doing. What if you could convert that traffic just a little bit better? If you go from a 5% to a 10% conversion rate, for example, that's another five people out of every hundred saying yes to your offer. Hopefully we are going to be able to give you some takeaways to improve your website.
My name is Daniel Burstein. I'm the Director of Editorial Content here at MECLABS, parent company of MarketingSherpa. Joining me today is MarketingSherpa reporter Allison Banko. Thanks for joining us, Allison.

Banko: Thanks for having me.

Burstein:
And we want you to participate in today's webinar as well. You can use the hashtag #WebOpt14 to ask your questions. Hopefully we can answer them today, but also share what works for you in web optimization. How do you improve conversion? How do you optimize your landing pages? If you were with us just two weeks ago in New York City, please also share your top takeaways from Web Optimization Summit. And if you are following #WebOpt14, we are going to be tweeting some other case studies and other information that can help you improve your site. So let's get into it.
So as I said, Allison and I were in New York City two weeks ago, at the TimesCenter in Midtown Manhattan. We had about 300 marketers with us and two days' worth of case studies and split sessions packed full of ideas for improving conversion rates. What we're going to try to do in the 30 minutes we have is pack in as many takeaways as we can from that Summit, so you are essentially getting 20 hours of content in just 30 minutes. Let's see what we can do.
First of all, we want to talk about the importance of web optimization. How can it help you? I think the best way to do that is to look at a previous attendee of one of our Summits. This is Kait Vinson. She's a webinar product marketing manager for a company called Accellos. She attended one of our Summits, and you can see she had a lot of ideas for how to improve her website. There's the original website she had, here is the new design that was created, and she was able to generate a 56% increase in conversion. That's the power of what we are going to talk about and share with you today.
Also, I want to share with you some data from our MarketingSherpa Ecommerce Benchmark Study, which we recently released and which surveyed more than 4,000 marketers. Here is one of the things they told us. We looked at their conversion rates and found, as you can see at the upper left, that the marketers and ecommerce companies testing based on extensive historical data have the highest conversion rates of all. What that means is that they are testing, they are learning from their tests, and they are optimizing based on those learnings. That testing and optimization cycle again shows the power of web optimization and what you can learn today.
First, let's jump right in. If you can launch the poll, we are going to ask how you want this webinar to go. We have two options. As I said, we have 20 hours of content: we can give you a snapshot, go fast and furious, and drive quickly through it, or we can take it slow and do a deep dive, getting a little deeper into some of the topics we can talk about today.
So you will see the chance to vote on the screen. I see a few people here saying fast, please. We've got a lot of As and a lot of fast comments in here. Let's look at the results of the poll. OK, we can close that out now. It looks like 70% of you said fast snapshot: let's go fast and furious with a lot of high-level advice. So that will be our challenge today, Allison. Let's jump right in. I think next we are going to talk about one of our featured speakers, Michael Norton, who had some great experiments, so why don't you jump into that?

Banko:
Yeah, absolutely. Let's do this. This first session is Michael Norton from Harvard Business School. His session was titled "Trust Your Transparency," and we found it to be one of our biggest takeaways because you can use transparency to gain credit in all aspects of your life, most importantly with your customers.
So there was, again, a lot of activity on the hashtag #WebOpt14 during this session. What stood out to me is that he began with the Domino's Pizza tracker. I love pizza; if you mention pizza at the beginning, you have my attention. But his point was that Domino's shows customers what they are doing while making the pizza. You order a pizza, you go online, and you see exactly where they are at: prep, bake, box, delivery, so you know exactly what's happening.
As Michael said, people love seeing the work you are putting into something. He also gave an example from government, the "submerged state." Look at this table: they asked government beneficiaries whether they had ever used a government social program. As you see here, a lot of people are using these government programs, but they say they are not.
The big deal here is that people were thinking, OK, the government is not doing any work for me; they were not satisfied. So what Michael Norton and his team did was run a test with a website called The Daily Brief. It showed a map of all of Boston with the requests people had put in for things like a downed light pole. People were upset because they would put in a request and never know: was it ever fixed? What happened?
So they said, OK, on this map let's drop pins. Let's show which requests were closed and which were still open. And people loved it. They loved seeing their pins on the map. They liked to say, "Hey, I put that in; the government is doing work for me."
So Norton and his team ran that survey again, asking about the government, and this time people expressed a desire for a larger government with more services. They perceived the government to be doing a better job, even though the government wasn't actually doing anything differently. That's amazing: they showed their work and they got more credit. They weren't changing their work; they were just being transparent with their customers.
And I'm not sure if you all can see the slide here, but he joked that if they had shown this map instead, people wouldn't have been too happy. What you all can take away is that you can use transparency, showing your work, to get more credit in your life, and most importantly it can help your customers understand all the services you provide for them. Just look back at Domino's: they are still making you the pizza whether they show you what they are doing or not, right? But your customers value it more when you show them what you are doing.
Top takeaway here: things haven't changed much since that third grade math class. Show your work for full credit.

Burstein:
Let's jump into the next one; you said you want me to go fast and furious. Ryan Hutchings, Director of Marketing at VacationRoost. What you can learn here is a really great example of testing for ROI. More importantly, he's very transparent about showing how testing isn't easy. When you test, you want to learn, and you may fail as well.
So we got a question from Peggy, a marketing manager. She wanted to know about determining and measuring success, so let me give you a few examples of how Ryan determines and measures it. He has benchmarks for his sites' bounce rates: what is a good bounce rate and what is not. You can see it here in the slide. If he is getting above 40%, and definitely into 50%, that's too high a bounce rate. That's an indicator for him that he needs to test.
Let's look at clickthrough rate. Same thing here: if his clickthrough rate falls below, let's say, 50%, that's an indicator that he needs to test. I can't give you the perfect answer for what your clickthrough and bounce rates should be, but you can see here, from the MarketingSherpa Ecommerce Benchmark Study, if you are looking at overall conversion rate and trying to determine a good conversion rate for your site (this is total conversion rate to sale): the majority of marketers are under about 4% to 5%, the great majority are below 15%, and many response rates are below 1%.
So a low conversion rate is not necessarily an indication that you have to make an improvement. You don't want to benchmark against others, but you do want to understand whether, for example, something you've done has caused a dramatic drop-off.
So if your conversion rate had been, let's say, 4% or 5%, which might not necessarily be bad depending on your offer, but it suddenly and drastically drops to 3% or 2%, that's a great indication to say, "OK, let me look at my funnel. What's going on here? Is it a technological issue? Is it a messaging issue? Did we change how much content we are putting out? Did we change our media investment?" That's a better use of the data than saying you need an overall conversion rate of a certain mark.
But if you want more of that benchmark data, it's available as a free download courtesy of the sponsor of our research, Magento, an eBay company. You can go to MarketingSherpa.com/Ecommerce. We have 95 charts from a nine-month study focused on 25 questions that are really core for ecommerce marketers, and we got 4,346 benchmark survey respondents to help us answer those questions. You can see those questions and answers, with all those charts, at MarketingSherpa.com/Ecommerce as a free download.
So let's jump a little more into Ryan's story of what he learned about testing. As you can see, overall his conversion rates were improving, his results were improving, and he did it by testing. Let's look at a series of tests. Here's an example of a lead form, and he had a hypothesis: "Hey, should I add security seals to this site?" That seems like an obvious way to reduce anxiety and improve conversion rates. But no, he didn't get any lift. Now, that's not to say he got a reduction; in some tests you will actually get fewer conversions from something you thought would improve results, but here he simply got no lift. He took the time to add security seals, and no lift.
So he took another look. What about a shorter form? Usually we think, hey, if we shorten the form we are going to get more conversions. Again, no lift. What about giving the option to call? VacationRoost is booking vacation homes, so maybe people want to pick up the phone and call; surely that would be a significant increase in conversions. No lift.
But here's the kicker, and I showed you that for two reasons. One, a lot of times on these MarketingSherpa case study webinars we show the great successes, and marketers will come up to us afterward and say, "Oh my gosh, my funnel is such a wreck. My ecommerce marketing is such a wreck. I'm doing horribly compared to all those impressive results." I want you to see, and be inspired by, the fact that everyone is struggling through this. Nothing is perfect. If you are testing, sometimes you are going to fail. Sometimes you are not going to get a lift. The important thing is to learn.
So this is what Ryan learned. He said, hey, maybe I was testing too small; maybe I need to combine all of these different elements. What you see up there is the MECLABS Conversion Heuristic, and those elements stand for motivation, value proposition, incentive, friction and anxiety. Things like anxiety and friction are going to cause people not to convert. Understanding their motivation, delivering a clear value proposition and giving them incentives: those are the elements that encourage them to convert.
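For reference, MECLABS publishes that heuristic as a weighted expression. The form below is quoted from their public materials rather than from this presentation, so treat it as a paraphrase; the coefficients express relative weight, not literal arithmetic:

```latex
% MECLABS Conversion Sequence heuristic (paraphrased from published materials)
C = 4m + 3v + 2(i - f) - 2a
% C = probability of conversion        m = motivation of the user
% v = clarity of the value proposition i = incentive to take action
% f = friction in the process          a = anxiety about the transaction
```

Motivation is weighted heaviest, which is why the "maybe" visitors discussed later in this session are the ones worth optimizing for.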
So what Ryan was trying to do was reduce that friction and anxiety and improve the positive elements that lead to conversion, so he put them all together in one test, as you can see right there. He had security seals, he had many other changes, and he was able to get a 19% lift. When we talk about testing, sometimes marketers are afraid of changing more than one element on a page or email, because then you won't know for sure which element caused the lift.
When I asked Ryan about this, he said, look, realistically, if you do a true A/B test where you change only one factor, you know for absolute sure which change led to the lift. But in the real world, sometimes that's not enough; it can be a waste of time if you're only going to get a very minor lift that isn't worth the investment. Sometimes you need to change many things, doing more of a radical redesign with an overall understanding of why you are making those changes, such as using a conversion heuristic, and that's where you are going to get your lifts.
Ryan was kind enough to share some of these results, and I want to show another lift he had. This was a PPC landing page where he made many different changes, trying to reduce friction and anxiety on the page. There's the treatment: you can see he added a small security seal to help a little and moved the call-to-action higher, and the result was a 427% increase in clickthroughs.
I wanted to do that for Ryan to show that, yes, he had many losses, but this is a very successful marketer. The big-picture takeaway (as Jessica wrote in the chat, testing your own pages is like golf) is that you are putting a lot of time into testing, and you might get a little lift or none at all. Make sure you are learning from those tests, and understand that even when you are not getting lifts, you are hopefully learning, and you might have to combine some of those changes to get the lift.
I also want to share another quick test. We showed you that sometimes when you test you are going to get a loss, not a lift, but sometimes making even minor changes is incredibly rewarding.
This was Cindy Lu from VMware, who presented a case study on her test. This is an example of an A/B/C test where the traffic was split in thirds. You'll see at the upper left the control, which was a banner on the page. She made some significant changes. First, she changed the image: the control image just has a bunch of houses on it, while this new image is much more human. It actually shows a human being, even though it is stock photography, using the actual tool.
You can also see that in the control the text was white reversed out of blue, which can be very difficult to read; she made the text black on white. She also added clear buttons to click through; in the control there are no real buttons and the call-to-action isn't as clear. So there are a few lessons you can learn there, but the big picture is that she got a 956% increase in conversion rate, which here means clickthrough on that banner ad.
So while I showed you some examples with Ryan where you can test a lot and be frustrated, you can also hit it out of the park and make some pretty impressive improvements with your testing.

Banko:
We have a question from John, a digital marketing manager, asking us to cover opt-in landing pages. John, yes. That's actually a perfect transition into the next presentation we are going to talk about today: offer page transformation, our Ancestry.com session by Emily and Julia. I was so excited to see this one come to life on the stage.
Before we hit New York City, I actually interviewed them for their case study, so it was awesome to see it presented live. It was a favorite among the audience as well because the deck was beautiful; as we move through, you will see these beautiful illustrations of their different personas, so to speak. That isn't much of a surprise, because Julia is the senior interactive art director. So let's get right into it.
What they tested was the offer page, the page where site visitors turn into members. They realized that while there were a lot of different ways to get to the offer page, they were treating all of their customers the same. So they said, you know what, this offer page is worth investing in. Let's focus on this page and improve the user experience as well as conversion.
They realized there were really two main groups hitting the offer page. There were self-selectors, who chose to go there, and there were interrupted browsers, who got there essentially via a wall: they were blocked from content, so they hit the page.
So they said, all right, let's do two pages. They tested and arrived at two distinct designs, one for self-selectors and one for interrupted browsers. The problem, though, looking at the two pages, was that they didn't have a consistent look and feel. So the team said, "All right, let's build a template and run a test there. Let's try a whole redesign on a template that looks more beautiful."
The objective was to replace the current high-performing free trial deny page with a template design they could target, customize and optimize. All right, let's take a look at the test: control versus treatment. The template, to me, looks beautiful. They changed a lot on the page, as you can see; however, that was the problem. There was a 7.5% decrease in conversion.
Here's the issue, though. In a typical A/B test, like Dan was saying earlier, there's usually one element you can point to and say, that's why it didn't work. The problem for Ancestry.com's team was, "Listen, we changed so many things on the page that we don't know what led to the failure." So they said, "Let's do it again, except let's isolate some of our testing elements." They decided to isolate three factors: button size, small versus large; price placement, near the call-to-action versus far from it; and imagery placement, on the side versus on the top.
So they ran a full factorial test across those three factors and saw that the winner was the standard button, price below the call-to-action and images to the left. However, the factor that impacted conversion the most was where the price was located, so they could really pinpoint what made the lift and what had caused the decrease before. They were able to move forward with this template and customize the experience even further. They customized it for interrupted browsers (you see one offer page here), did another template for interrupted past free trialers, and another for upgraders, and overall they saw a 10% lift.
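To make the mechanics of a full factorial test concrete, here is a minimal sketch of enumerating every combination of three on/off factors and computing each factor's main effect. The factor levels and all conversion numbers below are hypothetical illustrations, not Ancestry.com's data; the price-placement effect is simply made the largest to mirror the finding described above.

```python
from itertools import product

# Three binary factors from the test (level labels are illustrative)
factors = {
    "button_size": ["small", "large"],
    "price_placement": ["near_cta", "far_from_cta"],
    "image_placement": ["left", "top"],
}

# Hypothetical per-cell results: (conversions, visitors) for each combination.
# In a real full factorial test, each cell receives its own traffic split.
results = {
    ("small", "near_cta", "left"): (120, 1000),
    ("small", "near_cta", "top"): (112, 1000),
    ("small", "far_from_cta", "left"): (95, 1000),
    ("small", "far_from_cta", "top"): (90, 1000),
    ("large", "near_cta", "left"): (118, 1000),
    ("large", "near_cta", "top"): (110, 1000),
    ("large", "far_from_cta", "left"): (93, 1000),
    ("large", "far_from_cta", "top"): (89, 1000),
}

# Enumerate every combination (2 x 2 x 2 = 8 cells) and compute conversion rates.
cells = list(product(*factors.values()))
rates = {cell: conv / n for cell, (conv, n) in results.items()}
winner = max(rates, key=rates.get)

def main_effect(name):
    """Average conversion rate at one level minus the other level."""
    idx = list(factors).index(name)
    a, b = factors[name]
    rate_a = sum(r for c, r in rates.items() if c[idx] == a) / 4
    rate_b = sum(r for c, r in rates.items() if c[idx] == b) / 4
    return rate_a - rate_b

for name in factors:
    print(name, round(main_effect(name), 4))
print("winner:", winner)
```

The main-effect comparison is what lets a team say "price placement mattered most" even when every cell changes several things at once.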
So the big takeaway for you all here is that you need to isolate your testing elements. Obviously, it's awesome when a test excels and really works; management doesn't ask many questions when a test wins. But when it fails, they want to know why, and if you isolate your testing elements, you can tell them exactly what the reason is.

Burstein:
So what we can see from Ancestry's story and the VacationRoost story is that, in one sense, you have to test big and make a lot of changes to get some impact. But once you do, and especially if you lose, you've got to understand why those changes made an impact. In just a moment I'm going to show you a great case study where a stock photo had a significant impact in a test, so think about the stock photos you place on your pages.
But first I want to let you know that next week Allison and I are going to be at IRCE, the world's largest ecommerce event. We are going to be running the MarketingSherpa Media Center there, and we are going to be asking ecommerce marketers about the case studies they are presenting. If you are going to be at IRCE, please stop by the MarketingSherpa Media Center and say hi. If you are not able to make it, we are going to be sharing all of those videos at MarketingSherpa.com/IRCE, and while we are at it, if you have any questions you would like us to ask those leading ecommerce marketers, feel free to email editor@MarketingSherpa.com.
So let's talk about another leading ecommerce marketer, Jacob Baldwin from OneCallNow.com. Look at this page, and look at the cell phone. What do you think of it? Does it look modern? Here's the problem with stock images, and I'm sure we've all faced it: the technology in them is not always up to date, and if we pick one with an older computer or an older phone, that can have an impact.
Well, that's something Jacob wanted to test. You can see here the phone he had. Again, this was a multifactor test, not just an A/B split test. He changed the button wording from "Request a Quote" to "Request Pricing," shortening it slightly, and he also made a significant change to the image, using a much more modern-looking smartphone. Here you can see them side by side. Because this was a multifactor test, it could be that the button copy helped or that the stock image helped; what we do know is that he got a significant lift from it: a 95% increase in conversion initiation.
That's a pretty minor change: looking at the wording of your buttons and calls-to-action, but also at the images you use. In an ideal world, it's great not to use stock images at all. We've found in previous tests: use real people, use real shots of your products. People see some stock images so often they can even call them out: "Oh, I've seen that with some other product, so I know it's fake." But if you are going to use stock images that involve technology, make sure they are up to date, because people buying technology want the latest, greatest thing, not that old smartphone.
Peter Doucette is the Vice President of Consumer Sales and Marketing of The Boston Globe. He shared some of his tests with us and tweeted that The Boston Globe has a better batting average than Ted Williams. That's true, and it sounds great, but keep in mind that's the challenge with testing.
Ted Williams, as Allison noted, had a .355 lifetime batting average. As a baseball player, that means you are making multi-millions of dollars. As a marketer, it means you are wrong roughly two-thirds of the time. And that's the challenge with testing: you may be wrong two-thirds of the time and not know it. With testing, you actually find out, and you learn.
The great thing about Peter, and we are going to see a lot of this in his case study, is that he moved his company's optimization beyond simple landing page and email tests. He sought to optimize an entire company. Here are some examples: reporting, delivery, everything they were involved in, he tried to test and find ways to improve. What is really interesting is that their testing started in the marketing department, where they began by testing offers. But for any of you engaged in content marketing, this next slide is going to be very interesting.
They also started testing their editorial. Once they started testing their marketing, it bled over into the editorial department: hey, let's test and see what works best for our headlines, for example. We have a question here from Renee, an associate director, who wants to entice customers to open emails and other electronic communications.
So often we think about getting customers to open emails. But what about getting them to actually read your content? A major factor in getting people to read your content is the headline.
During this session we actually ran a live test. The Boston Globe is so big and gets so much traffic that they can statistically validate a test in five minutes (maybe not with real historical validity, but statistically validated), so during this 30-minute session they ran this test.
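To give a sense of what "statistically validate" means for a headline split like this, here is a minimal sketch of a two-proportion z-test using only the standard library. The click and view counts are hypothetical, not the Globe's numbers; a real program would also decide the significance threshold and sample size in advance.

```python
from math import sqrt, erf

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    # Pooled clickthrough rate under the null hypothesis of no difference
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical headline test: A = declarative headline, B = question headline
z, p = two_proportion_z(clicks_a=200, views_a=10000, clicks_b=260, views_b=10000)
print(round(z, 2), round(p, 4))  # a p-value below 0.05 is the usual bar
```

With enough traffic, even a small absolute difference in clickthrough rate clears that bar quickly, which is how a site of the Globe's size can call a headline test in minutes.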
You can see that the editorial we were given was a little rough; it's not a fun story, and that's the challenge with newspaper editorial sometimes. The two headlines were "Police Allege Sex Assault Not Reported" and "Did School Run by State Contractor Fail to Report Rape?" Again, rough stories, but this is the reality of news reporting.
So take a look. You don't have to vote now, but this is what we asked our audience to do: just write down for yourself which one you think won, and why. In a moment I'll show you the results. Overall, as we said, The Boston Globe, as you may know, was bought by new owner John Henry, who is a great example of making The Globe a laboratory. We share that so you'll think: OK, this is testing your marketing, but what else can you test? How can you make your company a laboratory to constantly improve, constantly learn about your customers, and constantly improve your operations, your products, everything involved in your company?
Of course, since we are marketers, we want to share one marketing test. This was digital access price testing: basically, testing what the proper price point is, and not only for initial conversion. As we see here, 99 cents had a higher conversion rate than $3.99, which shouldn't be overly surprising. But think about a digital product: you are looking at lifetime value, and there's a big margin for a company like The Boston Globe, or any company producing a digital product or selling information, because every new subscriber you add is essentially 100% margin.
But what they are really looking for when they add that person is not just the initial bump in revenue, but the lifetime value. They were able to test and see right away that they got a 36% increase in digital access subscriptions by going with the 99-cent price. But if you are testing price, here is the real number to look for: what was the lifetime value?
When they looked at it over 52 weeks of revenue (not a full lifetime), they found the 99-cent price produced significantly less revenue: 62% less than the $3.99 price. So when you are price testing, make sure you are not just looking at that initial price and how many more conversions you can get, but at your overall lifetime value. That may vary depending on what type of company you are and what products you sell. Sometimes, even if you are getting fewer initial customers, the higher price may actually pay off in the end.
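Here is a minimal sketch of the arithmetic behind that trade-off. The subscriber counts, intro period and retention figures are hypothetical, chosen only to illustrate how a cheaper intro price can win on sign-ups (here, 36% more) and still lose on 52-week revenue; they are not the Globe's actual numbers.

```python
def revenue_52_weeks(subscribers, intro_price, regular_price,
                     intro_weeks, avg_weeks_retained):
    """Total revenue over 52 weeks: intro-priced weeks first, then the
    regular price for as long as the average subscriber stays (capped at 52)."""
    weeks = min(avg_weeks_retained, 52)
    regular_weeks = max(weeks - intro_weeks, 0)
    return subscribers * (intro_price * min(intro_weeks, weeks)
                          + regular_price * regular_weeks)

# Hypothetical: the 99-cent offer converts 36% more subscribers,
# but those deal-seekers churn much faster.
rev_low = revenue_52_weeks(subscribers=1360, intro_price=0.99,
                           regular_price=3.99, intro_weeks=4,
                           avg_weeks_retained=12)
rev_high = revenue_52_weeks(subscribers=1000, intro_price=3.99,
                            regular_price=3.99, intro_weeks=4,
                            avg_weeks_retained=30)
print(rev_low < rev_high)  # more sign-ups, less 52-week revenue
```

The point of the sketch is the shape of the comparison, not the numbers: retention and weekly price compound over the year, so the side with more conversions is not automatically the side with more lifetime revenue.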
Another test: The Boston Globe at that time was owned by The New York Times, and they took an example of a digital accordion checkout process that worked for The New York Times. Their control had several steps in the checkout process before you actually buy; what they tested was the digital accordion.
Essentially, you click and each step appears on the same page, folding up into an accordion: the next step in the process, then the next. You can see, down in the middle there, how the steps above are checked off once you've completed them, and then the accordion opens to the next one.
This worked well for The New York Times, and since it worked well for the Times, they figured it would work well for them. Instead, it produced a 35% decrease in conversion. I'm not sure of the exact reason why, but there is something different about New York Times subscribers than about Boston Globe subscribers.
So we can test, we can learn about audiences, and we can try to carry those learnings over to different audiences, but you really have to understand your segments. Don't just look at the average, and just because something worked for one segment, one persona or one product, don't assume it will work for others. You really have to learn every time, and test every time, to know what's going to work the next time.
So with that, let’s get back to that headline test. As you can see our audience thought, 79% of them at least thought, the question was the more compelling headline and at the end of the day, after 30 minutes at least, B was leading the question. So I will go back there for a second. You can see those two headlines. When you are testing your own content, it doesn't mean a question is going to work every time, but you test the headline in your content or your subject's lines, you are going to get more readers for your content.
So the big takeaway is to look beyond your simple emails and beyond your landing pages. What else can you test in your organization? Can you test the content? Can you test your operations? In The Boston Globe's case, even their delivery routes.
So here's another quick test about pricing.

Banko: Dan just talked about The Boston Globe. We are going to protect the privacy of this next media giant, but you should know that they are competitors, so we are staying along the same vein. The background on this test, which was presented by Lauren Pitchford, a Senior Optimization Manager here at MECLABS: the home delivery subscription has multiple delivery options, you know, daily, weekly, weekday, Sunday only.
The goal was to determine whether reordering the options would have any effect on the subscription mix and revenue. So they performed an A/B/C test: the options listed by popularity, priced low to high, or priced high to low. Again, we made use of that live poll and really engaged the audience. We asked: which price order generated the most revenue? They said most popular products first.
I'm not going to lie, I voted for that one too. Well, guess what, we were wrong. Listing prices low to high actually produced a 20% increase in revenue per visit, and the big takeaway here is that by simply reordering the options displayed, they were able to significantly affect the product mix. Again, think about the thought process your customers go through. And Dan, do we have time to run through another one?

Burstein:
Just two minutes left. What's the biggest lesson we want to take away from Michael Aagaard? When I think of everything Michael Aagaard talked about, I think one of the biggest lessons is this simple chart he had right here. When you are trying to convert customers, you have to understand, as we've talked about, that not all customers are the same. Just because a test worked, as we showed in previous examples, doesn't mean it works with everyone.
You basically have three types of customers. I'm not going to rehash the whole scheme, but first there's the very highly motivated customer: almost anything you put up there, they are going to want to purchase. You don't want to spend too much time optimizing for them, because no matter what you do, they love your product. They are the true fans, the people waiting in line for "Star Wars."
On the other side, you have the people who, almost no matter what you do, are not going to buy from you. They might have clicked out of curiosity, or come across your landing page because a friend referred them, but however they got there, they are really not interested in your product and are not going to buy from you, so you shouldn't focus much effort on trying to optimize for them either.
Where you really want to optimize is the great group in the middle: the "maybe" people. That's where you can make some of the small changes we talked about, like changing that stock photo image, or, as Allison discussed, changing how you display the prices of different products, which can make a big difference. Keep that in mind.
Some people are always going to buy. Some people are never going to buy. But try to focus on what minor changes, or big changes, you can make to get those people in the middle to want to buy from you.
Thank you for joining us, and thank you for taking 30 minutes out of your day. We know that is a significant investment, so help us continue to optimize these webinars. When you close out of this webinar, a screen will pop up asking for your feedback in a survey. When you tell us what works and what doesn't, we listen and try to improve these webinars, so please fill out that survey. Thanks again for joining us.

Banko:
Thanks for tuning in.