June 24, 2004
Case Study

How Can You Turn Web Site Metrics Data Into Truly Useful Info? Avaya Tests Floating Feedback Link

SUMMARY: Do you ever wonder how to get anything useful out of those fat piles of Web site metrics reports? Are you frustrated by stacks of numbers that don't tell you how to make visitors love your site?

That's exactly the position Avaya's Web team was in. Find out how they tested a floating feedback tool to discover site tweaks that increase visitor satisfaction:

CHALLENGE
Avaya Inc (NYSE:AV) has more than a million customers, more than two million shareholders, 2,500+ authorized partners, and 15,000 employees in 50 countries.

While this is thrilling news when you're writing the annual report, it's hideous when it comes to site metrics.

Just think about it -- you've got loads of traffic coming from around the globe with massively different needs and expectations. You need a site that pleases everyone from small businesses to the Fortune 500, from the press to tech partners, from confused customers to eager prospects...

When Jorge Gutierrez joined Avaya two and a half years ago as Senior Manager for Web Intelligence and Optimization Management, the first battle was centralizing data collection. "We deployed tagging technology because it was a hassle chasing Web logs. We wanted to spend time analyzing data, not chasing it."
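
If you're unfamiliar with page tagging, here is a minimal TypeScript sketch of the idea: a snippet embedded in every page reports the view to a central collector, so analysts query one data store instead of chasing scattered server logs. The collector URL and parameter names below are hypothetical stand-ins, not Avaya's actual setup.

```typescript
// Hypothetical collection endpoint -- a stand-in, not Avaya's real collector.
const COLLECTOR = "https://metrics.example.com/collect";

function tagPageView(): void {
  // Gather the basics any browser can report about the current view.
  const params = new URLSearchParams({
    url: window.location.href,
    referrer: document.referrer,
    title: document.title,
    ts: Date.now().toString(),
  });

  // The classic tagging transport is a 1x1 image request: it crosses
  // domains freely and never blocks the page from rendering.
  const beacon = new Image(1, 1);
  beacon.src = `${COLLECTOR}?${params.toString()}`;
}

// Run once per page load.
tagPageView();
```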

Results were initially exciting. The team collected reams of reports on pageviews, visitor counts, view times, files downloaded, etc.

One problem. As Gutierrez says, "It was a lot of data, but not actionable..." Avaya execs wanted to know much more than how many visitors a page received. Marketers had questions like:

o How satisfied are visitors with my site section?

o What problems or frustrations do they encounter there?

o Is the content effective -- is it meeting visitors' expectations?

o How could I tweak the site to encourage visitor loyalty?

o Are there design changes we could make to improve marketing campaign results?

CAMPAIGN
Gutierrez began reviewing every analytics and site surveying technology out there. He preferred ASPs (rather than buying enterprise software and hosting it in-house) because ASPs tend to be easier and quicker to deploy, and may not require lots of help from already busy in-house IT departments. Plus, "I don't want to build a data warehouse."

He gave us these two tips on picking the right analytics ASP:

Tip #1. How easy is it for marketers to read the resulting data?

Gutierrez says, "IT is never going to be your primary user. Your primary user is going to be on the marketing side. You have to facilitate how they use the data. The visual display is critical."

Tip #2. Insist on a pilot trial before signing a contract

"Everyone talks ROI, you may as well hold them to it. Your vendors should all be willing to do a pilot. That's the way you can evangelize with your stakeholders. If we can prove the value, it's a lot easier to build a business case. Starting small is the key."

Gutierrez advises that you never ask a vendor to run a pilot unless you are willing and able to sign a contract at the end of it. However, "Not every pilot should be successful. We admit when something hasn't gone as expected. It comes down to credibility with stakeholders."

"For typical Web tracking tools you need at least 30 days; for trending, you need longer. Every application looks great in demos, but tell the vendor, 'I want to see the same with my own data.'"

After Gutierrez and the team made their data more useful by deploying behavioral tools such as path analysis, they still lacked a critical element. It's great to know what people do on your site -- but what marketers really need to know is why visitors do it.

Gutierrez's budget couldn't cover out-of-house usability labs, so he tested the next best thing -- a surveying firm that could run online research panels of selected Avaya visitors.

"We reached out to customers through an email posting to invite them to take the survey. It was up 10 days. We probably got the most data within the first week. The second week was more of a courtesy for someone catching up on their email."

(Note: we highly recommend this -- just because you've got the data you need doesn't mean you should annoy clients who'd like to share their input with you. Don't shut down branded surveys too quickly, or you risk giving late-coming users a negative experience.)

The data was useful, but you can only survey your customers so often, and a site is a living organism that needs constant watching and tweaking.

Gutierrez says, "We try to identify issues to fix, and then when we have enough of them, that's when you go back and re-measure success." On the other hand, he learned from the first surveys that, "people are willing to give a lot of feedback."

So, he decided to test adding another ASP program to the site to collect feedback on an ongoing basis.

A little feedback link appeared at the bottom right of visitors' screens as they viewed any page on the site. It floated, so if the visitor scrolled up or down, the little link followed them like a faithful puppy dog.

No matter where you were, or what you were looking at, when you felt moved to give some feedback, there it was. (Sample screenshot below.)
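
Mechanically, the effect is simple to reproduce. Here is a rough TypeScript sketch using CSS position: fixed, which pins an element to the viewport so it stays put while the page scrolls. This is an illustration of the behavior, not OpinionLab's actual code; the styling and click handler are our own assumptions, and browsers of the era often had to fake the same effect with scroll-event repositioning.

```typescript
// Create a feedback link pinned to the bottom-right corner of the viewport.
// Because it is positioned relative to the viewport, not the page, it
// "follows" the visitor as they scroll -- the faithful-puppy effect.
function addFloatingFeedbackLink(onClick: () => void): void {
  const link = document.createElement("a");
  link.href = "#";
  link.textContent = "Feedback"; // keep the word visible; per the vendor,
                                 // an icon alone draws fewer clicks

  Object.assign(link.style, {
    position: "fixed", // anchor to the viewport, not the document
    bottom: "10px",
    right: "10px",
    zIndex: "1000",
  });

  link.addEventListener("click", (event) => {
    event.preventDefault();
    onClick(); // e.g., open the comment-card popup
  });

  document.body.appendChild(link);
}
```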

When a visitor clicked on the link, up popped a little box. Gutierrez wanted to get maximum feedback, so he had the box optimized with the fewest questions possible (the more questions, the higher your abandonment rate). He would only add extra questions for site sections on which the team needed specific additional detail. (Samples of basic and longer boxes below.)

Some researchers think numeric rating scales may be inaccurate because everyone has a different idea of what a number implies. So, instead of asking visitors to rate the site on a numeric 1-5 scale, Avaya asked them to use a plus/minus scale:
++
+
+-
-
--

Users could also enter a written comment and an email address. However, the email was strictly optional, and clearly noted as such. (This is a best practice more companies should follow.) "Some customers and partners might want to get an answer to something -- that's why that email is there. It would go against our privacy policy to use it another way."
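
To make the structure concrete, here is a sketch of the comment card's data in TypeScript -- the five-grade plus/minus rating, a free-form comment, and the strictly optional email. The field names and submit endpoint are hypothetical; the article doesn't show the vendor's actual API.

```typescript
// The five plus/minus grades from the article, best to worst.
type Rating = "++" | "+" | "+-" | "-" | "--";

interface FeedbackSubmission {
  rating: Rating;
  comment: string;
  email?: string; // strictly optional -- collected only to reply, per policy
}

// Hypothetical endpoint; a stand-in for whatever the vendor's service exposes.
async function submitFeedback(entry: FeedbackSubmission): Promise<void> {
  await fetch("https://feedback.example.com/submit", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(entry),
  });
}

// Example: a top-grade rating with a short comment and no email.
void submitFeedback({ rating: "++", comment: "Easy to find the spec sheets." });
```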

RESULTS
"It's fascinating to understand the perceptions of the site through the eyes of the visitor," says Gutierrez. "They do perceive things differently than was our intent. As a result, we have adjusted our site."

The ongoing feedback link is particularly helpful. "You're not going to solve world hunger; but, you may solve hundreds of small issues that can make a difference."

Gutierrez couldn't reveal exact metrics data, so we asked the vendor providing the feedback link to share aggregated data across many clients (which, as an average, is probably more useful to you anyway):

- In general, 0.5%-3% of a site's visitors will click on the floating feedback link. This number drops if you make the link static so it doesn't float. It also drops if you take off the word "feedback" and just show an icon.

- Often 80-90% of feedback box users who leave a written comment will also take a few seconds to use the +- scoring system to rate the site's content and design.

- The optimum number of extra optional questions (beyond the scoring system and the main comment area) is three. Once you go beyond six questions, the "completion rates plummet."

The vendor says, "We hypothesize that the drop-off occurs because adding a 7th question extends the comment card beyond the page fold for certain screen resolutions." Or people could just feel that enough is enough at six.

- Usually about 70% of the data the feedback link collects are things that the Web design and analytics team already guessed at. The other 30% of feedback is often a complete surprise, "primarily because they never thought to test those pages of the Web site." So if you add feedback links, put them everywhere.

One last note -- although Gutierrez couldn't say which changes his team made to please visitors, we did see two factors you may want to try on your own site to improve results:

#1. You versus We

Take a look at Avaya's home page (screenshot below). You'll see the word "you" or "your" a total of 10 times. Guess how many times you see the words "we" or "our"? None. And the only "us" is in "Contact Us".

If forced into a "we" position, Avaya's online copywriters use the brand name instead. This copy style continues through the site.

#2. User-centric navigation

Left-vertical navigation on the site centers around the user. It's divided into two clear categories, "Go To:" which lists site section names such as "Resource Library", and "Communities:" which lists visitors by the names they would identify themselves with, such as "Investors" or "Business Partners".

We think both tactics are very clever indeed.

Useful links related to this article:

Screenshots of the feedback button at work, plus two useful slides from a presentation Gutierrez created on Avaya's analytics:
http://www.marketingsherpa.com/avaya/ad.html

OpinionLab - the ASP that created and powers the feedback button
http://www.opinionlab.com

Vividence - the company Avaya used for occasional usability surveys
http://www.vividence.com

Avaya
http://www.avaya.com
