April 25, 2006
How To

Top Three Usability Tests for Web Site Redesign Decisions

SUMMARY: In the insanely competitive world of financial services marketing, your Web design has to be really good (not to mention persuasive) to convert the traffic you worked so hard for.

Most marketers focus on Web analytics, A/B splits and multivariate tests these days. However, these metrics only tell you what visitors did when they came to your site -- not *why* they did it.

Here are three usability tests a real-life marketer used recently to understand the whys and improve online conversions.

Access Group, a non-profit lender serving college and grad school students, has revamped its website fairly substantially every year since 1997.


Though the team carefully reviewed analytics reports and survey data, they realized the data couldn't tell them why prospects behaved the way they did, or what products or site features should be changed overall.

"A customer can tell you, 'I want a widget in red,' but they're not telling you that the reason they want the widget in red is because they can't find it in their purse," explains Melissa Layfield, Product Development Manager. "So you're not finding the problem."

That's why Layfield and her team recently conducted three types of usability tests to determine which of three new Web designs would work best for prospects -- graduate students looking for financial aid.  The team also wanted to know whether their nonprofit status was a good thing in the minds of students and whether the status was obvious in the site design.

Technique #1. Misdirection

The team brought a group of 20 or so testing subjects -- mostly graduate students from a nearby university -- into a laboratory setting to test the three Website comps. The subjects had no idea what the test was for, who the client was or what the team was after.

Researchers conducting tests of this type generally pay each subject $100-$200.

The subjects were not all shown the same things. Some may have seen one of the comps plus a competitor site. Some may have seen all three of the comps, while others saw different pairings of comps and competitors.

They were asked open-ended questions, such as, "What do you look for when you're looking for a student loan?" Questions that avoided "Would you buy this?" or "What did you think of that?" helped the team uncover the true brand experience of the site, as well as which comp worked best.

The research team compares it to the Myers-Briggs personality test. They don't just give you one question; rather, they ask a series of questions that keep it from being clear what the point is.

Lessons learned: Access Group discovered that participants did believe being a non-profit was a positive trait for a student loan provider -- but that the current Access Group Website didn't make its non-profit status clear.


Technique #2. Understand your site's "experience"

During the lab sessions, the team also asked questions about the user experience of the site. The goal was not only to discover how well the tactical function of the student loan form worked, but also to help the design team understand the brand perspective inherent in the site's form design.

"We wanted to know what they found attractive and what their comfort level was," says Layfield.

The Access Group team also tested tasks that users would do at different phases of their experience.

"Someone shopping for a loan for the first time will use different tasks than someone trying to get a repeat loan. So we were able to ensure that we were meeting the needs of a customer throughout an entire lifecycle," she says.

Technique #3. Get man-on-the-street reaction to your site

The team also ran a non-lab test, stopping college students on campus. They had created some packets of information that showed screenshots of the website and screenshots of competitors to see whether the site was viewed as trusted and professional.

In general, participants are asked to answer four or five questions for this type of research and are paid $10.

"One reason that we had different varieties of testing subjects is because it helps validate that you are on the right path when you're getting consistent answers across the board with all of them," says Layfield.

Useful links related to this article:

Perceptive Sciences, the research and usability firm Access Group relied on for the tests

http://www.perceptivesciences.com/

Access Group's student info page

http://accessgroup.com/students/
