By Anne Holland, Content Director
Abby Stephenson, who manages Delta.com’s Web usability team, gave a great speech at an optimization conference I attended in San Francisco last week. Her team’s tests over the past year have resulted in nearly an additional $20 million in estimated online-driven revenue.
Important note: Delta’s site didn’t suck before. In fact, ceaseless testing and optimization improved key page conversions by an average of only 4%-5%. But, as Abby notes, when you’re a multibillion-dollar site, 4%-5% is a LOT of revenue. (BTW: In my experience, if your site is an average-to-bad one, a thorough round of testing and optimization generally helps lift conversions 30%-40%.)
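The "small percentage, big dollars" point is easy to check with back-of-envelope arithmetic. The figures below are hypothetical (the article doesn't break out Delta.com's online revenue), just a sketch of why a 4%-5% lift on a large base matters:

```python
# Hypothetical base: assume $2 billion/year in online-driven revenue.
annual_online_revenue = 2_000_000_000

# A 4%-5% conversion lift on that base, assuming revenue scales with conversions.
for lift in (0.04, 0.05):
    extra = annual_online_revenue * lift
    print(f"{lift:.0%} lift -> ${extra:,.0f} in extra annual revenue")
```

Even at the low end of that (hypothetical) range, the lift is worth tens of millions, which is consistent with the ~$20 million figure Abby cited.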
Delta.com's top recent lessons learned:
o Cut instructions
Obviously, don't cut instructions that people really need to figure things out. But, if you have a page with really self-evident instructions on the order of “Lather, rinse, repeat,” test taking them off the page.
o Turn statement headlines into action headlines
Instead of telling people what page they are on (e.g., “Order Form”), try telling them what action they should take on the page (e.g., “Order Here”). I’ve seen lots of headline tests from other marketers that confirm this.
o Strip off “Welcome”
Even though “Welcome to our site” headers look so very, very 1998, many of us still use them. Why? Abby thinks it’s because you feel like you should put something there to be polite. However, when her team tested removing an innocuous Welcome line from the tops of landing page templates used by Delta.com affiliates, conversions increased.
o Test copy next to “Submit” buttons
Just as submit button tests can move the needle, testing copy in the vicinity of the submit button can also make a difference. For example, next to its “Continue” button Delta.com tested “Almost done” versus “Go to Last Step.” The latter handily won.
o Graphics test results mixed
Abby noted that only 25% of the tests Delta.com now runs produce significant results worthy of making site changes. (Again, this is mainly because the site is already very optimized. Most sites would see a higher rate.) She runs tests for two reasons -- either to vet a Best Practice her team has heard worked for other sites, or to settle something internal people are heavily debating.
(Nothing ends an energy- and time-draining debate more quickly and non-acrimoniously than “Let’s test it.”)
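Deciding which tests count as "significant" usually comes down to a statistical check on the two conversion rates. The article doesn't say which method Delta.com uses; below is one common approach, a two-proportion z-test, sketched with hypothetical traffic numbers:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a/n_a = conversions and visitors for the control,
    conv_b/n_b = conversions and visitors for the variant.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: control converts 500/10,000, variant 560/10,000.
z, p = two_proportion_z(500, 10_000, 560, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up numbers the lift looks promising but the p-value sits just above the conventional 0.05 cutoff, a good illustration of why so many tests end up in the "no change" bucket.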
As in other organizations, many of Delta’s debates are about graphics, colors and images online. And, as with other organizations, very seldom do graphics, images and color tests really move the conversion needle. (Copy, offer, submit button and number of form fields requested nearly always move the needle.)
However, Abby noted a few exceptions to the graphics-don't-matter-as-much rule:
o “Lightening” a page overall by removing extraneous colors and graphics (including shaded rows in charts) and making dark colors lighter can significantly improve results.
o Removing extraneous directional graphics, such as navigation bars normally on your site template, can improve conversions.
o Adding credit card logo images to a page can *depress* results, sometimes significantly, maybe because they’re seen as eye clutter.
Finally, Abby’s biggest advice for all site test teams is, “Only test what you can implement. Many great ideas are impossible to implement.” This means that you absolutely have to invite a Web techie to your testing team meetings, with the request to “Please shoot down all our great, but impossible to roll out, suggestions.”
By the way, if you would like a copy of Abby’s slides, the folks at Optimost who ran the conference said they might make them available on request. Contact info at http://www.optimost.com