November 08, 2007
Concerns about protecting brands keep many Web publishers from experimenting with Web 2.0 features on their sites. But that could be blocking a big opportunity.
Interaction is critical in attracting readers and converting them into subscribers these days. We talked to an expert on best practices to do this without diminishing the value of your core product. Plus, tips on writing community guidelines.
The New York Times’ recent decision to allow readers to post comments to online articles -- not just blog posts -- illustrates the challenge many traditional Web publishers face as they expand their Web 2.0 efforts.
Although the Times wanted to accommodate users’ desire for more interactive features, it was careful to roll out comments only on selected articles and to rely on a team of editors to monitor those comments, protecting the newspaper’s brand and its standards for online content.
Concerns about protecting brands and maintaining the value of professionally produced content are keeping many Web publishers from experimenting with their own user-generated content features. But those concerns could be blocking sites from a big opportunity, says online community expert Jake McKee, founder of Ant’s Eye View.
“Adding different levels of interaction is the new advertising. An ad is meant to capture interest and attention from users and get them to do something with it,” McKee says. “That’s what we’re looking at with the ability for users to add comments, discussion or their own content related to certain things.”
For marketers trying to attract readers, retain them and convert them into subscribers, interaction is a crucial element that encourages people to show up every day. The key is finding the right level of interactive features and managing them in a way that welcomes user contributions without diminishing the value of your core product.
We asked McKee to share his tips on developing guidelines and a moderation strategy for interactive features:
-> Tip #1. Choose where to allow user interaction and what types of contributions complement your content
On every Web site, certain sections or types of content will work better for user interaction than others. Although there are no hard-and-fast rules governing where to start, publishers and other Web site operators can look at their own traffic patterns and user behavior statistics to target the areas ripe for user contributions:
- Some sites may find that content already carrying a level of institutional opinion, such as editorials or reviews, draws less participation than other sections. Instead, readers may want to sound off on articles that simply relate the facts without any overt bias, creating a forum for users to discuss the implications on their own.
- Content that readers already share through other means, such as emailing links to friends or tagging articles on external bookmarking sites, can point to areas where users have strong opinions and may want to participate in other ways.
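One simple way to act on those sharing signals is to rank articles by how often readers already pass them along. The sketch below is illustrative only; the field names (`email_shares`, `bookmarks`) are assumptions, not a real analytics API.

```python
# Hypothetical sketch: rank articles by how often readers already
# share them, to surface candidate sections for user contributions.
# Field names ('email_shares', 'bookmarks') are assumed, not a real API.
def share_candidates(articles, top_n=5):
    """articles: list of dicts with 'title', 'email_shares', 'bookmarks'."""
    def share_score(article):
        # Treat each email share and each external bookmark as one signal
        return article["email_shares"] + article["bookmarks"]
    return sorted(articles, key=share_score, reverse=True)[:top_n]
```

The top-ranked articles are the ones whose readers are already participating elsewhere, and so are natural candidates for comments or ratings.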
After identifying places for user interaction, you can then choose the type of contributions that are the best fit:
- User comments are a good match for text articles, following the format popularized by blogs.
- Tools that allow users to rate or rank content can attract more interaction than longer-form comments. For example, Digg allows users to rank user comments by clicking simple thumbs-up or thumbs-down icons. “There are a lot more people giving thumbs up or down than posting comments,” McKee says.
- Don’t limit yourself to text: Multimedia content, such as user videos or photos, also makes sense in certain contexts, but requires more commitment on the part of users. For example, Amazon.com now allows users to post video reviews of products, in addition to written reviews.
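A thumbs-up/down feature like the one described above takes very little machinery. This is a minimal sketch of how such a tally could work, not Digg’s actual implementation; all class and field names are hypothetical.

```python
# Minimal sketch of a thumbs-up/down comment rating tally.
# Hypothetical design, not Digg's actual system.
from collections import defaultdict

class CommentRatings:
    def __init__(self):
        self.scores = defaultdict(int)  # comment_id -> net score (ups minus downs)
        self.voted = set()              # (user_id, comment_id) pairs: one vote each

    def vote(self, user_id, comment_id, up=True):
        key = (user_id, comment_id)
        if key in self.voted:
            return False  # ignore duplicate votes from the same user
        self.voted.add(key)
        self.scores[comment_id] += 1 if up else -1
        return True

    def ranked(self):
        # Highest-scoring comment IDs first
        return sorted(self.scores, key=self.scores.get, reverse=True)
```

Because a vote is a single click rather than a written comment, a design like this lowers the bar to participation, which is exactly why rating tools tend to draw more users than comment forms.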
-> Tip #2. Start with a small group of contributors
While targeting certain areas of Web sites for user contributions, you also should develop a profile of the audience for those sections. Then, reach out to a select group of those users to help test and populate the new features with valuable content.
Limiting the population that can use the features initially offers several advantages:
- It’s easier for your community monitor to manage the development and oversight of interactive elements, while collecting feedback on how to improve them. (See below for more on community monitoring.)
- Trusted users can help set the tone and establish a culture of contributions for future participants.
- Those initial users create a base of excited participants who can reach out to friends and help popularize those features.
Here are two ideas for picking a select group of users to help jumpstart interactive features:
- Reach out to existing subscribers or members for content contributions. For example, Zagat.com launched its user restaurant review feature by inviting existing subscribers to begin posting reviews, offering a gift certificate to the member who contributed the most write-ups.
- Create a ranking or trusted-member system that requires users to have achieved a certain status before they can contribute content. For example, you might require users to vote on or rank a certain number of user comments before they can contribute their own content. This technique helps ensure users are familiar with the community’s guidelines before they begin participating fully.
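The trusted-member gate described in the second idea above can be sketched in a few lines. The threshold value and names here are assumptions chosen for illustration, not a recommendation.

```python
# Hypothetical sketch of a trusted-member gate: a user must rate a
# minimum number of comments before posting their own content.
VOTES_REQUIRED = 10  # assumed threshold; tune for your community

class MemberStatus:
    def __init__(self):
        self.vote_counts = {}  # user_id -> number of ratings cast

    def record_vote(self, user_id):
        # Called each time a user rates someone else's comment
        self.vote_counts[user_id] = self.vote_counts.get(user_id, 0) + 1

    def can_post(self, user_id):
        # Only users past the threshold may contribute their own content
        return self.vote_counts.get(user_id, 0) >= VOTES_REQUIRED
```

The effect is that newcomers spend time reading and rating existing contributions, absorbing the community’s norms, before their own posts appear.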
-> Tip #3. Write community guidelines in terms users can understand
A clear list of community guidelines is essential for interactive features. Not only do they spell out what users can and can’t do online, they act like a mission statement for the community that outlines the culture you’re trying to create.
Unfortunately, most sites get their community guidelines wrong, McKee says. Typically, sites use their Web site’s terms of service as the community guidelines, offering a long document full of legalese as the way to define acceptable usage. “Nobody reads that crap because it’s so hard to read.”
While those documents have their place (often as a link somewhere on the site), the best community guidelines speak to potential users in simple language, making clear that an online community is an interaction between people and that contributions will be monitored.
McKee cites the online photo-sharing site Flickr as having one of the best sets of community guidelines.
The site breaks down do’s and don’ts in conversational terms, using category headings, such as:
- Do play nice
- Do upload photos that you’ve taken
- Do moderate your content
- Don’t upload anything that isn’t yours
- Don’t upload content that’s illegal or prohibited
- Don’t be creepy
“They’ve set the tone from the moment you’ve joined up,” McKee says.
-> Tip #4. Appoint a community manager
Another reason to limit the initial scope of user-generated content efforts is to ensure you have someone in place with time to monitor and manage the community. It’s essential, says McKee, for the Web site operator to be an active participant in any interactive feature, ideally with one or more people acting as the company’s highly visible representative.
Although the manager acts as a community moderator, moderation involves more than just watching for spam, offensive content, or other violations of community guidelines. In addition to those tasks, a community manager should:
- Participate in online discussions to help establish the tone and culture of the interactive features.
- Announce new services and features, and solicit feedback and suggestions for improvements.
- Offer specific explanations for removal of content or other disciplinary actions. The practice not only helps prevent the user in question from making the same mistake again, it also educates the entire community about keeping content submissions within guidelines.
-> Tip #5. Use technology to assist the moderator
Along with clear community guidelines and an active moderator, mechanisms that automatically detect problematic content, or that let users report it, help keep a user-generated content initiative under control.
While technology can’t entirely replace a live moderator, it can make the job easier by highlighting content that needs immediate attention. Such tools include:
- Filters that identify offensive language or profanity in user comments and hold those comments back from display on the Web site.
- Links or buttons embedded in each user post that allow other members to flag the content as spam, offensive material or other violations of community guidelines.
- An online form or dedicated email address where users can report abuse or problems with specific community members.
- Algorithms that help prioritize which flagged comments or abuse reports are most pressing. For example, usage patterns may show that truly offensive content is usually flagged by multiple members.
In that case, you could design a system that automatically pulls any piece of content that’s flagged by a certain number of users, rather than waiting for the moderator to receive the notice and manually review it.
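The filter-and-flag pipeline described above can be sketched as follows. The wordlist, the flag threshold, and all names are illustrative assumptions, not a real moderation product.

```python
# Sketch of the moderation aids described above: a wordlist filter
# that holds new comments for review, plus a flag threshold that
# pulls content automatically once enough distinct members report it.
# Wordlist and threshold values are illustrative assumptions.
BLOCKED_WORDS = {"badword1", "badword2"}  # placeholder profanity wordlist
FLAG_THRESHOLD = 3  # auto-pull after this many distinct member flags

class ModerationQueue:
    def __init__(self):
        self.flags = {}      # comment_id -> set of user_ids who flagged it
        self.held = set()    # comments held for manual moderator review
        self.pulled = set()  # comments pulled automatically

    def submit(self, comment_id, text):
        # Hold comments containing blocked words instead of publishing them
        if set(text.lower().split()) & BLOCKED_WORDS:
            self.held.add(comment_id)
            return "held"
        return "published"

    def flag(self, comment_id, user_id):
        # Each member counts once; duplicate flags from one user are ignored
        self.flags.setdefault(comment_id, set()).add(user_id)
        if len(self.flags[comment_id]) >= FLAG_THRESHOLD:
            self.pulled.add(comment_id)
            return "pulled"
        return "flagged"
```

Counting distinct flaggers, rather than raw flag events, reflects the usage pattern noted above: truly offensive content tends to be reported by multiple members, so a single user clicking repeatedly should not trigger removal.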
However, McKee cautions that filters, reporting tools, and other content-scrubbing technologies are no substitute for starting with a select group of users to establish a community culture and for outlining clear participation guidelines.
In fact, having too many ways to flag content or report abuse may actually harm efforts to attract users and encourage interaction. “You don’t want to give so many reporting mechanisms that it makes potential users think, ‘Wow, there must be a big problem here.’”
Useful links related to this article
Flickr’s community guidelines:
Digg’s comment rating tool:
Ant’s Eye View: