Tuesday, March 11, 2008

Rating Content on Your Website

RFPs for web content management systems often mention content rating as a requirement. I want to explore the topic of content rating, examine its pros and cons, and look at options for implementing content rating on your website.

What is content rating? For purposes of this article, content rating refers to the ability of end users to rate content in terms of its usefulness, relevance or overall quality. (This is not to be confused with another understanding of content rating, namely the categorization of content based on its appropriateness for different age-specific audiences.)

Users who read a news story, for example, may be asked to rate the article on a scale of 1 to 5. The rating is typically associated with the user's account (or their browser cookie if they are not logged in). When the user returns to the site, they can see their own ratings and may also see average ratings for all visitors. Rating data may be used to inform popularity indexes (for example, a listing of a site's most popular news stories) or may drive content or product recommendations.
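The mechanics described above can be sketched as a minimal in-memory rating store. This is illustrative Python of my own design, not code from any particular CMS; the class and method names are hypothetical:

```python
from collections import defaultdict

class RatingStore:
    """Minimal in-memory store for 1-5 content ratings, keyed by user (or cookie) ID."""

    def __init__(self):
        # content_id -> {user_id: rating}
        self._ratings = defaultdict(dict)

    def rate(self, content_id, user_id, rating):
        if not 1 <= rating <= 5:
            raise ValueError("rating must be between 1 and 5")
        # A returning user overwrites their previous rating.
        self._ratings[content_id][user_id] = rating

    def user_rating(self, content_id, user_id):
        """The rating this user gave, shown back to them on return visits."""
        return self._ratings[content_id].get(user_id)

    def average(self, content_id):
        """Average rating across all visitors who rated this item."""
        ratings = self._ratings[content_id].values()
        return sum(ratings) / len(ratings) if ratings else None

    def most_popular(self, n=10):
        """Popularity index: content IDs sorted by average rating."""
        return sorted(self._ratings, key=self.average, reverse=True)[:n]
```

A real implementation would persist ratings and guard against duplicate votes, but the data model is essentially this simple.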

Some content management vendors offer content rating tools either as part of the core product or as add-ons. Vendors may suggest that content rating can help drive decisions about what kind of content to develop or help support content personalization. An underlying assumption in these kinds of claims is that rating information is both accurate and valid.

Much has been written that suggests these rating systems are, in fact, poor sources of information. First, there are questions about the validity of the chosen scale. For example, do all users interpret the scale similarly? Is one user's "Good" another user's "Average"? Even for a single user, scales need to be designed such that individual ratings are consistent. For example, would a user consistently be able to identify an average article as "Average" and a great article as "Great"?

Second, online rating tools tend to use convenience sampling, meaning that the people who actually rate content are those who tend to rate content (not necessarily a good cross-section of your site visitors). Basing content development decisions on a small, non-representative sample of users does not make good business sense.
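The sampling problem is easy to demonstrate numerically. In this hypothetical sketch, the site's true audience is mostly lukewarm about an article, but only enthusiasts bother to rate it, so the observed average overstates its appeal (the audience distribution and the "enthusiasts rate" assumption are mine, for illustration only):

```python
import random

random.seed(42)

# Hypothetical audience: true opinions on a 1-5 scale, mostly middling.
audience = [random.choice([1, 2, 3, 3, 3, 4, 5]) for _ in range(10_000)]
true_avg = sum(audience) / len(audience)

# Convenience sample: assume only enthusiastic visitors (4s and 5s) rate.
raters = [r for r in audience if r >= 4]
observed_avg = sum(raters) / len(raters)

print(f"true average:     {true_avg:.2f}")
print(f"observed average: {observed_avg:.2f}")  # biased upward
```

The observed average lands between 4 and 5 while the true audience average sits near 3, purely because of who chose to rate.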

Many services have emerged to provide alternatives to content rating. One of the leaders of the pack -- Loomia -- focuses on the business outcomes of content rating to determine how to best measure content popularity. Loomia provides a content rating widget that can easily be integrated into any site. Sitecore works with Loomia seamlessly. Loomia integration consists of two steps:

  1. Provide Loomia with an RSS feed of site content.

  2. Add the desired Loomia widgets to your site.

Both steps can be easily accomplished through Sitecore's RSS module and simple rendering logic. The final product lets users easily add ratings to any content item on your site.
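Step 1 amounts to exposing site content as RSS 2.0. In practice Sitecore's RSS module generates the feed for you; purely as an illustration of what that feed contains, here is a minimal hand-rolled version using only the Python standard library (function name and sample URLs are my own):

```python
from xml.etree import ElementTree as ET

def build_rss(channel_title, channel_link, items):
    """Build a minimal RSS 2.0 feed; `items` is a list of (title, link, description)."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = channel_title
    ET.SubElement(channel, "link").text = channel_link
    for title, link, description in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = link
        ET.SubElement(item, "description").text = description
    return ET.tostring(rss, encoding="unicode")

feed = build_rss(
    "Example Site",
    "http://www.example.com/",
    [("Story one", "http://www.example.com/story-1", "First story")],
)
```

Any feed with stable item links gives the rating service a way to identify each piece of content; step 2 is then just dropping the vendor's widget markup into your item renderings.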

But Loomia doesn't stop with content ratings. As I mentioned earlier, content rating is an unreliable measure of popularity. Instead, Loomia uses rich statistical algorithms that combine ratings, analytics data and other proprietary methods for discerning the value of site content. Loomia takes this data and generates personalized recommendations for users, making your site stickier (i.e., increasing page views, user sessions and conversion rates).
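I don't know Loomia's actual algorithms, but the general idea of blending several signals into one popularity measure can be illustrated with a simple weighted score. Everything here is hypothetical: the signal names, the normalization caps and the weights are mine, chosen only to show the shape of the approach:

```python
def popularity_score(avg_rating, rating_count, page_views, weights=(0.5, 0.2, 0.3)):
    """Blend rating and analytics signals into one score (weights are illustrative)."""
    w_rating, w_count, w_views = weights
    # Normalize each signal to roughly [0, 1] before combining,
    # so no single signal dominates just because of its scale.
    rating_signal = (avg_rating - 1) / 4          # 1-5 scale -> 0-1
    count_signal = min(rating_count / 100, 1.0)   # cap at 100 ratings
    views_signal = min(page_views / 10_000, 1.0)  # cap at 10,000 views
    return w_rating * rating_signal + w_count * count_signal + w_views * views_signal
```

The point of weighting in the rating count and traffic is exactly the critique above: a 5.0 average from three self-selected raters should not outrank a 4.2 average from thousands of engaged readers.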

Here we see two interesting aspects of the discussion brought into clear relief: 1) user-generated content evaluation should focus on business outcomes and 2) evaluation data should be drawn from a representative and valid sample of user behavior. I strongly recommend working with a recommendation service like Loomia when trying to make your content more compelling and discoverable.
