Court Ruling Impacts Liability For User-Generated Content

May 24, 2007 (12:05 PM EDT)

A recent court ruling holds Web sites responsible for some user-generated content and could change the way sites control such content.

The 9th U.S. Circuit Court of Appeals ruled last week that a Web site can be held liable for violating the federal Fair Housing Act by allowing discriminatory postings. In the Fair Housing Council case, the court ruled that the Communications Decency Act doesn't shield the site from claims brought by two California fair housing groups.

The court stated that, by categorizing, channeling, and limiting the distribution of users' profiles, the site was at least partly responsible for creating or developing information on the site. The court held that the site was responsible for some user-generated content because "unlawful information was provided by users in direct response to questions and prompts from the operator of the Web site."

The ruling clarifies the portion of the Communications Decency Act stating that providers or users of interactive computer services shall not be treated as the publisher or speaker of any information provided by someone else. Earlier rulings interpreted that language to mean that Web site operators are not liable for illegal content that others post on their forums.

Michael Bennett, a partner in the intellectual property department of Chicago law firm Wildman Harrold, said that previous cases confirmed and expanded immunity for ISPs, Web site operators, and listservs.

"Then this one comes down and begins to draw some lines," he said during an interview Thursday.

The new ruling means that sites that control user-generated content -- especially dating and automated brokering sites that use "matching" technology -- are considered publishers and are therefore liable, Bennett said.

"If you guide the content too much, or select which of the content will be allowed, you could lose immunity for that portion of the content and be held liable as a publisher," Bennett said. "The problem arises when nonpublishing Web sites want to put up blogs and forums. Typically, they want to guide the content to some extent, so that it reflects an appropriate image for the site and company."

Bennett said the ruling will affect many Web businesses.

"Keeping a close rein on the user content that is published is now a liability," he said. "As a result, some sites may forgo control over the content, while others may implement new policies to manage their sites' content."

Another part of last week's ruling indicates that areas open to any type of comment may be safer than others. In this case, some of the inflammatory speech appeared in a general comments section, Bennett said. Since that section wasn't controlled or channeled in any way, the site won't be held liable for those statements.

"When you categorize and prompt information, or give a recipe for particular types of information, that's when you're drifting into an area where you're becoming an information provider," he said.

"When you're channeling and limiting information, then you present the results back, that's going to be speech where the immunity may be forfeited. Anywhere you have enhanced functionality, it would be wise to be concerned," Bennett said. "Operators of sites that have any type of matching services, which to me seems to be what makes them more valuable, [have to have] heightened scrutiny and awareness of their liability."