Internet Content Rating and Labelling Systems

Last Updated: 14 March 2001

"A child who has not been given the opportunity to learn how to manage the world of tomorrow, where the rules will presumably no longer be those of today, and the capacity of reaching information will be more important than generical factual knowledge, will have minor chances of success."

A Charter of Children's Rights in Telematic Networks, Sep 1996.

Rating and Labelling Systems

PICS ("Platform for Internet Content Selection")

The Australian Broadcasting Authority, in its July 1996 report, Investigation into the Content of On-line Services, strongly endorsed the Platform for Internet Content Selection (PICS) as a means of protecting children "without censorship". PICS was developed as a result of a merger between the Information Highway Parental Empowerment Group (IHPEG) and the World Wide Web Consortium (W3C).

PICS provides an infrastructure for content labelling and is not a rating system itself. Rating must be done using either embedded rating tags conforming to a particular rating system, or by reference to a third party rating server, which independently rates content. Examples of rating systems are those developed by the Recreational Software Advisory Council (RSAC) and Safe-Surf.
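
For example, a page rated under a PICS-compatible system carries its label in an HTML meta tag. The sketch below uses the RSACi vocabulary; the author address and the scores are illustrative only, since a real label is generated by completing the rating service's questionnaire:

    <meta http-equiv="PICS-Label" content='(PICS-1.1
        "http://www.rsac.org/ratingsv01.html" l gen true
        by "webmaster@example.org"
        r (n 0 s 0 v 0 l 0))'>

Alternatively, a filtering profile may name a label bureau (as in the PICSRules example below), in which case the user's software fetches labels from the bureau instead of relying on tags embedded by the author.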

Despite a great deal of early enthusiasm for PICS, a number of problems began surfacing in 1996. Paramount among these are:

  • Although PICS was first envisaged as a voluntary system, organisations such as the ABA in Australia and the Internet Watch Foundation (formerly Safety-Net) in the U.K. were considering mandatory rating in 1996/97, thereby enforcing a private censorship system under the threat of government intervention. While rating has not been mandated in those countries, the potential remains for governments to enforce the use of rating systems that were intended to be voluntary.
  • The notion that it is mainly sites unsuitable for children that need to rate their pages is flawed. In fact, the opposite applies: since no regulatory jurisdiction can impose a rating system on material originating in another country, browsers must be configured to reject unlabelled material, and it is therefore the "safe" sites that need to use ratings.
  • Rating systems are complex, potentially costly to use, often depend on value systems of other cultures, and can have inherent deficiencies in content category definitions.
  • Material will probably need to be rated under multiple rating systems in order to be widely accessible.

The European Commission has produced an extensive report on the problems of Internet content filtering. While it endorsed the PICS concept, it rejected the adoption of a US-centric ratings system, recommending instead the development of a European system.

EFA is opposed to the use of self-labelling systems as a means of regulating Internet content.

PICSRules

The PICSRules specification was endorsed by the World Wide Web Consortium (W3C) in December 1997. It defines a language for writing profiles: sets of filtering rules that allow or block access to URLs based on the PICS labels that describe those URLs. The language is intended as a transmission format; individual implementations must be able to read and write profiles in this format, but need not use it internally.

Here is an example of a PICSRules profile designed to allow access to suitably labelled pages and block everything else:

    (PicsRule-1.1
        (
        ServiceInfo (
            name "http://www.coolness.org/ratings/V1.html"
            shortname "Cool"
            bureauURL "http://labelbureau.coolness.org/Ratings"
            )
        Policy (RejectUnless "(Cool.Coolness)")
        Policy (AcceptIf "((Cool.Coolness > 3) and (Cool.Graphics < 3))")
        Policy (RejectIf "otherwise")
        )
    )
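
Policies are evaluated in order, and the first one whose condition is satisfied determines whether a URL is accepted or rejected. In the profile above, any page not labelled under the "Cool" service is rejected outright; labelled pages are then accepted or rejected according to their Coolness and Graphics scores.
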
EFA was instrumental in sponsoring a submission to W3C on PICSRules from Global Internet Liberty Campaign (GILC) members, which highlighted the potential for abuse of PICSRules, in particular the use of PICSRules profiles for government-imposed censorship. PICSRules facilitates the implementation of server/proxy-based filtering, thus providing a simpler means of enabling upstream censorship beyond the control of the end user.
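
The specification also allows profiles to allow or block URLs directly, without reference to any labels at all, and it is this facility that most readily lends itself to transparent upstream filtering. A minimal sketch (the site pattern is hypothetical):

    (PicsRule-1.1
        (
        Policy (RejectByURL ("http://*@www.example-banned.com:*/*"))
        Policy (AcceptIf "otherwise")
        )
    )

Because such a profile can be installed at a proxy or ISP as easily as on a home PC, a list of banned URLs can be imposed upstream without the knowledge or consent of the end user.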

The Director of W3C, Tim Berners-Lee, responded to the GILC criticism with some comments on the philosophy of PICSRules. However, these comments do not resolve the issues raised by EFA and other GILC members.

Rating Systems

RSACi and SafeSurf are the two best-known rating systems designed to work within the PICS framework.

The RSACi Rating System incorporates complex definitions under four categories of material: Sex, Nudity, Language and Violence.
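
Each of the four categories is scored on a scale from 0 to 4 and transmitted in the label under a single-letter code: v (Violence), s (Sex), n (Nudity) and l (Language). A page with mild violence and strong language, for instance, might carry a rating fragment such as the following (scores illustrative only):

    r (v 1 s 0 n 0 l 3)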

The system originates from RSAC's computer games rating system, an origin betrayed by the questionnaire that a webmaster must complete in order to obtain a rating. Many categories of material would prove difficult or impossible to rate under the RSACi system, e.g. news reports, automatically generated Web pages from search engines, artistic works, and health and medical information.

Furthermore, the RSACi content categories do not provide a ready means of rating a great deal of contentious material that many parents would have concerns about, e.g. gambling, religious cults, drugs, extremist political viewpoints, bomb recipes and other illegal or harmful content.

Owing to these perceived problems, EFA has condemned the RSACi rating system as totally unsuited to its stated objective.

In May 1999, RSAC was absorbed by the UK-based Internet Content Rating Association (ICRA). In December 2000, ICRA announced what it claims to be an international content rating system that is backward-compatible with RSACi.

The SafeSurf rating system, while less well known than RSACi, is based on a more comprehensive, and hence more complex, content classification regime.

The SafeSurf categories include:

  • Suitable Age Range
  • Profanity
  • Heterosexual themes
  • Homosexual themes
  • Nudity and consenting sexual acts
  • Violent themes
  • Sexual violence
  • Accusations or attacks against racial or religious groups
  • Themes advocating or glorifying illegal drug use
  • Other adult themes requiring parental caution
  • Gambling

This list illustrates the complexity of the task of rating Internet content. The problem becomes even greater when one considers that material will have to be rated under all of the main contending rating systems in order to be widely accessible.
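
In practice this means a single page carrying several labels, one per rating system. A PICS-1.1 label list can hold them all in one meta tag; in the sketch below the first service is RSACi, while the second service URL and its category names are hypothetical stand-ins for another system such as SafeSurf:

    <meta http-equiv="PICS-Label" content='(PICS-1.1
        "http://www.rsac.org/ratingsv01.html" l r (n 0 s 0 v 0 l 0)
        "http://ratings.example.org/v1.html" l r (agerange 1 profanity 0))'>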

Cyber Patrol's CyberNOT Block List forms the basis of its third-party rating server, a content blocking method also supported by PICS. Under this scheme, the client software queries a rating server to determine the content ratings of requested material. The CyberNOT service provides rating information in two categories, Sex and Other. The Sex category includes four sub-categories: Gross Depictions; Sexual Acts/Text; Partial Nudity; and Nudity. The Other category includes: Racist/Ethnic; Gambling; Satanic/Cult; Drugs & Drug Culture; Militant/Extremist; Violence/Profanity; Questionable/Illegal; and Alcohol, Beer & Wine.
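
A PICSRules profile for such a scheme looks much like the earlier example, except that labels come from the vendor's label bureau rather than from tags embedded in the page. In the sketch below the service URL, bureau address and category codes are hypothetical, not Cyber Patrol's actual ones:

    (PicsRule-1.1
        (
        ServiceInfo (
            name "http://ratings.example.com/cybernot/v1.html"
            shortname "CNot"
            bureauURL "http://bureau.example.com/Ratings"
            )
        Policy (RejectIf "(CNot.sexacts or CNot.gambling)")
        Policy (AcceptIf "otherwise")
        )
    )

Because the ratings come from the bureau, the scheme works even for pages whose authors have never rated them, which is the principal attraction of third-party rating.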

Ratings Services

A number of third-party services rate materials published by others; Web site authors cannot control what ratings these services will assign. Their labels are not normally embedded within the documents, but rather are distributed via label bureaus.

Such services were available in approximately 1997; however, some appear to have ceased operations. For possibly more recent information on rating services, see the PICS Third-Party Rating and Self-Rating Services Lists at the W3C's site.