From the Summer 2002 issue of The News Media & The Law, page 29.
By Jennifer Lynn Williston
Since the advent of the Internet, federal lawmakers have tried to craft legislation that would sanitize cyberspace for minors and yet be narrowly tailored so as to avoid the violation of adults’ free speech rights.
As several recent court decisions show, such a law has not been drafted. Consider these cases:
• Ashcroft v. Free Speech Coalition, in which the U.S. Supreme Court ruled on April 16 that provisions in the Child Pornography Prevention Act barring visual material that “appears to be” of a minor or “conveys the impression” that a minor was involved in explicit sexual conduct violate the First Amendment (See NM&L, Spring 2002);
• Ashcroft v. ACLU, in which the U.S. Supreme Court said on May 13 the Child Online Protection Act’s use of contemporary community standards to define “harmful to minors” did not by itself make the law unconstitutional. But it remanded the law to an appellate court for further review, suggesting that the law as a whole may still be found unconstitutional (See NM&L, Winter 2002); and
• American Library Association v. United States, in which a special panel of the U.S. Court of Appeals in Philadelphia (3rd Cir.) on May 31 struck down the Children’s Internet Protection Act, which required libraries to use Internet filters. The court found that use of the filters would restrict access to protected speech.
The courts’ consistent pattern of striking down such laws stretches back to 1997, when the Supreme Court unanimously struck down the Communications Decency Act of 1996, Congress’ first attempt at regulating cyberporn. The Court, in Reno v. ACLU (See NM&L, Summer 1997), determined the law to be unconstitutionally overbroad and restrictive of speech protected under the First Amendment.
In 1998, Congress passed the Child Online Protection Act (“COPA”) to replace portions of the Communications Decency Act. The law established prison sentences and fines up to $100,000 as punishment for placing material “harmful to minors” on a Web site and making it readily available to minors.
In Ashcroft v. ACLU, provisions in COPA that used community standards to identify material that is harmful to minors were challenged as a First Amendment violation. The Supreme Court provided some guidance, ruling that the use of community standards to identify such material did not, by itself, violate the First Amendment.
Yet the Court remanded the case on May 13 to the Third U.S. Circuit Court of Appeals for an analysis of the law in its entirety. The appeals court had held COPA unconstitutional solely on the basis of its use of contemporary community standards, so the Supreme Court found it unnecessary to construe the rest of the law or address the trial court’s reasoning in declaring it unconstitutional. The Court suggested that COPA may still be unconstitutional on other grounds, such as restricting substantially more speech than the First Amendment allows.
“The scope of our decision today is quite limited,” Justice Clarence Thomas wrote. “We hold only that COPA’s reliance on community standards to identify ‘material that is harmful to minors’ does not by itself render the statute substantially overbroad.”
COPA has never been enforced by the government; a federal district court enjoined it in February 1999, and the Supreme Court left that injunction in place.
With the fate of COPA in limbo, lawmakers have passed additional legislation to curb specific areas of Internet pornography. Yet courts have found the same problems with the new legislation.
The Supreme Court struck down portions of the Child Pornography Prevention Act of 1996 in April as an unconstitutional restriction of speech. The act expanded existing child pornography law to cover not only images of actual children engaging in explicit sexual conduct but also “any visual depiction, including any photograph, film, video, picture, or computer-generated image or picture” that appears to show a minor engaging in such conduct.
By a vote of 7-2, the Court declared that provisions of the law that outlawed material that “appears to be” of a minor or “conveys the impression” that a child was engaging in sex were overbroad. The legislation could have criminalized adult expression with social, artistic and literary value, such as a painting depicting scenes from classical mythology, or scenes in the recent movies “Traffic,” “American Beauty” and “Lolita.”
Explaining the court’s position, Justice Anthony Kennedy wrote: “Our society, like other cultures, has empathy and enduring fascination with the lives and destinies of the young. Art and literature express the vital interest we all have in the formative years we ourselves once knew, when wounds can be so grievous, disappointment so profound, and mistaken choices so tragic, but when moral acts and self-fulfillment are still in reach.”
The vote was 6-3 against a ban on computer-generated images. Justice Sandra Day O’Connor joined Chief Justice William Rehnquist and Justice Antonin Scalia, who dissented in both votes.
“Congress has a compelling interest in ensuring the ability to enforce prohibitions of actual child pornography, and we should defer to its findings that rapidly advancing technology soon will make it all but impossible to do so,” Rehnquist wrote. The law “is targeted to this aim by extending the definition of child pornography to reach computer-generated images that are virtually indistinguishable from real children engaged in sexually explicit conduct.”
Congress has drafted new legislation in an attempt to comply with the Supreme Court’s ruling. The House overwhelmingly passed a bill that contained language banning any computer-generated image that is “virtually indistinguishable from that of a minor engaging in sexually explicit conduct.”
Online pornography continues to pose a dilemma for lawmakers: the material is readily accessible to children, yet it is difficult to restrict or regulate it without violating adults’ First Amendment rights.
The difficulties continued when a federal court panel struck down, as a First Amendment violation, legislation requiring libraries to use Internet filters to prevent their patrons from accessing objectionable material or risk losing federal funding.
President Clinton signed the Children’s Internet Protection Act into law in December 2000 in an effort to protect children from accessing pornography over the Internet.
The law required public libraries receiving federal funding to adopt and implement Internet safety policies that include operation of a “technology protection measure” that blocks or filters Internet access to Web sites that are obscene, contain child pornography, or are harmful to minors (See NM&L, Spring 2001). Libraries were required to comply with the law by July 1 or face the loss of federal funding.
Writing an opinion for a special three-judge panel on May 31, Chief Judge Edward R. Becker of the U.S. Court of Appeals in Philadelphia (3rd Cir.) indicated that the court sympathized with the government’s goal of preventing library patrons from accessing obscene material.
“Unfortunately this outcome, devoutly to be wished, is not available in this less than best of all possible worlds,” Becker wrote. “No category definition used by the blocking programs is identical to the legal definition of obscenity, child pornography, or material harmful to minors, and, at all events, filtering programs fail to block access to a substantial amount of content on the Internet” that falls into these categories.
Not only does the required filtering software “underblock” access to obscene material, but it also “overblocks” a large amount of information that is protected speech under the First Amendment, the panel said. For example, under the current filtering system, library patrons may be prevented from obtaining information relating to health and sexuality issues.
According to a study by the National Research Council, the panacea the government is searching for to protect children from unsuitable material on the Internet does not exist.
“Though some might wish otherwise, no single approach — technical, legal, economic or educational — will be sufficient,” the researchers wrote in the report. “Rather an effective framework for protecting our children from inappropriate materials and experiences on the Internet will require a balanced composite of all these elements.”
While the government argued that use of filtering software would be equivalent to a library’s practice of selecting and purchasing material for its collection, the court disagreed. It applied a strict scrutiny test and indicated that the “use of filtering software is permissible only if it is narrowly tailored to further a compelling government interest and no less restrictive alternative would serve that interest.”
The court said public libraries could offer filters as a choice for families to use for their own children at the public library; offer education and Internet training courses; enforce the library’s current Internet usage policies; and use privacy screens or recessed monitors.