
If the Supreme Court Holds That Public Libraries Cannot Require Software Filters, Are There Other Ways to Protect Children on the Web?

By JULIE HILDEN
julhil@aol.com
----
Tuesday, Feb. 18, 2003

On March 5, the Supreme Court will hear oral argument in a case involving the Children's Internet Protection Act (CIPA). Under CIPA, public libraries cannot receive certain types of important federal funding unless they comply with a condition: they must install, on their publicly accessible computers, filtering software that attempts to block users - whether adults or minors - from accessing obscenity or child pornography, both of which are illegal.

A group of libraries, library associations, library patrons, and Web publishers sued to invalidate CIPA, arguing that it violates the First Amendment. After an eight-day trial, a special three-judge court, composed of one federal appeals judge and two federal district judges, agreed.

The opinion raises numerous questions, some of them very technical. In this column, I will address only one: if the Supreme Court agrees with the three-judge court, and strikes down CIPA, will there be any way to protect children who use the Internet in public libraries, or to catch those persons who use public library terminals to access child pornography?

Filters That Targeted Two Different Categories of Material

To answer this question, it's necessary first to discuss the filters and what they do, and do not, screen out.

Based on the evidence presented at trial, the three-judge court found that the software filters were, at best, "blunt instruments." They "underblock[ed]," failing to reach all obscenity and child pornography. More seriously, they also "overblock[ed]" - preventing patrons from accessing large quantities of First Amendment-protected material that the libraries wanted patrons to be able to see.

Before discussing the problems with the filters, let's first consider how the material the filters were supposed to target - obscenity and child pornography - is defined.

"Obscenity" is material that meets a particular constitutional standard set forth in Miller v. California. For the standard to be satisfied, the judge or jury must find that three tests are satisfied: "the average person, applying contemporary community standards would find that the work, taken as a whole, appeals to the prurient interest"; "the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law"; and "the work, taken as a whole, lacks serious literary, artistic, political, or scientific value".

To make matters even more complicated, this is only the obscenity standard for adult viewers. When minors are the viewers, under Ginsberg v. New York, a less demanding obscenity standard may be used, under which material deemed "harmful to minors" can be prohibited.

Meanwhile, "child pornography" is generally defined as material depicting children engaged in "sexually explicit conduct." (State laws may vary somewhat, but this is the gist of the federal definition.) It need not meet any obscenity standard. It cannot legally be viewed by any person, adult or child. Its harm lies in the harm done to the children who are exploited to produce it.

Child pornography, the Supreme Court recently ruled, must involve actual children. The Court's 6-3 decision in Ashcroft v. Free Speech Coalition made clear that virtual child pornography - computer-generated images of children - does not count as child pornography. Rather, it is protected by the First Amendment.

As these complex tests indicate, the software filters were in for a losing battle when facing the three-judge court. A filter cannot apply the Miller test; only a judge or jury can do that.

Nor can a filter figure out the library patron's age, and then decide whether to use the Miller or Ginsberg test. CIPA suggests minors' viewing should be subject to Ginsberg's "harmful to minors" test, while adults' viewing should be subject to Miller's three-part test. Yet it does not explain how libraries with a limited number of terminals - or perhaps, in small communities, only one - are supposed to effectively segregate minors and adults. Nor does it explain how the tests can possibly be applied by the filters, which cannot exercise judgment as people can.

Censors are bad enough; computerized censors are inevitably worse. Can a filter apply "community standards," or assess whether a work is "patently offensive" or has "literary or artistic value"? How will a filter know if a particular site is harmful to minors?

And the problems with the filters don't stop there. For instance, a filter will very probably not be able to tell an illegal (indeed, criminal) sexual photograph of an actual child from an entirely legal (if repugnant) computer-generated sexual image of a child.

Indeed, such filters are not very good at dealing with images at all; they often depend on words used by the site, and when they do so, they cannot really figure out the context in which the words have been used.

As a result, the evidence before the three-judge court showed that much of the material the filters blocked was not only First Amendment-protected, but also obviously useful to library patrons, including teens. A site's mention of certain terms would trigger the filters, even if the site merely discussed sexual health, contraception, medical issues, or other perfectly legal and helpful topics.
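
To make this failure mode concrete, here is a minimal Python sketch of a context-blind keyword filter of the sort the court described. The blocklist and the sample pages are hypothetical illustrations of my own, not any vendor's actual product, but they show how bare word-matching produces overblocking and underblocking at once:

BLOCKLIST = {"sex", "breast", "nude"}  # hypothetical blocked terms

def is_blocked(page_text: str) -> bool:
    """Block a page if any listed term appears, regardless of context."""
    words = {w.strip(".,!?;:()").lower() for w in page_text.split()}
    return not BLOCKLIST.isdisjoint(words)

# Overblocking: a sexual-health page trips the same terms as a porn site.
health_page = "Information on safe sex and breast cancer screening."
print(is_blocked(health_page))   # True - protected, useful speech is blocked

# Underblocking: a pornographic page that avoids the listed words sails through.
evasive_page = "Hot pics inside - click here!"
print(is_blocked(evasive_page))  # False - the filter sees no listed term

Real filters used far larger lists and additional heuristics, but the trial evidence suggested the underlying problem was the same: the software sees words, stripped of context.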

Serving CIPA's First Goal: Fighting Child Pornography

All of these problems with the filters led the three-judge court to strike CIPA down. The Supreme Court may well do the same, although the outcome is somewhat hard to predict; the current Court is generally strongly pro-First Amendment, but often not in contexts involving children or sexuality. If the Court does strike CIPA down, are there other effective ways to protect children?

CIPA conflates two different goals, so it's best to begin by disentangling them.

The first is protecting the actual children who are exploited in child pornography from being further exploited by viewers.

Child pornography, obviously, is a terrible problem. Harsh criminal laws properly crack down on it, and sting operations try to break up child porn rings. But in the end, some child pornographers still purvey their wares. Moreover, viewers of such material who fear detection may prefer the anonymity of a public library terminal to a private computer with an IP address traceable to them.

Still, because of their limited ability to block targeted sites effectively, filters may not be the best way - or even a particularly good way - to prevent child pornography from being accessed in public libraries. There is another option, which the three-judge court pointed out. The library, or the authorities, could "trac[e] a given URL" - that is, a website's address - "to a particular patron only after determining that the URL corresponds to a website whose content is illegal." (Emphasis added.) The idea is that library patrons can remain anonymous until they try to access child porn; when they do, they are nabbed.
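
Here is a rough Python sketch of how such after-the-fact tracing might work. The court described only the general approach; the record format, the known-illegal URL list, and the patron identifiers below are all assumptions of my own for illustration:

from dataclasses import dataclass

@dataclass
class AccessRecord:
    patron_id: str  # an opaque identifier, not a name
    url: str

# Maintained by the authorities: URLs already adjudged illegal.
KNOWN_ILLEGAL_URLS = {"http://illegal-site.example/abuse"}

access_log: list[AccessRecord] = []  # every patron's URLs are retained

def log_access(patron_id: str, url: str) -> None:
    access_log.append(AccessRecord(patron_id, url))

def trace_illegal_accesses() -> list[AccessRecord]:
    """Return records only where the URL is on the known-illegal list;
    all other patrons' browsing stays anonymous (though the records persist)."""
    return [r for r in access_log if r.url in KNOWN_ILLEGAL_URLS]

log_access("patron-0173", "http://example.org/sexual-health")   # stays anonymous
log_access("patron-0522", "http://illegal-site.example/abuse")  # flagged
for record in trace_illegal_accesses():
    print(record.patron_id, "->", record.url)  # only this patron is unmasked

Note that, by design, the log retains every patron's URLs - which is exactly the difficulty discussed next.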

There are certainly problems with this solution: For one thing, everyone would have to have their URLs tracked and their names taken, and those records would remain. We can only trust that the government will not choose to access them, and in the age of Total Information Awareness, such trust is awfully difficult to extend.

However, some may argue that child pornography is such a pernicious problem, it's worth extending that trust, and keeping the records.

Serving CIPA's Second Goal: Ensuring Children Do Not View Obscenity

The second goal is protecting those children who may view, on public library Internet terminals, material that is "obscene as to minors" under the Ginsberg standard - or, even worse, material that is "obscene as to adults" under the Miller standard.

This problem is, in my opinion, much milder than the noxious problem of children being used to make pornography. For one thing, what we're basically dealing with is the age-old tradition of teenagers trying to look at sexy pictures. For another, if teenagers aren't able to view obscene materials in the public library, they'll doubtless find them on home computers or in pornographic magazines.

The problem is that minors are actively looking for this material, not that they are unwittingly running across it. After all, such material is quite easy to spot and avoid. As the three-judge court noted, "[P]ublic library patrons of all ages, many from ages 11 to 15, have regularly sought to access [Internet pornography] in public library settings." (Emphasis added.)

What the teens are seeking, they are likely to find: as the court also noted, "[a]s of 2002, there were more than 100,000 pornographic Web sites that can be accessed for free and without providing any registration information." And due to the filters' chronic underblocking, some are going to remain accessible no matter what filters you use.

In sum, telling teenagers they cannot see porn is a losing battle, and a battle that has serious costs. For one thing, as noted above, filters screen out lots of valuable material - concerning sexuality, sexual health, sexually transmitted diseases, and so on - to which teenagers need access. It would be great if public libraries could provide a safe forum for teens to get this information. Instead, valuable sites on teen sexuality are blocked so that sex sites can be blocked, too - a case of throwing out the baby with the bathwater.

For another thing, screening for teens hurts adults, who should be able to view all the material the First Amendment protects. In the 1997 case of Reno v. ACLU, the Supreme Court - in striking down two key provisions of the Communications Decency Act (CDA) - strongly criticized measures that water down the speech adults can hear to a level fit only for children. Under the CDA, posting, or even hosting, an "indecent" or "offensive" message in a chatroom where children might see it was a crime. And that meant adults couldn't see such messages either, even though they had a First Amendment right to do so.

Any software filter aimed at obscenity is sure to have the same effect - watering down what adults see to accommodate children. If filters attempt to use the lower Ginsberg obscenity test, they use a test that applies only to minors, not adults. On the other hand, if they attempt to use the more demanding Miller test, teens will be able to see pornography to their hearts' content, defeating much of the filters' purpose.

And in any event, as I suggested above, a filter can't properly apply a legal test. As a result, some arbitrary line is going to be drawn, and it's not going to track First Amendment law. Because it will fail to do so, it will inevitably either violate adults' First Amendment rights, or fail to fully deny minors access to pornography.

Is there an alternative to filters? The three-judge court suggested that teens might be forced to use terminals only in the children's room, where librarians can look over their shoulders to view their screens. If a teen views something the librarian doesn't like, the librarian can then use the "tap on the shoulder" method and either warn the teen or revoke his or her library privileges.

That might be okay for ten-year-olds. But it's condescending, unfair, and intrusive when it comes to sixteen-year-olds. It also invites abuse by vesting a great deal of discretion in the librarian.

To take a pretty realistic example, suppose a sixteen-year-old views the Playboy.com site - which, in my judgment at least, is neither obscene as to adults under Miller, nor obscene as to minors under Ginsberg. If the "tap on the shoulder" method is being used, a judgmental librarian, watching like a hawk, may notice this and say the teen can never use library computers again. Under the court's solution, the teen would have no remedy, even though the punishment was for viewing entirely legal material. And that's not right. For lower-income teens, especially, a public library card can be one of their few passports to books, so a ban from the library is a very serious matter.

In a free country, librarians shouldn't be watching teens like hawks to subjectively censor their Internet use. Adults shouldn't be treated like teens, and teens shouldn't be treated like young children. If we can't come up with better alternatives than this kind of constant surveillance, or the kind of inaccurate filters CIPA mandates, then the best alternative, when it comes to teenagers' Internet use in public libraries, is probably no regulation at all.


Julie Hilden, a FindLaw columnist, practiced First Amendment law at the D.C. law firm of Williams & Connolly from 1996-99. Currently a freelance writer, Hilden published a memoir, The Bad Daughter, in 1998. Her forthcoming novel Three will be published by Plume Books in August 2003, by Bantam in the U.K., and in French translation by Actes Sud.
