DO WE REALLY WANT A SECRET CENSORSHIP SYSTEM?

By CHRIS HANSEN

By the time this editorial appears, Congress is likely to have passed legislation that will require libraries and schools to install blocking software on every computer to prevent access to certain Internet sites. This legislation raises serious constitutional issues. It also raises the question of whether government-mandated blocking software is good policy. In my view, it is not.

Blocking Software As A "Less Restrictive Alternative" To Criminal Law

Since the mid-1990s, Congress and state legislatures have passed a variety of laws designed to criminalize speech on the Internet if that speech is about sex and is accessible to minors. Targeting what is variously called "indecent" speech, speech that is "harmful to minors," or speech that is "obscene as to minors," these laws make it a crime to transmit such speech over the Internet if a minor will be able to read it.

The American Civil Liberties Union (ACLU), for which I am an attorney, and other organizations have successfully challenged those laws. The most notable decision is Reno v. ACLU, in which the Supreme Court struck down the Communications Decency Act as violating the rights of adults: specifically, the principle that, consistent with the First Amendment, the speech available to adults cannot be reduced to a level suitable only for children.

However, in all of those cases, plaintiffs such as the ACLU have presented testimony on the voluntary use of blocking software as a "less restrictive alternative" to criminal laws against speech. (Under the Constitution, a law that directly burdens a constitutional right, and is therefore subject to strict scrutiny, cannot survive unless it is the least restrictive alternative the state can employ.)

Proponents of restrictions on Internet speech have seized on that alternative to seek to require the use of blocking software, at least on public computers such as those in libraries and schools, in the name of protecting children.

Blocking Software and Its Secret Lists

Blocking software companies prepare lists of Internet sites that they assert meet pre-set criteria. Most of the companies now offer a variety of lists: some cover sites about sex; others cover violence, hate speech, gambling, or other subjects.

People who install the software can configure it to block any or all of the lists. For example, if you have installed the software and chosen to block gambling, you cannot access the web sites that the company has declared to be gambling sites.
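
To make the mechanics concrete, here is a minimal sketch, in Python, of the kind of category-list blocking described above. It is purely illustrative and is not any vendor's actual product: the category names, host lists, and URLs are invented, and real lists are far larger (and, as discussed below, secret).

    # Hypothetical sketch of category-list blocking. Lists and URLs are
    # invented for illustration; real vendors keep their lists secret.
    from urllib.parse import urlparse

    # Vendor-supplied block lists, keyed by category.
    BLOCK_LISTS = {
        "sex": {"example-adult-site.com"},
        "gambling": {"example-casino.com"},
        "hate": {"example-hate-site.org"},
    }

    def is_blocked(url: str, enabled_categories: set[str]) -> bool:
        """Return True if the URL's host appears on any enabled category list."""
        host = urlparse(url).hostname or ""
        return any(host in BLOCK_LISTS.get(cat, set()) for cat in enabled_categories)

    # A library that enables only the "gambling" list:
    print(is_blocked("http://example-casino.com/poker", {"gambling"}))   # True
    print(is_blocked("http://example-adult-site.com/", {"gambling"}))    # False

The point of the sketch is simply that everything turns on what is in the lists, and the person who installs the software never sees them.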

The companies (with one exception) will not disclose their lists. If you are trying to decide whether to install the software, you therefore cannot determine the accuracy or value of the list the software employs to screen out "inappropriate" sites. And once the software is installed, you cannot determine what has been censored. If the software is installed in a library or school, neither the librarians, nor the teachers, nor the administrators, nor the parents will know what is being censored, and they will have no way of second-guessing the companies' decisions about which sites are inappropriate, and therefore listed and blocked.

How Blocking Software Screens Out Certain Sites

The companies have secret procedures for identifying the sites that will be listed and those that will not. Apparently, they use classic search engine systems, such as those used by AltaVista or Yahoo, to identify possible suspects: perhaps by looking for sites that contain key words such as "XXX" or "hot girls" or "blackjack," and perhaps by finding one site they consider objectionable and looking for related or linked sites. Then, they assert, a human looks at each site and decides whether its content fits criteria such as "sexually explicit" or "promoting gambling."
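
A rough sketch of the kind of keyword screen described above, again purely illustrative: the keywords and sample page titles are assumptions, and real products are more elaborate, but the basic weakness is the same.

    # Hypothetical keyword screen: crude string matching that flags
    # candidate pages for later human review. Keywords are illustrative only.
    SUSPECT_KEYWORDS = ("xxx", "sex", "blackjack")

    def flag_for_review(page_text: str) -> bool:
        """Flag a page as a candidate if it contains any suspect keyword."""
        text = page_text.lower()
        return any(keyword in text for keyword in SUSPECT_KEYWORDS)

    # Overinclusive: a legal-scholarship page trips the "sex" keyword.
    print(flag_for_review("National Journal of Sexual Orientation Law"))          # True
    # Underinclusive: an explicit page that avoids the keywords slips through.
    print(flag_for_review("Adult photo gallery, members only, explicit content")) # False

Both failure modes in this sketch correspond to the overinclusiveness and underinclusiveness discussed next.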

Even this simple description should make it obvious that such products cannot succeed. They must inevitably be both overinclusive (in that they block sites that do not meet the criteria) and underinclusive (in that they fail to block sites that do meet the criteria). Overinclusiveness and underinclusiveness are troubling as a matter of policy. They are also troubling as a legal matter: if established, they can show that a particular legal measure is not "narrowly tailored" to serve its goal, and thus is unconstitutional.

Part of the reason for this overinclusiveness and underinclusiveness is that the criteria are so subjective. Both the computer and the human reviewer are likely to be extremely inexact in reliably and consistently identifying sites that meet the criteria. Most of the explanation for the failures of these products, however, lies in the size and changing nature of the Internet. There are over 1 billion web pages, and something like 25% change every day. Even the most powerful search engines, such as Google, index less than one-third of the sites on the Internet. Imagine, then, how impossible it would be for blocking software to find all the sites on the Internet every day and carefully evaluate the content of each. It simply cannot be done with current technology.
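
The scale problem can be put in concrete terms with a back-of-the-envelope calculation based on the figures above; the ten-seconds-per-review figure is an assumption added purely for illustration.

    # Back-of-the-envelope arithmetic using the figures cited above.
    total_pages = 1_000_000_000   # "over 1 billion web pages"
    daily_churn = 0.25            # "something like 25% change every day"
    seconds_per_review = 10       # assumed; a careful human review would take longer

    pages_changing_per_day = total_pages * daily_churn
    review_seconds_per_day = pages_changing_per_day * seconds_per_review
    # One reviewer working 8-hour days, 250 workdays a year:
    reviewer_years_needed_daily = review_seconds_per_day / (8 * 3600 * 250)

    print(f"{pages_changing_per_day:,.0f} pages change each day")
    print(f"about {reviewer_years_needed_daily:,.0f} reviewer-years of work arrive each day")

Even under these generous assumptions, hundreds of reviewer-years of work arrive every single day.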

As a result, errors are rampant. One blocking software product, Cyber Patrol, blocks or has blocked (as sexually explicit) the web pages of the Ontario Center for Religious Tolerance, the HIV/AIDS Information Center of the Journal of the American Medical Association (JAMA), the University of Newcastle’s computer science division, Nizkor (a Holocaust remembrance site), Nike shoes, the National Academy of Clinical Biochemistry, the U.S. Army Corps of Engineers, the MIT Student Association for Free Expression, and Planned Parenthood.

Other blocking software programs have blocked as sexually explicit a map of Disney World; the Quakers; the National Journal of Sexual Orientation Law; the AIDS Quilt Info & Links; the Heritage Foundation; Fairness and Accuracy in Reporting; Community United Against Violence; the Glide Memorial Methodist Church; the Center for Reproductive Law and Policy; the entire web site of the San Francisco Chronicle; a bibliography of psychiatry, madness and insanity; the Wesleyan University Philippines Mass Communication Society; and several personal home pages, including a personal page containing photos of National Parks.

Do we really want to require systems on our library and school computers that block sites secretly, and block them with such a high rate of error? There are plenty of effective, well-recognized alternatives to blocking software that can be used to prevent unwanted access to objectionable sites. Most libraries now offer guided research, similar to the old card catalogs, that steers users to valuable sites. Privacy screens can prevent passers-by from seeing sites they don’t want to see. Adults and children can be taught search techniques that maximize effective research and minimize accidental access to objectionable sites. For the first time in our nation’s history, the federal government is intruding into local control of libraries and schools to require a secret censorship system. We should all object.

Mr. Hansen is a senior staff counsel at the ACLU. He has been involved in most of the major Internet censorship cases of the last five years. He was lead counsel for the ACLU in Reno v. ACLU, and was also counsel in Mainstream Loudoun, the case that found the mandatory use of blocking software in libraries unconstitutional.
