By Jarrod F. Reich
Contributing writer

The U.S. Supreme Court has yet to address the issue of hate speech on the Internet, and few federal or state courts have ruled on it. But it is important to remember that the courts’ rulings in other hate-speech cases remain controlling for future hate-speech cases regardless of the medium in which the speech occurs, including the Internet.

To borrow from the late Justice Potter Stewart’s remark about obscenity, we may not be able to describe or classify hate speech accurately, but “we know it when we see it.” Mainstream America collectively shudders when it hears racist, anti-Semitic, homophobic, or other derogatory comments aimed at racial or religious minorities or other groups, but the question is: Can we stop it without stepping on people’s First Amendment rights? Even if so, how could it be stopped on the Internet?

Some have argued that racial and ethnic epithets are types of speech that, like “fighting words” (as articulated in Chaplinsky v. New Hampshire in 1942), seem to have “no redeeming value,” can incite violent retaliation, and thus should not enjoy First Amendment protection and can be regulated with no risk of infringing First Amendment rights.

But who decides what is offensive and, moreover, what is offensive enough to be called “hate speech”? Could legislation be drafted that would adequately bar hate speech without being either underinclusive (still allowing some hateful speech) or overbroad (banning protected speech)?

The Supreme Court has not been receptive to hate-speech regulation. It has said that such regulation must be “strictly scrutinized” to ensure that it does not prohibit protected speech.

The legacy of the Chaplinsky “fighting words” doctrine as it might be applied to hate speech has evolved into a “speech” vs. “action” dichotomy, as discussed in the following four cases.

Brandenburg v. Ohio: In this 1969 case, the Court announced its modern incitement test, under which speech does not create the classic “clear and present danger” to citizens unless it is “directed to inciting or producing imminent lawless action and is likely to incite or produce such action.” The case centered on a filmed and broadcast news piece on an Ohio Ku Klux Klan rally, in which the viewer could hear racist and anti-Semitic epithets (such as “Freedom for the whites” and “Send the Jews back to Israel”) uttered in the background of the newscast. Although not a “hate speech” case per se (it dealt with an alleged violation of a state criminal-syndicalism statute), Brandenburg’s per curiam opinion (an unsigned opinion issued in the name of the Court as a whole) made clear that non-obscene speech may be proscribed consistent with the First Amendment only if it is likely to lead to “imminent lawless action.” The Court found no such imminence in Brandenburg: because the epithets reached their audience through a television broadcast, they were heard well after they were spoken.

National Socialist Party v. Skokie: This famous 1977 case centered on the efforts of residents of the predominantly Jewish village of Skokie, Ill., to prevent the National Socialist Party of America, a neo-Nazi group, from holding a planned demonstration there. The Supreme Court denied the residents’ attempts to block the march, because to do so, it said, “albeit reluctantly,” would suppress the Nazis’ First Amendment rights. Said the Court: “[A]nticipation of a hostile audience [cannot] justify the prior restraint ... . [I]t is [the] burden of [Skokie residents] to avoid the [offensive march] if they can do so without unreasonable inconvenience.” The Court held, then, that the speech itself, although hateful, could be avoided.

R.A.V. v. City of St. Paul: In this 1992 case, the city of St. Paul, Minn., had enacted an ordinance that banned placing on public or private property “a symbol, object, appellation, characterization or graffiti, including, but not limited to, a burning cross or Nazi swastika, which one knows or has reasonable grounds to know arouses anger, alarm, or resentment in others on the basis of race, color, creed, religion, or gender.” The petitioner was charged with violating the ordinance after burning a cross on the front lawn of an African-American family’s house. The Court held the ordinance invalid because it was both overbroad and underinclusive, and because it went so far as to constitute viewpoint discrimination. The Court reasoned that “the First Amendment does not permit [a government] to impose special prohibitions on those speakers who express views on disfavored subjects.” The ordinance was overbroad in that it would prohibit such speech by “proponents of all views,” whatever its context. It was underinclusive in that it did not proscribe all fighting words: homophobic epithets and “aspersions about one’s mother,” for example, would remain permissible under it. The Court thus suggested that any hate-speech statute would be presumed unconstitutional and strictly scrutinized on the grounds that it is likely to be underinclusive, overbroad, or viewpoint-discriminatory.

Wisconsin v. Mitchell: The Supreme Court solidified the speech/action distinction in this 1993 case. The case concerned black youths who had been convicted under a hate-crime statute after severely beating a white boy; they were incensed by racist depictions in the movie “Mississippi Burning,” which they had just watched. The Wisconsin Supreme Court overturned the convictions on the basis of R.A.V., reasoning that the statute punished “offensive [yet protected] thought.” The U.S. Supreme Court reversed the state decision, saying that there was a difference between speech and conduct. “Whereas the ordinance struck down in R.A.V. was explicitly directed at expression (i.e., “speech” or “messages”), the statute in this case is aimed at conduct unprotected by the First Amendment,” the high court said.

The Mitchell court thus seemed to suggest that “hate speech” remains a conundrum: The only way it can be prohibited is if the statute that does so is “content-neutral” — yet the point of proscribing hate speech in the first place is to proscribe the content of the speech.

Perhaps the seminal case on the speech-conduct distinction vis-à-vis the Internet is Planned Parenthood of the Columbia/Willamette, Inc. v. American Coalition of Life Activists.

This 2001 case involved the “Nuremberg Files” Web site run by the American Coalition of Life Activists. The names and home addresses of abortion doctors were posted on the site, then crossed out or turned gray if the doctors were killed or wounded by anti-abortion zealots. The “Nuremberg Files” site did not explicitly threaten the doctors, but the ACLA lauded and perhaps encouraged the killings.

Some of the doctors whose names appeared on the list sued the ACLA on grounds that, among other things, the speech on the Web site “robbed the doctors of their anonymity and gave violent anti-abortion activists the information to find them” and praised the slaying and wounding of the doctors on the list. The doctors argued that this speech constituted “true threats” against them and thus fell outside First Amendment protection.

Although not a “hate speech” case per se, the case sheds some light on how courts may handle such a case in the future. The U.S. Court of Appeals for the 9th Circuit held that the ACLA’s speech on the Web site was protected by the First Amendment. The court said there was no “imminence” of the kind required under Brandenburg to establish a danger, noting that “advocating illegal action at some indefinite future time is protected [by the First Amendment].” “If the First Amendment protects speech advocating violence, then it must also protect speech that does not advocate violence but still makes it more likely,” the 9th Circuit said.

Further, the court noted that the ACLA did not urge its members to commit the violence or have anyone commit the violence on the ACLA’s behalf. “While pungent, even highly offensive, ACLA’s statements carefully avoided threatening the doctors with harm in the sense that there are no ‘quotable quotes’ calling for violence,” the court held. A generalized implied threat (by giving those who would commit heinous acts the information required for committing them), the court said, could not be suppressed without violating the First Amendment.

The “Nuremberg Files” site, the 9th Circuit concluded, “cannot fairly be read as calling for future violence against several hundred other doctors, politicians, judges, and celebrities on the list; otherwise any statement approving past violence could automatically be construed as calling for future violence.”


FBI steps up monitoring of hate groups' Web sites

After conviction of white supremacist Matthew Hale for trying to have judge murdered, government says it won't tolerate anyone crossing line from protected speech to advocating violence. 04.28.04

Last appeal refused in 'wanted' poster case
9th Circuit decision, which ordered anti-abortion activists to pay nearly $5 million in damages, stands. 05.02.06

Some states pushing for laws to curb online bullying
Internet allows students to insult others in relative anonymity; some say cyberbullying can be more damaging than traditional bullying like fistfights, classroom taunts. 02.11.07

Ore. lawmakers pass cyberbullying legislation
Bill would require schools to create plan to address electronic harassment that happens on campus, near campus, on school buses or at school-related activities. 06.19.07

Feds seek removal of alleged threats on Pa. abortion foe's site
Blogger says he didn't write message suggesting former abortion provider be shot, but he did post it. 08.29.07

Federal judge orders removal of anti-abortion blog posts
Court grants injunction, bars man from publishing similar messages threatening abortion providers. 11.12.07

Mo. city makes online harassment a crime
Dardenne Prairie officials pass measure just days after learning that 13-year-old resident killed herself last year after receiving cruel messages on Internet. 11.26.07
