FIRST AMENDMENT

Brandenburg v. Ohio limited the scope of banned speech to that which would likely incite or produce imminent lawless action. Is there a lesson we can learn from that case in this age of social media, conspiracy theories, and organized disinformation campaigns? Are additional free speech guardrails needed?

Interview by Alyssa Giacobbe

Q&A with Justin Silverman

It’s very easy to hear potentially dangerous speech and immediately call for more laws. But what we need to remember is that the more power we give to the government, the greater the likelihood of abuse. That’s really the crux of the First Amendment: it’s there to protect us from the government. So we have to consider, no matter how well-intended a law may be, how the government is going to use it. And ultimately, what’s the risk of that law being used against any kind of speech the government finds objectionable, whether or not it’s actually incitement?

The Brandenburg case allows us some freedom to speak provocatively and to advocate for unpopular ideas without being dragged into court or off to jail. That is the much-needed breathing room the First Amendment gives us between our ability to speak freely and the government’s power to crack down on speech it considers dangerous.

Think about comedy, for example, or anyone who speaks in a hyperbolic or exaggerated way. All of that speech, which isn’t meant to be taken at face value, would be at risk of becoming illegal if the incitement standard were somehow changed. So there’s a lot of speech that’s protected under Brandenburg, and for good reason.

Brandenburg accommodates changes in technology and media because it looks only at the speech itself: what the speech is calling for, and whether anyone is likely to heed that call. I think those are the considerations that should be made. Will some parties on social media or elsewhere walk very close to that line of incitement and exploit the freedom that they have? For sure. But if that line were drawn in any other way, you would really risk the government abusing its power to crack down on speech it arbitrarily finds objectionable, and outlawing speech that most would agree should be protected under the First Amendment.
Q&A with Professor Elbert Robertson

Brandenburg means there has to be an imminent threat of violence, and imminent doesn’t mean tomorrow or the day after tomorrow. It really means now, and there has to be some inducement or incitement to violent or other criminal activity now. As long as speech doesn’t fall into one of these categories (obscenity, defamation, imminent provocation, giving away military secrets, or the facilitation of imminent violence), then we’re still in the “free speech” zone.

Now, let’s not get confused. That doesn’t mean there aren’t entities with a primary responsibility for regulating certain forms of speech. Regulating language and other forms of expression that are not in the public interest is the primary responsibility of the Federal Communications Commission, for example. Regulatory bodies can use their discretion and judgment in setting the standards for those forms of expression.

I was with the FCC right after the Telecommunications Act of 1996 passed, when the internet was still being put together down the hallway. No one had really done a full analysis of the scope and power of that system of interconnected networks and its network effects at the time. The idea was that the platforms were like the telephone companies before them: common carriers. And we didn’t want to make carriers liable for the content they carried, because they didn’t determine what that content was. That was the model.

But that just doesn’t fit the reality of what the internet has become. Internet service providers do affect both the content of what’s carried and who gets the message. That’s the problem Facebook finds itself in, because it has gotten so big and so integral to our basic understanding of the world around us. It has the power to determine who can get the message and who can send it. Often with prior knowledge of the content of the message and its source, it makes strategic marketing decisions to disseminate that message to receptive populations regardless of its truth value or predictable societal effects.

Now, just because you don’t like what is being said should not lead you to conclude that the carrier effectively promoted the result. So we go back to the January 6 insurrection and the claim that the election was stolen. That was a lie. It was a big lie. But people who believe in the Trumpian cause don’t believe so; they believe that it’s the truth. Whether or not Facebook is liable should not depend on the truth or falsity of that information so much as on the fact that Facebook put it out there, knowing that the information was a call to arms and a call to violence. The question is, if Facebook is making money doing that, can it be sanctioned under the public interest standard based on the effect of what it has done? Facebook says it had no idea how this information was going to be used. But do you believe it?
Justin Silverman JD’11 is executive director of the nonprofit New England First Amendment Coalition. The organization works to defend, promote, and expand public access to government, as well as to advance and protect the five freedoms of the First Amendment.
Suffolk Law Professor Elbert Robertson served as a special antitrust attorney and advisor for the Office of General Counsel for the Federal Communications Commission in Washington, D.C. Since 1997, he has taught courses at Suffolk Law in antitrust, corporations, law and economics, and other topics. Robertson also currently serves as a member of the Massachusetts State Advisory Committee of the U.S. Commission on Civil Rights.