Section 230 Under Assault: It’s Not Just a Big Tech Problem

Advertising Law

Section 230 of the Communications Decency Act (CDA) is once again at the center of a major political debate, with momentum building for an overhaul of a statute that many view as having played a critical role in the rise of big tech and social media. On March 25, the heads of the big tech trio Facebook, Google and Twitter testified before the House Energy and Commerce Committee, defending their Internet platforms against fierce attacks from lawmakers on both sides of the aisle over the companies’ content moderation policies and practices. While Democrats and Republicans may differ on the nature of Section 230’s shortcomings, there appears to be a growing consensus among lawmakers that Congress should take action to hold big tech companies accountable for conduct taking place on their platforms.

All Eyes on Section 230

Section 230, which was passed in 1996 when the Internet was still in its infancy, provides online platforms, such as Facebook, Google and Twitter, immunity[1] from a range of laws for third-party user-generated content (UGC) hosted or published on their platforms. It was the intent of Congress “to promote the free exchange of information and ideas over the Internet and to encourage voluntary monitoring for offensive or obscene material.” Carafano v. Metrosplash.com, Inc., 339 F.3d 1119, 1122 (9th Cir. 2003). Congress recognized that online platforms did not serve the function of publishers themselves but were merely conduits for such information. Congress was concerned that tort-based lawsuits could chill speech and innovation “in the new and burgeoning Internet medium” and that imposing tort liability on intermediaries was “simply another form of intrusive government regulation of speech.” Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997).

In particular, the Section 230 debate centers on two substantive provisions:

(1) Section 230(c)(1) states that a “provider or user of an interactive computer service” (e.g., an online platform or Internet service provider) is not deemed to be “the publisher or speaker” of UGC. In essence, this removes the legal responsibilities that would ordinarily fall on publishers’ shoulders with regard to third-party content, such as claims of defamation, invasion of privacy, negligence, false advertising and unfair competition. See Carafano, 339 F.3d at 1122 (“Internet publishers are treated differently from corresponding publishers in print, television and radio”).

To put this into present context, when UGC is published online touting fake COVID-19 vaccines or dangerous “Tide Pod” challenges, or promoting violence and hate speech, the online platforms that host it essentially bear no civil liability for publishing such content. And, indeed, this lack of accountability is giving lawmakers heartburn. “Your platforms are my biggest fear as a parent,” noted Rep. Cathy McMorris Rodgers (R-Wash.) during the March 25 hearing.

(2) Section 230(c)(2) states that a “provider or user of an interactive computer service” shall not be held liable where it voluntarily takes action “in good faith to restrict access to or availability of material” that it considers to be “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

While this seems straightforward at first glance, the phrases “otherwise objectionable” and “good faith” are undefined in the statute and have drawn the ire of many critics, some of whom argue that online platforms interpret the statute broadly to justify politically motivated actions.

The Department of Justice conducted a review of Section 230 in 2020 and recommended, among other things, (i) replacing the “vague terminology” of the catch-all “otherwise objectionable” language and (ii) clarifying the meaning of “good faith,” which it said “should encourage platforms to be more transparent and accountable to their users, rather than hide behind blanket Section 230 protections.”[2] In September 2020, the Justice Department submitted draft legislation to Congress on behalf of the Trump administration based on those recommendations.

Courts have grappled with statutory interpretation as well. The Ninth Circuit in Enigma Software Group USA, LLC v. Malwarebytes, Inc. held that “providers do not have unfettered discretion to declare online content ‘objectionable’” and that “blocking and filtering decisions that are driven by anticompetitive animus are not entitled to immunity under section 230(c)(2).” 946 F.3d 1040, 1049 (9th Cir. 2019).

In contrast, in March, the Second Circuit in Domen v. Vimeo unanimously sided with the defendant, which had unilaterally removed videos that it deemed to be in violation of the platform’s own policy prohibiting “videos that harass, incite hatred or include discriminatory or defamatory speech.” Plaintiffs sued under a number of state laws for censorship, and Vimeo obtained dismissal of the claims on CDA immunity grounds. The Second Circuit affirmed the district court’s decision, noting that 230(c)(2) is a “broad provision” that bars liability where online providers restrict access to content that they “consider … objectionable” (emphasis in the original). 2021 WL 922749 (2d Cir. March 11, 2021).

Section 230 and Its Impact on Advertisers

While the recent debate over Section 230 has generally focused on big tech’s lack of adequate content moderation, or on overly aggressive content moderation, brands have a major stake in any Section 230 reforms because of their increasing reliance on social media platforms, and on UGC on their own digital channels, to communicate with their customers and promote their products.

First, Section 230 impacts brand safety and awareness. Online platforms earn billions of dollars annually displaying brand advertisements on their sites. Those premium ads may be placed on pages or forums touting fake news, inciting violence or promoting illicit conduct. These practices can harm trusted brands, which consumers may associate with the harmful content. Society takes a hit as well, with consumers believing the content is valid simply because a trusted brand appears to endorse it. Accordingly, changes in Section 230 may impact how brands advertise on online platforms in the future.

Second, brands may be entitled to qualified immunity under Section 230 as providers of an interactive computer service. When a brand allows its customers to post product reviews on its website or to participate in a UGC promotion hosted by the brand, the brand could arguably be deemed a provider of an interactive computer service to the extent that it is merely acting as a platform without adopting any content posted by the customers. In fact, in a well-known case decided in 2010, Quiznos claimed Section 230 immunity to combat Subway’s false advertising claims relating to an online contest hosted on Quiznos’ website in which contestants posted UGC videos comparing the two chains’ sandwiches. Doctor’s Associates, Inc. v. QIP Holders LLC, 2010 WL 669870 (D. Conn. Feb. 19, 2010). Subway did not challenge Quiznos’ claim that it was a provider of an interactive computer service; instead, the dispute centered on whether Quiznos was “actively responsible for the creation and development of disparaging representations about Subway” in the UGC, which the court said was a question for the jury. The case settled out of court without answering this question.

Why It Matters

Section 230 is arguably one of the most important pieces of Internet legislation, having hastened the exponential growth of the Internet and digital advertising. Brands and the advertising industry have both benefited and suffered from the broad immunity available under Section 230. However, with a rising chorus of politicians and consumer advocacy groups demanding Section 230 reform after a tumultuous year of disinformation and misinformation mayhem online, the future of Section 230 in its current form is uncertain. Any amendments to Section 230 could have a significant impact on how both big tech and brands conduct business as they re-evaluate the associated risks.


Special thanks to Manatt transactional associate Kendrick Coq for contributing valuable time and research assistance to this article.


1. Section 230(e) expressly provides that its immunity provisions will not apply to (1) federal criminal laws, (2) intellectual property laws, (3) any state law that is “consistent with” Section 230, (4) the Electronic Communications Privacy Act of 1986, and (5) certain civil actions or state prosecutions where the underlying conduct violates specified federal laws prohibiting sex trafficking.

2. “Section 230—Nurturing Innovation or Fostering Unaccountability?” U.S. Department of Justice, June 2020 (available at https://www.justice.gov/file/1286331/download).

