The First Amendment allows antitrust action against media companies for their business practices, but not for their editorial judgments. Section 230 mirrors this distinction by protecting providers of interactive computer services from being “treated as the publisher” of content provided by others, including decisions to withdraw or refuse to publish that content (230(c)(1)), and by further protecting decisions made “in good faith” to take down content, regardless of who created it (230(c)(2)(A)). Section 230 provides a critical civil procedure shortcut: when providers of interactive computer services are sued for refusing to carry the speech of others, they need not endure the expense of litigating constitutional questions. Thus, changing Section 230 could dramatically increase litigation costs, but it would not ultimately create new legal liability for allegedly “biased” or “unfair” content moderation. Nor will the First Amendment permit new quasi-antitrust remedies that compel websites to carry content they find objectionable.

By Berin Szóka1


The media are not exempt from the antitrust laws. As the Supreme Court ruled in Associated Press (1945), publishers “are engaged in business for profit exactly as are other business men who sell food, steel, aluminum, or anything else people need or want… The fact that the publisher handles news while others handle food does not … afford the publisher a peculiar constitutional sanctuary in which he can with impunity violate laws regulating his business practices.”2 The same applies to digital media: “whatever the challenges of applying the Constitution to ever-advancing technology, ‘the basic principles of freedom of speech and the press, like the First Amendment’s command, do not vary’ when a new and different medium for communication appears.”3 But what about editorial judgments? May a publisher, of traditional or new media, be subject to liability under the antitrust laws for refusing to carry the speech of others?

Increasingly, Republicans invoke antitrust as a tool to combat what they see as “censorship” of their speech by digital media companies. They allege that these companies have “gatekeeper” control over speech and are using that power in “biased” or “unfair” ways to curate, or moderate, the speech of users, right-wing publishers such as The Federalist, and apps or websites such as Parler, an alternative “free speech” social network with a radically hands-off approach to content moderation.4 Republicans routinely blame this so-called censorship on Section 230 of the Telecommunications Act of 1996.5

But when a website refuses to carry speech that it deems to be inconsistent with its values, or deprioritizes that speech in some fashion, the website is exercising well-established First Amendment rights of private parties not to carry the speech of others. Section 230 merely provides a civil procedure shortcut, ensuring that courts will dismiss the lawsuit without the website enduring the expense of litigating the First Amendment question. Section 230 draws the same line as the First Amendment itself: antitrust suits may proceed against “interactive computer service” (“ICS”) providers only insofar as those suits target “business practices,” rather than editorial judgments.



In Associated Press, the Court ruled that the nation’s leading press pool had violated the antitrust laws by giving member newspapers the right to veto requests by local competitors to join the press pool6 — clearly an anti-competitive “business practice,” not an editorial judgment. Likewise, in Lorain Journal (1951), the Court ruled that a newspaper could be sued when it refused to carry ads from local advertisers who refused to join in a boycott of a new radio station.7 This abuse of its market power was a “business practice,” not an exercise of editorial discretion protected by the First Amendment. As Prof. Eugene Volokh notes, “[t]he Lorain Journal Co. rule … does not authorize restrictions on a speaker’s editorial judgment about what content is more valuable to its readers.”8

Likewise, the Tenth Circuit has noted that “the First Amendment does not allow antitrust claims to be predicated solely on protected speech.”9 This is true even when speech has clear economic consequences. Thus, Moody’s, the leading bond rater, could not be subject to antitrust action based on its publication of an article giving a school district’s bonds a negative rating.10 In each decision cited by the plaintiff in that case, “the defendant held liable on an antitrust claim engaged in speech related to its anticompetitive scheme,” but that speech was purely incidental to the antitrust violation.11 Those courts did not impose antitrust liability for editorial judgments, and none of these cases involved editorial decisions not to publish, or otherwise associate with, the speech of others. The Supreme Court has established a clear right to do just that, regardless of market power.

In Miami Herald (1974), the Court struck down a 1913 state law that required newspapers to carry replies by any political candidates subject to attack by the newspaper. “The choice of material to go into a newspaper, and the decisions made as to limitations on the size and content of the paper, and treatment of public issues and public officials — whether fair or unfair — constitute the exercise of editorial control and judgment.”12 It did not matter whether, as the political candidate alleged, the “elimination of competing newspapers in most of our large cities, and the concentration of control of media that results from the only newspaper’s being owned by the same interests which own a television station and a radio station” had “place[d] in a few hands the power to inform the American people and shape public opinion.”13 Subsequent courts have recognized that even newspapers with “substantial” or “virtual” monopolies have a First Amendment right to refuse to carry content the government seeks to compel them to carry.14

Existing antitrust law has led to the same result. For example, when Facebook blocked users from accessing its site while they used a browser extension that superimposed its own ads onto Facebook’s website, the extension-maker brought an antitrust suit, and the court dismissed it: “Facebook has a right to control its own product, and to establish the terms with which its users, application developers, and advertisers must comply in order to utilize this product…. Facebook has a right to protect the infrastructure it has developed, and the manner in which its website will be viewed.”15 Antitrust law is so well-settled here that the court did not even need to mention the First Amendment. Instead, the court simply cited Trinko: “as a general matter, the Sherman Act ‘does not restrict the long recognized right of [a] trader or manufacturer engaged in an entirely private business, freely to exercise his own independent discretion as to parties with whom he will deal.’”16 That “discretion” closely parallels the First Amendment right of publishers to exclude content they find objectionable under Miami Herald.



Courts have recognized that social media publishers have the same rights under Miami Herald as newspapers to reject content (including ads) provided to them by third parties.17 Various arguments have been made as to why the First Amendment should not protect “Big Tech” companies’ right to exclude content they find objectionable, just as it protects the right of newspapers not to run letters to the editor or parade organizers to exclude signs.18 Some claim digital media are “public fora.” Yet the Court recently declared that “merely hosting speech by others is not a traditional, exclusive public function and does not alone transform private entities into state actors subject to First Amendment constraints.”19 Lower courts have reached the same conclusion regarding digital media.20

Nor may digital media be forced to carry the speech of others as common carriers like railroads or telephone networks,21 which must “offer service indiscriminately and on general terms.”22 Like antitrust, common carriage regulation governs business practices, not expressive decisions. “What appears to be essential to the quasi-public character implicit in the common carrier concept is that the carrier ‘undertakes to carry for all people indifferently….’”23 This, railroads and telephone networks do, but social media clearly do not: every social media site provides detailed terms of service and expressly reserves the right to remove content that violates those terms.

This is radically different from, say, providers of mass-market broadband service, which have promised not to block, throttle, or otherwise restrict access to lawful content even when they were not required to do so by net neutrality rules.24 In 2016, a three-judge panel of the D.C. Circuit upheld the FCC’s 2015 reclassification of broadband providers as common carriers. When broadband providers sought rehearing by the full D.C. Circuit in 2017, then-Judge Kavanaugh argued that imposing common carrier status on ISPs violated the First Amendment. Not so, explained the two judges who wrote the panel decision below, because the rules applied only insofar as broadband providers represented to their subscribers that their service would connect to “substantially all Internet endpoints” — and thus merely “require[d] ISPs to act in accordance with their customers’ legitimate expectations.”25 Conversely, the judges wrote, ISPs could easily avoid the burdens of common carriage status, and exercise their First Amendment rights: “[T]he rule does not apply to an ISP holding itself out as providing something other than a neutral, indiscriminate pathway—i.e., an ISP making sufficiently clear to potential customers that it provides a filtered service involving the ISP’s exercise of ‘editorial intervention.’”26 A “filtered” service is exactly what every social network offers to its customers, and “filtering” is exactly what they must expect.

Those proposing to force websites to carry speech against their will often invoke Turner Broadcasting v. FCC (1994), in which the Supreme Court ruled that requiring cable companies to carry local broadcasters’ channels for free did not violate the First Amendment. In fact, a careful reading of the case illustrates why social media cannot be compelled to carry content they find objectionable. Initially, a special three-judge panel upheld the law, concluding that it was “simply industry-specific antitrust and fair trade practice regulatory legislation: to the extent First Amendment speech is affected at all, it is simply a by-product of the fact that video signals have no other function than to convey information.”27 On direct appeal to the Supreme Court, the government again argued that “the must-carry provisions are nothing more than industry-specific antitrust legislation, and thus warrant rational-basis scrutiny under this Court’s ‘precedents governing legislative efforts to correct market failure in a market whose commodity is speech,’” citing Associated Press and Lorain Journal.28 The Court rejected this argument and ruled that must-carry mandates triggered heightened scrutiny under the First Amendment. This alone proves the initial point made above: antitrust cannot itself be used to compel carriage of speech.

Ultimately, the Court upheld the must-carry mandate for cable, but only because it applied intermediate, rather than strict, scrutiny. It applied intermediate scrutiny for two relevant reasons — neither of which applies to social media. First, although the law gave some broadcasters a right to cable carriage (and therefore favored their speech over that of the cable providers), the majority nonetheless concluded that the law was not content-based. The cable providers had not objected to any content or viewpoints expressed in the broadcasters’ programming; rather, as the majority noted, cable operators suffered an economic loss from not being able to charge for the one-third or so of their channel capacity allotted to broadcasters. Whether “must carry” for cable was really content-neutral in Turner was debatable: the majority saw no “subtle means of exercising a content preference,” while the dissent argued Congress’s “interest” in platforming “diverse and antagonistic sources” was not “content-neutral.”29 Regardless, the agenda behind “must carry” for social media is unmistakable: conservatives decrying the “monopoly power” of social media and invoking values like “neutrality” are, in fact, trying to compel websites to carry particular kinds of content that those websites deem objectionable but that conservatives find politically useful.

The second reason the Court applied intermediate scrutiny was that forcing cable companies to carry local broadcasters’ channels would not, it concluded, “force cable operators to alter their own messages to respond to the broadcast programming they are required to carry.”30 Noting that the FCC had first instituted some form of must-carry mandate in 1966, the Supreme Court concluded: “Given cable’s long history of serving as a conduit for broadcast signals, there appears little risk that cable viewers would assume that the broadcast stations carried on a cable system convey ideas or messages endorsed by the cable operator.”31

Unlike cable operators, social media cannot be expected to operate as pure conduits. Instead, they can be, and are, generally associated with the content they allow. Like newspapers, and unlike telephone networks, social media sites are increasingly held accountable for the consequences of the speech they carry. They are regularly boycotted by users — and, increasingly, by advertisers, under growing pressure from their own investors — for refusing to take down objectionable content. This is business reality for Facebook, as reflected in the multiple references in its most recent quarterly report to “risk factors” related to how the company’s handling of content is perceived.32 In Facebook’s last quarterly earnings call, CEO Mark Zuckerberg spent most of his time explaining how the company would handle misinformation about the then-impending election.33

While these market expectations have clearly increased in recent years, they are not new. In Stratton Oakmont v. Prodigy (N.Y. Sup. Ct. May 24, 1995), Prodigy, which provided both Internet access and a curated proto-social network, was held liable for user content because it had “held itself out as an online service that exercised editorial control over the content of messages posted on its computer bulletin boards.”34 Specifically, Prodigy advertised itself as follows: “We make no apology for pursuing a value system that reflects the culture of the millions of American families we aspire to serve. Certainly, no responsible newspaper does less when it chooses the type of advertising it publishes, the letters it prints, the degree of nudity and unsupported gossip its editors tolerate.”35

Taking such responsibility as “Good Samaritans” is precisely what Congress intended to encourage in enacting Section 230. By overruling Stratton Oakmont, the law has ensured that content moderation does not create legal liability for websites — either when they take content down or when they leave it up. Clearly, Congress did not want social media to be forced to function as mere conduits (like telegraph and telephone networks) for the speech of others.



Section 230 has two primary functions. First, the law ensures that providers (and users) of “interactive computer services” (including websites and other tech companies) cannot, with several exceptions (most notably, federal criminal law), be held liable for content created by others. Specifically, Section 230(c)(1) provides: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”36

Second, Section 230 likewise protects websites from being held liable for how they present or moderate content. Section 230(c)(2)(A) explicitly covers content moderation, protecting “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be . . . objectionable.”37 Because 230(c)(1) does not explicitly address “restriction” of access to content, many commentators across the political spectrum (and nearly all commentators on the right) assume that the two functions of the statute map neatly onto these two provisions, with (c)(1) protecting “hosting” and only (c)(2)(A) protecting moderation. In fact, (c)(1) has consistently been interpreted to cover both. In its 1997 Zeran decision, the Fourth Circuit became the first appellate court to interpret Section 230(c)(1), concluding that “lawsuits seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial functions — such as deciding whether to publish, withdraw, postpone or alter content — are barred.”38 Subsequent courts have consistently held that 230(c)(1) protects content moderation, and nearly all content moderation cases are resolved under 230(c)(1) rather than 230(c)(2)(A).39

The two provisions serve different functions: (c)(1) protects ICS providers and users — but only insofar as they are not themselves information content providers (“ICPs”) — i.e. not responsible, even “in part, for the creation or development of information provided through the Internet or any other interactive computer service.”40 But (c)(2)(A)’s protections apply even when an ICS provider or user is responsible, at least in part, for developing the content at issue.

These two immunities, especially (c)(1), allow these ICS providers and users to short-circuit the expensive and time-consuming litigation process, usually with a motion to dismiss — and before incurring the expense of discovery. In this sense, Section 230 functions much like anti-SLAPP laws, which, in 30 states,41 allow defendants to quickly dismiss strategic lawsuits against public participation (“SLAPPs”) — brought by everyone from businesses fearful of negative reviews to politicians furious about criticism — that seek to use the legal system to silence speech.

Without Section 230 protections, ICS providers would face what the Ninth Circuit, in its landmark Roommates decision, famously called “death by ten thousand duck-bites”:42 liability at the scale of the billions of pieces of content generated by users of social media sites and other third parties every day. As that court explained, “section 230 must be interpreted to protect websites not merely from ultimate liability, but from having to fight costly and protracted legal battles.”43



Like the First Amendment itself, both the (c)(1) and (c)(2)(A) immunities protect decisions to moderate or prioritize third-party content only insofar as these are truly editorial decisions (forms of expression and association), rather than pure business practices (conduct).

The (c)(1) immunity bars only those claims that hold an ICS provider (or user) liable as the “publisher” of content provided by another — for, as Zeran noted, “deciding whether to publish, withdraw, postpone or alter content.”44 This is precisely the same editorial discretion protected under Miami Herald.45 “Since all speech inherently involves choices of what to say and what to leave unsaid . . . for corporations as for individuals, the choice to speak includes within it the choice of what not to say.”46 In Hurley v. Irish-American Gay, Lesbian and Bisexual Group of Boston, the Supreme Court barred the city of Boston from forcing organizers of a St. Patrick’s Day parade to include pro-LGBTQ individuals, messages, or signs that conflicted with the organizers’ beliefs.47 The “general rule,” declared the Court, “that the speaker has the right to tailor the speech, applies not only to expressions of value, opinion, or endorsement, but equally to statements of fact the speaker would rather avoid.”48

The (c)(2)(A) immunity does not explicitly turn on whether the cause of action involves the ICS provider being “treated as the publisher,” yet it clearly protects a subset of the rights protected by Miami Herald: decisions to “restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”49 Including the broad catch-all “otherwise objectionable” ensures that this immunity is fully coextensive with the First Amendment right of ICS providers and users, as recognized under Hurley, to avoid speech they find repugnant.

Just as The Lorain Journal was not acting “as a publisher” when it refused to host ads from advertisers that did not boycott its radio competitor, it was also not acting “in good faith.” The “publisher” and “good faith” clauses limit the scope of Section 230’s protections, and ensure that Section 230 tracks the protections of the First Amendment. Thus, either immunity could be defeated by showing that the actions at issue were not matters of editorial judgment, but of anti-competitive business practices.

1 Berin Szóka (@BerinSzoka) is President of TechFreedom, a non-profit, non-partisan technology think tank dedicated to studying the legal issues raised by the Digital Revolution. He has practiced Internet and telecommunications law since 2005. TechFreedom has received financial support for its work from foundations, individuals, and companies across the technology sector.

2 Associated Press v. United States, 326 U.S. 1, 7 (1945).

3 Brown v. Entm’t Merchants Ass’n, 564 U.S. 786, 790 (2011) (quoting Joseph Burstyn, Inc. v. Wilson, 343 U.S. 495, 503 (1952)) (emphasis added).

4 Reply Comments of TechFreedom, In the Matter of National Telecommunications and Information Administration Petition for Rulemaking to Clarify provisions of Section 230 of the Communications Act of 1934, RM – 11862, 21-28, 30-38 (Sept. 17, 2020)

5 47 U.S.C. § 230 (commonly referred to as Section 230 of the Communications Decency Act).

6 326 U.S. at 10.

7 Lorain Journal Co. v. United States, 342 U.S. 143, 152, 155 (1951).

8 Eugene Volokh & Donald Falk, First Amendment Protection for Search Engine Search Results, UCLA School of Law Research Paper No. 12-22, at 22 (April 20, 2012) (emphasis added).

9 Jefferson Cty. Sch. Dist. No. R-1 v. Moody’s Inv’r Servs., 175 F.3d 848, 860 (10th Cir. 1999).

10 Id.

11 Id. at 859 (“See, e.g., Federal Trade Comm’n v. Superior Court Trial Lawyers Ass’n, 493 U.S. 411, 430-32 (1990) (upholding finding that an attorneys’ association’s boycott of assignments to cases involving indigent defendants violated the antitrust laws even though the boycott had an expressive component); National Society of Professional Engineers v. United States, 435 U.S. 679 (1978) (upholding finding that a professional association’s ban on competitive bidding for engineering services violated the antitrust laws even though one means of carrying out the ban was through the publication of an ethical code); American Society of Mechanical Eng’rs v. Hydrolevel Corp., 456 U.S. 556 (1982) (upholding finding that professional association violated the antitrust laws through the issuance of an inaccurate safety report used to undermine a competitor’s product); Wilk v. American Medical Ass’n, 895 F.2d 352, 357-58, 371 (7th Cir. 1990) (upholding finding that a medical association’s boycott of chiropractors violated the antitrust laws even though one means of enforcing the boycott was through the association’s code of ethics). More generally, the School District relies on decisions holding that the First Amendment does not provide publishers with immunity from antitrust laws. See Citizen Publishing Co v. United States, 394 U.S. 131, 135 (1969) (upholding injunction prohibiting newspaper publishers from engaging in joint operating agreement) . . . .”).

12 Miami Herald Publishing Co. v. Tornillo, 418 U.S. 241, 258 (1974); see also Sinn v. The Daily Nebraskan, 829 F.2d 662 (8th Cir. 1987).

13 Id. at 250.

14 Volokh & Falk, supra note 8, at 23 (“the Ninth Circuit has concluded that even a newspaper that was plausibly alleged to have a ‘substantial monopoly’ could not be ordered to run a movie advertisement that it wanted to exclude, because ‘[a]ppellant has not convinced us that the courts or any other governmental agency should dictate the contents of a newspaper.’ Associates & Aldrich Co. v. Times Mirror Co., 440 F.2d 133, 135 (9th Cir. 1971). And the Tennessee Supreme Court similarly stated that, ‘[n]ewspaper publishers may refuse to publish whatever advertisements they do not desire to publish and this is true even though the newspaper in question may enjoy a virtual monopoly in the area of its publication.’ Newspaper Printing Corp. v. Galbreath, 580 S.W.2d 777, 779 (Tenn. 1979).”).

15 Sambreel Holdings LLC v. Facebook, Inc., 906 F. Supp. 2d 1070, 1075-76 (S.D. Cal. 2012).

16 Id. at 1075 (quoting Verizon Commc’ns Inc. v. Law Offices of Curtis V. Trinko, 540 U.S. 398, 408 (2004)).

17 See, e.g., Langdon v. Google, Inc., 474 F. Supp. 2d 622 (D. Del. 2007).

18 Assocs. & Aldrich Co. v. Times Mirror Co., 440 F.2d 133, 136 (9th Cir. 1971) (holding that a newspaper could not be compelled to print content as provided, even if the content the editor rejected was not objectionable).

19 Manhattan Community Access Corp. v. Halleck, 139 S. Ct. 1921, 1930 (2019).

20 See, e.g., Prager Univ. v. Google LLC, 951 F.3d 991 (9th Cir. 2020); Howard v. Am. Online, Inc., 208 F.3d 741, 754 (3d Cir. 2000).

21 See generally Berin Szóka & Corbin Barthold, Justice Thomas’s Misguided Concurrence on Platform Regulation, Lawfare (April 14, 2021),

22 Cellco Partnership v. Fed. Commc’ns Comm’n, 700 F.3d 534, 546 (D.C. Cir. 2012).

23 Nat. Ass’n of Reg. Utility Com’rs v. F.C.C., 525 F.2d 630, 641 (D.C. Cir. 1976).

24 See, e.g., Comcast, Open Internet (2021), (“We do not block, slow down or discriminate against lawful content.”).

25 U.S. Telecom Ass’n v. Fed. Commc’ns Comm’n, 855 F.3d 381 (D.C. Cir. 2017).

26 Id. at 389 (Srinivasan, J., concurring) (citing In the Matter of Protecting & Promoting the Open Internet, 30 F.C.C. Rcd. 5601 (2015)).

27 Turner Broadcasting System, Inc. v. F.C.C., 819 F. Supp. 32, 40 (D.D.C. 1993).

28 Turner Broadcasting System, Inc. v. F.C.C., 512 U.S. 622, 640 (1994).

29 Id. at 645.

30 Id. at 655.

31 Id. at 655; see also United States v. Midwest Video Corp., 406 U.S. 649 (1972).

32 Facebook, Inc., Quarterly Report Pursuant to Section 13 or 15(d) of the Securities Exchange Act of 1934 (Form 10-Q), at 53 (filed with the U.S. Securities and Exchange Commission Oct. 29, 2020),

33 Facebook, Inc., FB Q3 2020 Earnings Call Transcript, (Oct. 29, 2020)

34 Stratton Oakmont, Inc. v. Prodigy Servs. Co., No. 31063/94, 1995 WL 323710, at *2 (N.Y. Sup. Ct. May 24, 1995).

35 Id.

36 47 U.S.C. § 230(c)(1).

37 47 U.S.C. § 230(c)(2)(A).

38 Zeran v. Am. Online, 129 F.3d 327, 330 (4th Cir. 1997); see also Barrett v. Rosenthal, 146 P.3d 510, 515 (Cal. 2006).

39 Elizabeth Banker, Internet Association: A Review of Section 230’s Meaning & Application Based On More Than 500 Cases at 10, Internet Association (July 27, 2020), (concluding, based on a study of more than 500 Section 230 cases, that Section 230 “typically” protects content moderation “through application of subsection (c)(1) rather than (c)(2).”).

40 47 U.S.C. § 230(f)(3) (“The term ‘information content provider’ means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.”).

41 Austin Vining & Sarah Matthews, Reporters Committee for Freedom of the Press, Introduction to Anti-SLAPP Laws,

42 Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1174 (9th Cir. 2008).

43 Id.

44 Zeran, 129 F.3d at 330.

45 Miami Herald, 418 U.S. at 258.

46 Pacific Gas and Elec. Co. v. Pub. Utilities Comm’n of California, 475 U.S. 1, 11, 16 (1986) (plurality opinion) (emphasis in original).

47 515 U.S. 557, 573 (1995).

48 Id. at 573.

49 47 U.S.C. § 230(c)(2)(A).