Preliminary Injunction Against Texas Social Media 'Censorship' Law
ORDER
Before the Court is Plaintiffs NetChoice, LLC d/b/a NetChoice (“NetChoice”), a 501(c)(6)
District of Columbia organization, and Computer & Communications Industry Association d/b/a
CCIA (“CCIA”), a 501(c)(6) non-stock Virginia corporation’s (“Plaintiffs”) Motion for Preliminary
Injunction, (Dkt. 12), Defendant Texas Attorney General Ken Paxton’s (the “State”) response in
opposition, (Dkt. 39), and Plaintiffs’ reply, (Dkt. 48). The Court held the preliminary injunction
hearing on November 29, 2021. (Dkt. 47). After considering the parties’ briefs and arguments, the
record, and the relevant law, the Court denies the motion to dismiss and grants the preliminary
injunction.
I. BACKGROUND
In the most recent legislative session, the State sought to pass a bill that would “allow
Texans to participate on the virtual public square free from Silicon Valley censorship.” Senator Bryan Hughes (@SenBryanHughes) promoted the legislation by tweeting that “[s]ilencing conservative views is un-American, it’s un-Texan[,] and it’s about to be illegal in Texas.” https://t.co/JsPam2XyqD. After a bill failed to pass during the regular session or the first special
session, Governor Abbott called a second special legislative session, directing the Legislature to
consider and act on legislation “protecting social-media and email users from being censored.”
Proclamation by the Governor of the State of Texas (Aug. 5, 2021), /files/press/PROC_second_called_session_87th_legislature_IMAGE_08-05-21.pdf. The
Legislature passed House Bill 20 (“HB 20”), and Governor Abbott signed it into law on September 9, 2021.
HB 20 prohibits large social media platforms from “censor[ing]” a user based on the user’s
“viewpoint.” Tex. Civ. Prac. & Rem. Code § 143A.002 (“Section 7”). Specifically, Section 7 makes it
unlawful for a “social media platform” to “censor a user, a user’s expression, or a user’s ability to
receive the expression of another person based on: (1) the viewpoint of the user or another person;
(2) the viewpoint represented in the user’s expression; or (3) a user’s geographic location in this state
or any part of this state.” Id. § 143A.002(a)(1)-(3). The State defines social media platforms as any
website or app (1) with more than 50 million active users in the United States in a calendar month,
(2) that is open to the public, (3) allows users to create an account, and (4) enables users to
communicate with each other “for the primary purpose of posting information, comments,
messages, or images.” Tex. Bus. & Com. Code §§ 120.001(1), 120.002(b); Tex. Civ. Prac. & Rem.
Code § 143A.003(c). HB 20 applies to sites and apps like Facebook, Instagram, Pinterest, TikTok,
Twitter, Vimeo, WhatsApp, and YouTube. (Prelim. Inj. Mot., Dkt. 12, at 11); (see CCIA Decl., Dkt.
12-1, at 3–4; NetChoice Decl., Dkt. 12-2, at 3–4). HB 20 excludes certain companies like Internet
service providers, email providers, and sites and apps that “consist[] primarily of news, sports,
entertainment, or other information or content that is not user generated but is preselected by the
provider” and for which user comments are “incidental to” the content. Tex. Bus. & Com. Code §
120.001(1)(A)–(C). HB 20 carves out two content-based exceptions to Section 7’s broad prohibition:
(1) platforms may moderate content that “is the subject of a referral or request from an organization
with the purpose of preventing the sexual exploitation of children and protecting survivors of sexual
abuse from ongoing harassment,” and (2) platforms may moderate content that “directly incites
criminal activity or consists of specific threats of violence targeted against a person or group because
of their race, color, disability, religion, national origin or ancestry, age, sex, or status as a peace officer or judge.” Tex. Civ. Prac. & Rem. Code § 143A.006(a)(2)–(3).
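For illustration only, the coverage test described above can be stated mechanically. The following minimal sketch (Python; the field names, boolean structure, and helper are this edit’s assumptions, not statutory text) applies the definition and its exclusions:

```python
# A minimal, illustrative paraphrase of HB 20's coverage test as described
# above (Tex. Bus. & Com. Code §§ 120.001(1), 120.002(b)). Field names and
# structure are assumptions for illustration, not statutory text.
from dataclasses import dataclass

USER_THRESHOLD = 50_000_000  # more than 50 million monthly active U.S. users

@dataclass
class Service:
    monthly_active_us_users: int
    open_to_public: bool
    allows_user_accounts: bool
    primary_purpose_is_user_posts: bool  # posting info, comments, messages, images
    is_isp_or_email_provider: bool       # excluded from the definition
    primarily_preselected_content: bool  # news/sports/entertainment exclusion

def is_covered_platform(s: Service) -> bool:
    """Rough predicate for whether a service is a 'social media platform'."""
    if s.is_isp_or_email_provider or s.primarily_preselected_content:
        return False  # carved out of the statutory definition
    return (s.monthly_active_us_users > USER_THRESHOLD
            and s.open_to_public
            and s.allows_user_accounts
            and s.primary_purpose_is_user_posts)
```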
HB 20 also requires social media platforms to meet disclosure and operational requirements.
Tex. Bus. & Com. Code §§ 120.051, 120.101–.104 (“Section 2”). Section 2 requires platforms to
publish “acceptable use policies,” set up an “easily accessible” complaint system, produce a
“biannual transparency report,” and “publicly disclose accurate information regarding its content
management, data management, and business practices, including specific information regarding
how the social media platform: (i) curates and targets content to users; (ii) places and promotes
content, services, and products, including its own content, services, and products; (iii) moderates
content; (iv) uses search, ranking, or other algorithms or procedures that determine results on the
platform; and (v) provides users’ performance data on the use of the platform and its products and services.” Id. § 120.051(a).
If a user believes a platform has improperly “censored” their viewpoint under Section 7, the
user can sue the platform, which may be enjoined, and obtain attorney’s fees. Tex. Civ. Prac. & Rem.
Code § 143A.007(a), (b). Lawsuits can be brought by any Texan and anyone doing business in the
state or who “shares or receives expression in this state.” Id. §§ 143A.002(a), 143A.004(a), 143A.007.
In addition, the Attorney General of Texas may “bring an action to enjoin a violation or a potential
violation” of HB 20 and recover attorney’s fees. Id. § 143A.008. Failure to comply with Section 2’s requirements also subjects social media platforms to suit: the Texas Attorney General may seek injunctive relief and, if successful, collect attorney’s fees and “reasonable investigative costs.”
Finally, HB 20 contains a severability clause. HB 20 § 8(a). “If
any application of any provision in this Act to any person, group of persons, or circumstances is
found by a court to be invalid or unconstitutional, the remaining applications of that provision to all
other persons and circumstances shall be severed and may not be affected.” Id. § 8(b).
HB 20 goes into effect on December 2, 2021.
Plaintiffs recently challenged a similar Florida law in the Northern District of Florida in
NetChoice v. Moody, successfully obtaining a preliminary injunction to halt the enforcement of that
law. The district court in that case described the Florida legislation as “an effort to rein in social-
media providers deemed too large and too liberal.” No. 4:21CV220-RH-MAF, 2021 WL 2690876, at
*12 (N.D. Fla. June 30, 2021). The Florida court preliminarily enjoined enforcement of that law, and its injunction has been appealed to the Eleventh Circuit.
B. Procedural Background
Plaintiffs are two trade associations with members that operate social media platforms that
would be affected by HB 20. (Compl., Dkt. 1, at 1–2); (Prelim. Inj. Mot., Dkt. 12, at 11). Plaintiffs
filed their lawsuit on September 22, 2021, challenging HB 20 because it violates the First
Amendment; is void for vagueness; violates the commerce clause, full faith and credit clause, and the
Fourteenth Amendment’s due process clause; is preempted under the supremacy clause by the
Communications Decency Act, 47 U.S.C. § 230; and violates the equal protection clause of the
Fourteenth Amendment. (Compl., Dkt. 1, at 31, 35, 38, 41, 44). In their motion for preliminary
injunction, Plaintiffs request that this Court preliminarily enjoin the Texas Attorney General from
enforcing Sections 2 and 7 of HB 20 against Plaintiffs and their members. (Dkt. 12, at 54).
In response to the motion for preliminary injunction, the State requested expedited
discovery, (Mot. Discovery, Dkt. 20), which Plaintiffs opposed, (Dkt. 22). The Court granted the
State’s request, in part, permitting “narrowly-tailored, expedited discovery” before the State would
be required to respond to the preliminary injunction motion. (Order, Dkt. 25, at 3). The Court
expressed its confidence in the State to “significantly tailor its discovery requests . . . to obtain
precise information without burdening Plaintiffs’ members.” (Id. at 4). Several days later, Plaintiffs
filed a motion for protective order, (Dkt. 29), which the Court granted, (Order, Dkt. 36). In that
Order, the Court allowed the State to depose Plaintiffs’ declarants, request documents relied on by the declarants, and serve interrogatories.
Additionally, the State filed a motion to dismiss about two weeks after Plaintiffs filed their
motion for preliminary injunction. (Mot. Dismiss, Dkt. 23). The State argues that Plaintiffs lack
associational or organizational standing. (Id.). Plaintiffs respond that they have associational standing
to represent their members covered by HB 20 and also have organizational standing. (Resp. Mot. Dismiss, Dkt. 28).
Finally, Plaintiffs filed a motion to strike the expert report of Adam Candeub, which was
attached to the State’s opposition to the preliminary injunction motion. (Mot. Strike, Dkt. 43).
Plaintiffs challenge the report by Candeub, who is a law professor at Michigan State University, for
being a “second legal brief” that offers “nothing more than (incorrect) legal conclusions.” (Id. at 2).
Plaintiffs argue that it is well-established that an expert may not render conclusions of law. (Id.).
They also argue that his “methodology” is unreliable because his tests are simply legal standards. (Id.
at 4–5). Immediately before this Court issued this opinion, the State filed an opposition brief. (Dkt.
50). Because the Court does not rely on Candeub’s report, the Court will dismiss Plaintiffs’ motion to strike as moot.
II. LEGAL STANDARDS
Federal Rule of Civil Procedure 12(b)(1) allows a party to assert lack of subject-matter
jurisdiction as a defense to suit. Fed. R. Civ. P. 12(b)(1). Federal district courts are courts of limited
subject matter jurisdiction and may only exercise such jurisdiction as is expressly conferred by the
Constitution and federal statutes. Kokkonen v. Guardian Life Ins. Co. of Am., 511 U.S. 375, 377 (1994).
A federal court properly dismisses a case for lack of subject matter jurisdiction when it lacks the
statutory or constitutional power to adjudicate the case. Home Builders Ass’n of Miss., Inc. v. City of
Madison, 143 F.3d 1006, 1010 (5th Cir. 1998). “The burden of proof for a Rule 12(b)(1) motion to
dismiss is on the party asserting jurisdiction.” Ramming v. United States, 281 F.3d 158, 161 (5th Cir.
2001), cert. denied, 536 U.S. 960 (2002). “Accordingly, the plaintiff constantly bears the burden of
proof that jurisdiction does in fact exist.” Id. In ruling on a Rule 12(b)(1) motion, the court may
consider any one of the following: (1) the complaint alone; (2) the complaint plus undisputed facts
evidenced in the record; or (3) the complaint, undisputed facts, and the court’s resolution of
disputed facts. Lane v. Halliburton, 529 F.3d 548, 557 (5th Cir. 2008).
A preliminary injunction is an extraordinary remedy, and the decision to grant such relief is
to be treated as the exception rather than the rule. Valley v. Rapides Parish Sch. Bd., 118 F.3d 1047,
1050 (5th Cir. 1997). “A plaintiff seeking a preliminary injunction must establish that he is likely to
succeed on the merits, that he is likely to suffer irreparable harm in the absence of preliminary relief,
that the balance of equities tips in his favor, and that an injunction is in the public interest.” Winter v.
Nat. Res. Def. Council, Inc., 555 U.S. 7, 20 (2008). The party seeking injunctive relief carries the burden
of persuasion on all four requirements. PCI Transp., Inc. v. Fort Worth & W. R.R. Co., 418 F.3d 535, 545 (5th Cir.
2005).
III. DISCUSSION
In its motion to dismiss, the State asserts that Plaintiffs lack associational and organizational
standing and their complaint should be dismissed. (Dkt. 23). Under Article III of the Constitution,
federal court jurisdiction is limited to cases and controversies. U.S. Const. art. III, § 2, cl. 1; Raines v.
Byrd, 521 U.S. 811, 818 (1997). A key element of the case-or-controversy requirement is that a
plaintiff must establish standing to sue. See Lujan v. Defenders of Wildlife, 504 U.S. 555, 561 (1992). To
establish Article III standing, a plaintiff must demonstrate that she has “(1) suffered an injury-in-
fact, (2) that is fairly traceable to the challenged conduct of the defendant, and (3) that is likely to be
redressed by a favorable judicial decision.” Id. at 560–61. “[W]hen standing is challenged on the basis
of the pleadings, we ‘accept as true all material allegations of the complaint, and . . . construe the
complaint in favor of the complaining party.”’ Pennell v. City of San Jose, 485 U.S. 1, 7 (1988) (quoting Warth v. Seldin, 422 U.S. 490, 501 (1975)).
“Associations may assert the standing of their own members.” Texas Ass’n of Manufacturers v.
United States Consumer Prod. Safety Comm’n, 989 F.3d 368, 377 (5th Cir. 2021). An association must
meet three elements to establish associational standing: (1) “its members would otherwise have
standing to sue in their own right,” (2) “the interests at stake are germane to the organization’s
purpose,” and (3) “neither the claim asserted nor the relief requested requires the participation of
individual members in the lawsuit.” Id. Plaintiffs easily meet these requirements for associational
standing. The Court steps through each of the three requirements below.
Plaintiffs’ members include social media platforms like “Facebook, Google, YouTube, [and]
Twitter,” as recognized by the State, (Mot. Dismiss, Dkt. 23, at 3), that would be subject to
regulation by the State through HB 20. Despite the State’s contention otherwise, (Mot. Dismiss,
Dkt. 23, at 3–4), Plaintiffs show that their members would suffer an injury-in-fact if HB 20 goes into
effect. “[A] plaintiff satisfies the injury-in-fact requirement where he alleges ‘an intention to engage
in a course of conduct arguably affected with a constitutional interest, but proscribed by a statute,
and there exists a credible threat of prosecution thereunder.”’ Susan B. Anthony List v. Driehaus, 573
U.S. 149, 159 (2014) (quoting Babbitt v. Farm Workers, 442 U.S. 289, 298 (1979)). In their complaint,
Plaintiffs allege that their members are “directly subject to and regulated by H.B. 20 because they
qualify as ‘social media platforms’ within H.B. 20’s definition of the term,” “exercise editorial
judgments that are prohibited by H.B. 20,” and will “face serious legal consequences for failing to
comply with” HB 20. (Compl., Dkt. 1, at 5–6). Plaintiffs state that some of their members, like
Facebook and YouTube, would be compelled to publish content that violates their policies and
otherwise would be removed through their exercise of editorial judgment. (Compl., Dkt. 1, at 6).
Plaintiffs’ members do not resemble ‘“passive receptacle[s]’ where users are free to share their
speech without review or rebuke unless unlawful,” as the State claims. (Mot. Dismiss, Dkt. 23, at 5).
Plaintiffs also allege that “Paxton has given every indication that he intends to use all legally available
enforcement tools against Plaintiffs’ members” and support that allegation with Paxton’s press
releases and posts. (Id. at 10) (“In a January 9, 2021, tweet criticizing Twitter, Facebook, and Google
for allegedly targeting ‘conservative’ speech, Defendant Paxton vowed, ‘As AG, I will fight them . . . .’”).
Additionally, Plaintiffs have alleged that HB 20 threatens their members with classic
economic harms. “[E]conomic injury is a quintessential injury upon which to base standing.” Tex.
Democratic Party v. Benkiser, 459 F.3d 582, 586 (5th Cir. 2006). In their Complaint, Plaintiffs allege that
their members “will incur significant costs to comply with the provisions in Sections 2 and 7 of H.B.
20. The statute will force members to substantially modify the design and operation of their
platforms. The necessary modifications will impose onerous burdens upon members’ respective
platforms and services, interfering with their business models and making it more difficult for them
to provide high quality services to their users.” (Compl., Dkt. 1, at 7). Furthermore, Plaintiffs allege
their members will suffer damage to their brands and goodwill, (id. at 8), and their members will be
forced to disclose technical information that will cost them competitive advantage and make it
harder to block content, (id. at 7–8). Based on these detailed allegations, the complaint sufficiently alleges an injury-in-fact.
The State does not dispute this prong of the standing analysis. As Plaintiffs note in their
opposition brief: “Defendant does not dispute Plaintiffs satisfy the second prong. Nor could he.
H.B. 20’s intrusion on the rights of Internet websites and applications is germane to Plaintiffs’ missions.” (Resp. Mot. Dismiss, Dkt. 28).
The State argues that Plaintiffs’ claims require the participation of Plaintiffs’ members. (Mot.
Dismiss, Dkt. 23, at 14). Plaintiffs seek to block the State’s enforcement of the provisions of HB 20
that are facially unconstitutional. A facial challenge generally is not fact intensive and does not
require individual members to participate. Nat’l Press Photographers Ass’n v. McCraw, 504 F. Supp. 3d
568, 580 (W.D. Tex. 2020) (recognizing associational standing to bring “facial” “content-based,”
“vagueness,” “overbreadth,” and “preemption” challenges). Plaintiffs assert facial challenges “based
on the doctrines of compelled speech, infringing editorial discretion, a ‘content-based’ and speaker-
based law, ‘vagueness,’ ‘overbreadth,’ ‘preemption,’ and extraterritorial regulation.” (Resp. Mot.
Dismiss, Dkt. 28, at 21). Each doctrine forms the basis for finding HB 20 facially invalid. (See id.)
(citing Nat’l Press, 504 F. Supp. 3d at 580; Nat’l Inst. of Family & Life Advocates v. Becerra, 138 S. Ct.
2361, 2378 (2018) (sustaining content-based facial challenge based on compelled speech); Miami
Herald Pub. Co. v. Tornillo, 418 U.S. 241, 258 (1974) (sustaining content-based facial challenge based
on infringing editorial discretion); Ass’n for Accessible Medicines v. Frosh, 887 F.3d 664, 668 (4th Cir.
2018) (a “state law violates the extraterritoriality principle if it [] expressly applies to out-of-state
commerce”) (emphasis added); Garza v. Wyeth LLC, 2015 WL 364286, at *4 (S.D. Tex. Jan. 27,
2015) (“The preemption decision is not evidence-based but is rather a question of law.”)). While the
State argues the Court cannot determine whether Plaintiffs’ members are common carriers, which
the State argues is a crucial step in this Court’s First Amendment analysis, without the participation
of Plaintiffs’ members, (Mot. Dismiss, Dkt. 23, at 15), the Court finds that it can determine, if
necessary, whether Plaintiffs’ members are common carriers. Likewise, the Court can rule on
Plaintiffs’ other facial challenges, like their commerce clause claim, and conduct the proper level of
scrutiny analysis on Plaintiffs’ First Amendment claim. Additionally, Plaintiffs’ requested relief—
enjoining Paxton from enforcing Sections 2 and 7 of HB 20 against them and their members—is a
proper and tailored remedy that would not necessarily require the individual participation of their
members. “Injunctive relief ‘does not make the individual participation of each injured party
indispensable to proper resolution[.]”’ Texas Ent. Ass’n, Inc. v. Hegar, 10 F.4th 495, 505 (5th Cir.
2021) (quoting Hunt v. Washington State Apple Advert. Comm’n, 432 U.S. 333, 342 (1977)).
Plaintiffs also have organizational standing to challenge HB 20. In their complaint, Plaintiffs allege that they have “already incurred
costs and will continue to divert their finite resources—money, staff, and time and attention—away
from other pressing issues facing their members to address compliance with and the implications of
H.B. 20 for Internet companies.” (Compl., Dkt. 1, at 5). Plaintiffs continue that they would “no
longer divert those finite resources to address H.B. 20” if it were declared unlawful and enjoined.
(Id.). Plaintiffs’ injury as an organization need not be “large” or “substantial.” OCA-Greater Houston v.
Texas, 867 F.3d 604, 612 (5th Cir. 2017) (“[I]t need not measure more than an ‘identifiable trifle.’
This is because ‘the injury in fact requirement under Article III is qualitative, not quantitative, in
nature.’”) (quoting Ass’n of Cmty. Organizations for Reform Now v. Fowler, 178 F.3d 350, 357 (5th Cir.
1999)). Plaintiffs sufficiently allege that they have diverted resources and incurred expenses as an
organization to prepare for HB 20’s effects on Plaintiffs’ members. See id. at 611–14.
Having considered the State’s arguments and having found that Plaintiffs have both associational standing to challenge HB 20 on behalf of their members and organizational standing to
challenge it based on their own alleged injuries, the Court denies the State’s motion to dismiss. This
Court’s ruling is supported by the fact that the Northern District of Florida enjoined a similar
Florida law challenged by these same Plaintiffs, and no party in that case—in which the State of Texas filed an amicus brief—disputed Plaintiffs’ standing to assert the claims of their members.
Plaintiffs bring several claims against the State, and the Court focuses on Plaintiffs’ claim
that HB 20 violates the First Amendment.1 To succeed on their motion for a preliminary injunction,
then, Plaintiffs must show that HB 20 compels private social media platforms to “disseminate third-
party content and interferes with their editorial discretion over their platforms.”2 (Prelim. Inj. Mot., Dkt. 12).
1 The Court need not and does not reach the issues of whether HB 20 is void for vagueness, preempted by
the Communications Decency Act, or violates the Commerce Clause.
2 Findings and conclusions about the merits of this case should be understood only as statements about
Plaintiffs’ likelihood of success based on the record and law currently before this Court.
The parties dispute whether social media platforms are more akin to newspapers that engage
in substantial editorial discretion—and therefore are entitled to a higher level of protection for their
speech—or to common carriers that act as passive conduits for content posted by users—and therefore are entitled to a lower level of protection, if any. Plaintiffs urge the Court to view social
media platforms as having editorial discretion to moderate content, and the State advocates that
social media platforms act as common carriers that may be compelled by the government to publish
speech that is objectionable. Before the Court attempts to settle that debate, the Court evaluates
whether the First Amendment guarantees social media platforms the right to exercise editorial
discretion.
More than twenty years ago, the Supreme Court recognized that “content on the Internet is
as diverse as human thought,” allowing almost any person to “become a town crier with a voice that
resonates farther than it could from any soapbox.” Reno v. Am. C.L. Union, 521 U.S. 844, 870 (1997).
The Reno Court concluded that its “cases provide no basis for qualifying the level of First
Amendment scrutiny that should be applied to this medium.” Id. Disseminating information is
“speech within the meaning of the First Amendment.” Sorrell v. IMS Health Inc., 564 U.S. 552, 570
(2011) (citing Bartnicki v. Vopper, 532 U.S. 514, 527 (2001) (“[I]f the acts of ‘disclosing’ and
‘publishing’ information do not constitute speech, it is hard to imagine what does fall within that category.”)).
Social media platforms have a First Amendment right to moderate content disseminated on
their platforms. See Manhattan Cmty. Access Corp. v. Halleck, 139 S. Ct. 1921, 1932 (2019) (recognizing
that “certain private entities[] have rights to exercise editorial control over speech and speakers on
their properties or platforms”). Three Supreme Court cases provide guidance. First, in Tornillo, the
Court struck down a Florida statute that required newspapers to print a candidate’s reply if a
newspaper assailed her character or official record, a “right of reply” statute. 418 U.S. at 243. In
1974, when the opinion was released, the Court noted there had been a “communications
revolution” including that “[n]ewspapers have become big business . . . [with] [c]hains of
newspapers, national newspapers, national wire and news services, and one-newspaper towns [being]
the dominant features of a press that has become noncompetitive and enormously powerful and
influential in its capacity to manipulate popular opinion and change the course of events.” Id. at
248–49. Those concerns echo today with social media platforms and “Big Tech,” even as newspapers further consolidate and, often, die out. Back in 1974, when newspapers were
viewed with monopolistic suspicion, the Supreme Court concluded that newspapers exercised
“editorial control and judgment” by selecting the “material to go into a newspaper,” deciding the
“limitations on the size and content of the paper,” and deciding how to treat “public issues and
public officials—whether fair or unfair.” Id. at 258. “It has yet to be demonstrated how
governmental regulation of this crucial process can be exercised consistent with First Amendment guarantees of a free press as they have evolved to this time.” Id.
In Hurley v. Irish-Am. Gay, Lesbian & Bisexual Grp. of Bos., the Supreme Court held that a
private parade association had the right to exclude a gay rights group from having their own float in
their planned parade without being compelled by a state statute to do otherwise. 515 U.S. 557, 572–
73 (1995). The Massachusetts law at issue—which prohibited discrimination in any public place of
“public accommodation, resort[,] or amusement”—did not “target speech or discriminate on the
basis of its content, the focal point of its prohibition being rather on the act of discriminating against
individuals.” Id. at 572. The Court reasoned that the state’s equal-access law “alter[ed] the expressive
content” of the private organization. Id. “[T]his use of the State’s power violates the fundamental
rule of protection under the First Amendment, that a speaker has the autonomy to choose the
content of his own message.” Id. at 573. The Court clarified: “Indeed this general rule, that the
speaker has the right to tailor the speech, applies not only to expressions of value, opinion, or
endorsement, but equally to statements of fact the speaker would rather avoid.” Id.
Finally, the Supreme Court ruled that California could not require a private utility company
to include a third party’s newsletters when it sent bills to customers in Pac. Gas & Elec. Co. v. Pub.
Utilities Comm’n of California, 475 U.S. 1, 20–21 (1986). There, for decades, the private utility company
sent a newsletter to its customers with monthly bills, and California required it to include the third-
party newsletter, a newsletter the private utility company disagreed with. Id. at 4–5. Relying on
Tornillo, the Court analogized that “[j]ust as the State is not free to tell a newspaper in advance what
it can print and what it cannot, the State is not free either to restrict [the private utility company’s]
speech to certain topics or views or to force [it] to respond to views that others may hold.” Id. at 11
(internal quotation marks and citations omitted). “[A] forced access rule that would accomplish
these purposes indirectly is similarly forbidden.” Id. The private utility company had the “right to be
free from government restrictions that abridge its own rights in order to enhance the relative voice
of its opponents.” Id. at 14 (internal quotation marks omitted). That was because a corporation has
the “choice of what not to say” and cannot be compelled to “propound political messages with which it disagrees.” Id. at 16.
The Supreme Court’s holdings in Tornillo, Hurley, and PG&E stand for the general
proposition that private companies that use editorial judgment to choose whether to publish
content—and, if they do publish content, use editorial judgment to choose what they want to
publish—cannot be compelled by the government to publish other content. That proposition has
repeatedly been recognized by courts. (See Prelim. Inj. Mot., Dkt. 12, at 26) (collecting cases).
Satisfied that such editorial discretion is protected from government-compelled speech, the Court turns to whether social media platforms are common carriers.
This Court starts from the premise that social media platforms are not common carriers.3
“Equal access obligations . . . have long been imposed on telephone companies, railroads, and postal
services, without raising any First Amendment issue.” United States Telecom Ass’n v. Fed. Commc’ns
Comm’n, 825 F.3d 674, 740 (D.C. Cir. 2016). Little First Amendment concern exists because
common carriers “merely facilitate the transmission of speech of others.” Id. at 741. In United States
Telecom, the D.C. Circuit added broadband providers to that list. Id. Unlike broadband
providers and telephone companies, social media platforms “are not engaged in indiscriminate,
neutral transmission of any and all users’ speech.” Id. at 742. User-generated content on social media
platforms is screened and sometimes moderated or curated. The State objects that the screening is
done by an algorithm, not a person, but whatever the method, social media platforms are not mere
conduits. On the State’s own framing, the inquiry could end here, with Plaintiffs needing to prove nothing more to show they engage in protected editorial discretion. During the hearing, the Court asked the
State, “[T]o what extent does a finding that these entities are common carriers, to what extent is that
important from your perspective in the bill’s ability to survive a First Amendment challenge?” (See
Minute Entry, Dkt. 47). Counsel for the State responded, “[T]he common carriage doctrine is
essential to the First Amendment challenge. It’s why it’s the threshold issue that we’ve briefed . . . .
It dictates the rest of this suit in terms of the First Amendment inquiry.” (Id.). As appealing as the
State’s invitation is to stop the analysis here, the Court continues in order to make a determination
about whether social media platforms exercise editorial discretion or occupy a purgatory between common carrier and editor.
Social media platforms “routinely manage . . . content, allowing most, banning some,
arranging content in ways intended to make it more useful or desirable for users, sometimes adding
3 HB 20’s pronouncement that social media platforms are common carriers, Tex. H.B. No. 20, 87th Leg., 2nd
Sess. § 1(4) (2021), does not impact this Court’s legal analysis.
their own content.” NetChoice, 2021 WL 2690876, at *7. Making those decisions entails some level of
editorial discretion, id., even if portions of those tasks are carried out by software code. While this
Court acknowledges that a social media platform’s editorial discretion does not fit neatly with our
20th Century vision of a newspaper editor hand-selecting an article to publish, focusing on whether
a human or AI makes those decisions is a distraction. It is indeed new and exciting—or frightening,
depending on whom you ask—that algorithms do some of the work that a newspaper publisher
previously did, but the core question is still whether a private company exercises editorial discretion
over the dissemination of content, not the exact process used. Plaintiffs’ members also push back on
the idea that content moderation does not involve judgment. For example, Facebook states that it
makes decisions about “billions of pieces of content” and “[a]ll such decisions are unique and
context-specific[] and involve some measure of judgment.” (Facebook Decl., Dkt. 12-4, at 9).
This Court is convinced that social media platforms, or at least those covered by HB 20,
curate both users and content to convey a message about the type of community the platform seeks
to foster and, as such, exercise editorial discretion over their platform’s content. Indeed, the text of
HB 20 itself points to social media platforms doing more than transmitting communication. In
Section 2, HB 20 recognizes that social media platforms “(1) curate[] and target[] content to users,
(2) place[] and promote[] content, services, and products, including its own content, services, and
products, (3) moderate[] content, and (4) use[] search, ranking, or other algorithms or procedures
that determine results on the platform.” Tex. Bus. & Com. Code § 120.051(a)(1)–(4). Finally, the
State’s own basis for enacting HB 20 acknowledges that social media platforms exercise editorial discretion: in signing the bill, Governor Abbott decried a “dangerous movement by social media companies to silence conservative viewpoints and ideas.” Governor Abbott Signs Law Protecting Texans from Wrongful Social Media Censorship,
Office of the Tex. Governor (Sept. 9, 2021). Senator Hughes likewise stated that HB 20 would let Texans “speak without being censored by West Coast oligarchs.” Bryan Hughes (@SenBryanHughes), Twitter, https://twitter.com/SenBryanHughes/status/1424846466183487492. Just like the Florida law, a “constant theme of [Texas] legislators, as well as
the Governor . . . , was that the [platforms’] decisions on what to leave in or take out and how to
present the surviving material are ideologically biased and need to be reined in.” NetChoice, 2021 WL
2690876, at *7. Without editorial discretion, social media platforms could not skew their platforms
ideologically, as the State accuses them of doing. Taking it all together, case law, HB 20’s text, and
the Governor and state legislators’ own statements all acknowledge that social media platforms
exercise some form of editorial discretion, whether or not the State agrees with how that discretion
is exercised.
Section 7 broadly prohibits covered platforms from “censor[ing]” a user or a user’s expression based on viewpoint. Tex. Civ. Prac. & Rem. Code §§ 143A.001(1), 143A.002. The State emphasizes that HB 20 “does not
prohibit content moderation. That is clear from the fact that [HB 20] has an entire provision
dictating that the companies should create acceptable use policies . . . [a]nd then moderate their
content accordingly.” (See Minute Entry, Dkt. 47). The State claims that social media platforms
could prohibit content categories “such as ‘terrorist speech,’ ‘pornography,’ ‘spam,’ or ‘racism’” to
prevent those content categories from flooding their platforms. (Resp. Prelim. Inj. Mot., Dkt. 39, at
21). During the hearing, the State explained that a social media platform “can’t discriminate against
users who post Nazi speech . . . and [not] discriminate against users who post speech about the anti-
white or something like that.” (See Minute Entry, Dkt. 47). Plaintiffs point out the fallacy in the
State’s assertion with an example: in one context, a video of Adolf Hitler making a speech promotes Nazism, and a platform should be able to moderate that content; in another context, the same video points out the atrocities of the Holocaust, and a platform should be able to disseminate that content. (See id.). HB 20 seems to place social media platforms in the
untenable position of choosing, for example, to promote Nazism against its wishes or ban Nazism
as a content category. (Prelim. Inj. Mot., Dkt. 12, at 29). As YouTube put it, “YouTube will face an
impossible choice between (1) risking liability by moderating content identified to violate its
standards or (2) subjecting YouTube’s community to harm by allowing violative content to remain on the platform.” (YouTube Decl., Dkt. 12-3). Section 7’s prohibitions on “censorship,” which compel platforms to disseminate content they would otherwise moderate, violate the First Amendment. The platforms’ policies against certain content themselves express a viewpoint, and disallowing platforms from applying those policies requires them to “alter
the expressive content of their [message].” Hurley, 515 U.S. at 572–73. HB 20’s restrictions on
actions that “de-boost” and “deny equal access or visibility to or otherwise discriminate against
expression” impede platforms’ ability to place “post[s] in the proper feeds.” Tex. Civ. Prac. & Rem.
Code § 143A.001(1); NetChoice, 2021 WL 2690876, at *3. Social media platforms “must determine
how and where users see those different viewpoints, and some posts will necessarily have places of
prominence.” See NetChoice, 2021 WL 2690876, at *3. HB 20 compels social media platforms to
significantly alter and distort their products. Moreover, “the targets of the statutes at issue are the
editorial judgments themselves” and the “announced purpose of balancing the discussion—reining
in the ideology of the large social-media providers—is precisely the kind of state action held
unconstitutional in Tornillo, Hurley, and PG&E.” Id. HB 20 also impermissibly burdens social media
platforms’ own speech. Id. at *9 (“[T]he statutes compel the platforms to change their own speech in
other respects, including, for example, by dictating how the platforms may arrange speech on their
sites.”). For example, if a platform appends its own speech to label a post as misinformation, the
platform may be discriminating against that user’s viewpoint by adding its own disclaimer. HB 20
restricts social media platforms’ First Amendment right to engage in expression when they disagree with a user’s content.
Furthermore, the threat of lawsuits for violating Section 7 of HB 20 chills the social media platforms’ exercise of editorial discretion. HB 20 compounds that chill by authorizing the Texas Attorney General to sue for violations—and even “potential” violations—of
Section 7’s “censorship” restrictions. Tex. Civ. Prac. & Rem. Code §§ 143A.002, 143A.008. In
response to the State’s interrogatories, NetChoice explained that the “threat of myriad lawsuits
based on individual examples of content moderation threaten and chill the broad application of
those [content moderation] policies, and thus H.B. 20’s anti-moderation provisions interfere with
Plaintiff’s members’ policies and practices. . . . Using YouTube as an example, hate speech is
necessarily ‘viewpoint’-based, as abhorrent as those viewpoints may be. And removing such hate
speech and assessing penalties against users for submitting that content is ‘censor[ship]’ as defined by H.B. 20.”
HB 20 additionally violates Plaintiffs’ members’ First Amendment rights with its Section 2
requirements. First, under Section 2, a social media platform must provide “public disclosures”
about how the platform operates in a manner “sufficient to enable users to make an informed
4 The Court notes that two other Supreme Court cases address this topic, but neither applies here. PruneYard
Shopping Center v. Robins is distinguishable from the facts of this case. 447 U.S. 74 (1980). In PruneYard, the
Supreme Court upheld a California law that required a shopping mall to host people collecting petition
signatures, concluding there was no “intrusion into the function of editors” since the shopping mall’s
operation of its business lacked an editorial function. Id. at 88. Critically, the shopping mall did not engage in
expression and “the [mall] owner did not even allege that he objected to the content of the [speech]; nor was
the access right content based.” PG&E, 475 U.S. at 12. Similarly, Rumsfeld v. Forum for Academic & Institutional Rights, Inc. has no bearing on this Court’s holding because it did not involve government
restrictions on editorial functions. 547 U.S. 47 (2006). The challenged law required schools that allowed
employment recruiters on campus to also allow military employment recruiters on campus—a restriction on
“conduct, not speech.” Id. at 62, 65. As the Supreme Court explained, “accommodating the military’s message
does not affect the law schools’ speech, because the schools are not speaking when they host interviews and
recruiting receptions.” Id. at 64.
choice regarding the purchase of or use of access to or services from the platform.” Tex. Bus. &
Com. Code § 120.051(b). HB 20 states that each platform must disclose how it “(1) curates and
targets content to users; (2) places and promotes content, services, and products, including its own
content, services, and products; (3) moderates content; [and] (4) uses search, ranking, or other
algorithms or procedures that determine results on the platform[.]” Id. § 120.051(a)(1)–(4). Second, a
social media platform must “publish an acceptable use policy” that explains what content the
platform will allow, how the platform will ensure compliance with the policy, and how users can
inform the platform about noncompliant content. Id. § 120.052. Third, a social media platform must
publish a “biannual transparency report” that details the platform’s enforcement of its acceptable use policy, including:
▪ “the total number of instances in which the social media platform was alerted to
illegal content, illegal activity, or potentially policy-violating content” and by what
means (i.e., by users, employees, or automated processes);
▪ how often the platform “took action” with regard to such content including “content
removal,” “content demonetization,” “content deprioritization,” “the addition of an
assessment to content,” “account suspension,” “account removal,” and “any other
action” that accords with the acceptable use policy, “categorized by” “the rule
violated” and “the source for the alert”;
▪ “the country of the user who provided the content for each instance described”
above;
▪ “the number of instances in which a user appealed the decision to remove the user’s potentially policy-violating content,” among other enforcement data. Id. § 120.053.
Fourth, a social media platform must provide a “complaint system to enable a user to submit
a complaint in good faith and track the status of the complaint” regarding either a report of violative
content or “a decision made by the social media platform to remove content posted by the user.” Id.
§ 120.101. For reports of illegal content, the covered platform must “make a good faith effort to
evaluate the legality of the content or activity within 48 hours of receiving the notice,” excluding weekends. Id. § 120.102.
Fifth, a social media platform must offer a notice and appeal system for any content that it
decides to remove. Subject to limited exceptions, every time a covered platform “removes” content,
it must give the user (1) a notice of the removal; (2) an opportunity to appeal; and (3) a written
explanation of the decision on appeal, including an explanation for any reversal. Id. § 120.103.
During the appeal process, a social media platform must “review the [removed] content,”
“determine whether the content adheres to the platform’s acceptable use policy,” and “take appropriate steps based on the determination.” Id. § 120.104.
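Read together, these provisions describe a workflow. The following minimal sketch (Python; all names, fields, and helpers are hypothetical reading aids, not statutory terms or a compliance design) restates that complaint-and-appeal flow:

```python
# An illustrative sketch of the complaint-and-appeal workflow described above
# (Tex. Bus. & Com. Code §§ 120.101–.104). Names and fields are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

REVIEW_WINDOW = timedelta(hours=48)  # good-faith window for illegal-content complaints

@dataclass
class Complaint:
    content_id: str
    alleges_illegal_content: bool
    received_at: datetime

def legality_review_deadline(c: Complaint) -> Optional[datetime]:
    """Return the good-faith evaluation deadline, when one applies."""
    return c.received_at + REVIEW_WINDOW if c.alleges_illegal_content else None

@dataclass
class Removal:
    content_id: str
    user_notified: bool = False  # notice of the removal
    appeal_open: bool = True     # opportunity to appeal
    explanation: str = ""        # written explanation of the appeal decision

def decide_appeal(r: Removal, adheres_to_policy: bool) -> Removal:
    """Review the content, determine policy adherence, and explain in writing."""
    r.explanation = ("Reversed: content adheres to the acceptable use policy."
                     if adheres_to_policy
                     else "Upheld: content violates the acceptable use policy.")
    r.appeal_open = False
    return r
```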
To pass constitutional muster, disclosure requirements like these must require only “factual
and noncontroversial information” and cannot be “unjustified or unduly burdensome.” NIFLA, 138
S. Ct. at 2372. Section 2’s disclosure and operational provisions are inordinately burdensome given
the unfathomably large numbers of posts on these sites and apps. For example, in three months in
2021, Facebook removed 8.8 million pieces of “bullying and harassment content,” 9.8 million pieces
of “organized hate content,” and 25.2 million pieces of “hate speech content.” (CCIA Decl., Dkt.
12-1, at 15). During the last three months of 2020, YouTube removed just over 2 million channels
and over 9 million videos because they violated its policies. (Id. at 16). While some of those removals
are subject to an existing appeals process, many removals are not. For example, in a three-month-
period in 2021, YouTube removed 1.16 billion comments. (YouTube Decl., Dkt. 12-3, at 23–24).
Those 1.16 billion removals were not appealable, but, under HB 20, they would have to be. (Id.).
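The scale of that obligation can be made concrete with back-of-the-envelope arithmetic, using the YouTube figure from the declaration cited above (the 90-day quarter and uniform distribution are simplifying assumptions):

```python
# Back-of-the-envelope arithmetic on the figure cited above: 1.16 billion
# comment removals in roughly a three-month period. The 90-day quarter and
# uniform distribution are simplifying assumptions, not record facts.
removed = 1_160_000_000
days = 90

per_day = removed / days       # ~12.9 million removals per day
per_second = per_day / 86_400  # ~149 removals per second

print(f"{per_day:,.0f} per day; {per_second:,.0f} per second")
```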
Over the span of six months in 2018, Facebook, Google, and Twitter took action on over 5 billion accounts or pieces of content, including
17 million cases of content regarding child safety, and 12 million cases of extremism, hate speech,
and terrorist speech. (NetChoice Decl., Dkt. 12-2, at 8). During the State’s deposition of Neil
Christopher Potts (“Potts”), who is Facebook’s Vice President of Trust and Safety Policy, Potts
stated that it would be “impossible” for Facebook “to comply with anything by December 1, [2021].
. . [W]e would not be able to change systems in that nature. . . . I don’t see a way that we would
actually be able to go forward with compliance in a meaningful way.” (Potts Depo., Dkt. 39-2, at 2,
46). Plaintiffs also express a concern that revealing “algorithms or procedures that determine results
on the platform” may reveal trade secrets or confidential and competitively-sensitive information.
Compelled disclosure regimes can also press actors in “civil society to speak when they otherwise would have refrained.” Washington Post v. McManus, 944
F.3d 506, 514 (4th Cir. 2019). “It is the presence of compulsion from the state itself that
compromises the First Amendment.” Id. at 515. The provisions also impose unduly burdensome
disclosure requirements on social media platforms “that will chill their protected speech.” NIFLA,
138 S. Ct. at 2378. The consequences of noncompliance also chill the social media platforms’ speech
and application of their content moderation policies and user agreements. Noncompliance can
subject social media platforms to serious consequences. The Texas Attorney General may seek
injunctive relief and collect attorney’s fees and “reasonable investigative costs” if successful in obtaining an injunction.
3. HB 20 Discriminates Based on Content and Speaker
HB 20 also impermissibly discriminates based on content and speaker. First, HB 20 excludes two types of content from its prohibition on content
moderation and permits social media platforms to moderate content: (1) that “is the subject of a
referral or request from an organization with the purpose of preventing the sexual exploitation of
children and protecting survivors of sexual abuse from ongoing harassment,” and (2) that “directly
incites criminal activity or consists of specific threats of violence targeted against a person or group
because of their race, color, disability, religion, national origin or ancestry, age, sex, or status as a
peace officer or judge.” Tex. Civ. Prac. & Rem. Code § 143A.006(a)(2)–(3). When considering a city
ordinance that applied to ‘“fighting words’ that . . . provoke violence[] ‘on the basis of race, color,
creed, religion[,] or gender,”’ the Supreme Court noted that those “who wish to use ‘fighting words’
in connection with other ideas—to express hostility, for example, on the basis of political affiliation,
union membership, or homosexuality—are not covered.” R.A.V. v. City of St. Paul, Minn., 505 U.S. 377,
391 (1992). As Plaintiffs argue, the State has “no legitimate reason to allow the platforms to enforce
their policies over threats based only on . . . favored criteria but not” other criteria like sexual
orientation, military service, or union membership. (Prelim. Inj. Mot., Dkt. 12, at 35–36); see id.
HB 20 applies only to social media platforms of a certain size: platforms with more than 50 million monthly active users in the United States. Tex. Bus. & Com. Code § 120.002(b). HB 20 excludes
social media platforms such as Parler and sports and news websites. (See Prelim. Inj. Mot., Dkt. 12,
at 17). During the regular legislative session, a state senator unsuccessfully proposed lowering the
threshold to 25 million monthly users in an effort to include sites like “Parler and Gab, which are
popular among conservatives.” Shawn Mulcahy, Texas Senate approves bill to stop social media companies
from banning Texans for political views, TEX. TRIBUNE (Mar. 30, 2021), https://www.texastribune.org. Speaker-based distinctions of this kind are “often a tell for content discrimination.” NetChoice, 2021 WL 2690876, at *10. The discrimination
between speakers has special significance in the context of media because “[r]egulations that
discriminate among media, or among different speakers within a single medium, often present
serious First Amendment concerns.” Turner Broad. Sys., Inc. v. F.C.C., 512 U.S. 622, 659 (1994). The
record in this case confirms that the Legislature intended to target large social media platforms
perceived as being biased against conservative views and the State’s disagreement with the social
media platforms’ editorial discretion over their platforms. The evidence thus suggests that the State
discriminated between social media platforms (or speakers) for reasons that do not stand up to
scrutiny.
4. HB 20 Is Unconstitutionally Vague
Plaintiffs argue that HB 20 contains many vague terms, some of which the Court agrees are
prohibitively vague. “A fundamental principle in our legal system is that laws which regulate persons
or entities must give fair notice of conduct that is forbidden or required.” FCC v. Fox TV Stations,
Inc., 567 U.S. 239, 253 (2012). “[T]he void for vagueness doctrine addresses at least two connected
but discrete due process concerns: first, that regulated parties should know what is required of them
so they may act accordingly; second, precision and guidance are necessary so that those enforcing
the law do not act in an arbitrary or discriminatory way. When speech is involved, rigorous
adherence to those requirements is necessary to ensure that ambiguity does not chill protected speech.” Id. at 253–54.
First, Plaintiffs take issue with HB 20’s definition of “censor”: “block, ban, remove,
deplatform, demonetize, de-boost, restrict, deny equal access or visibility to, or otherwise
discriminate against expression.” Tex. Civ. Prac. & Rem. Code § 143A.001(1). Plaintiffs argue that
requiring social media platforms to provide “equal access or visibility to” content is “hopelessly
indeterminate.” (Prelim. Inj. Mot., Dkt. 12, at 37) (quoting id.). The Court agrees. A social media
platform is not a static snapshot in time like a hard-copy newspaper. It strikes the Court as nearly
impossible for a social media platform—that has at least 50 million users—to determine whether
any single piece of content has “equal access or visibility” versus another piece of content given the
huge numbers of users and content. Moreover, this requirement could “prohibit[] a social media
platform from” displaying content “in the proper feeds.” NetChoice, 2021 WL 2690876, at *3.
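The indeterminacy is structural: a ranked feed must order posts, and any ordering is unequal by construction. A minimal sketch (the post IDs and scores below are invented for illustration):

```python
# An illustrative sketch of why "equal access or visibility" is indeterminate
# in a ranked feed: any ordering gives some posts more prominent positions.
# Post IDs and scores are invented for illustration.
posts = [("post_a", 0.91), ("post_b", 0.87), ("post_c", 0.34)]

feed = sorted(posts, key=lambda p: p[1], reverse=True)
for position, (post_id, _) in enumerate(feed, start=1):
    # Position 1 is seen far more than position 3; no ordering of a feed
    # gives every post equal visibility.
    print(position, post_id)
```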
Second, Plaintiffs argue that the definition of “social media platform” is unclear. Under HB
20, social media platform means an “Internet website or application that is open to the public,
allows a user to create an account, and enables users to communicate with other users for the
primary purpose of posting information, comments, messages, or images.” Tex. Bus. & Com. Code
§ 120.001(1). Plaintiffs argue that it is unclear which websites and applications “enable[] users to
communicate with other users for the primary purpose of posting information, comments,
messages, or images.” Id. Without more, the Court is not persuaded that that phrase is impermissibly
vague.
The definition for “social media platform” excludes “an online service, application, or
website: (i) that consists primarily of news, sports, entertainment, or other information or content
that is not user generated but is preselected by the provider; and (ii) for which any chat, comments,
or interactive functionality is incidental to, directly related to, or dependent on the provision of the
content described by Subparagraph (i).” Id. § 120.001(1)(C)(i), (ii). Plaintiffs object to the word
“primarily” used to define excluded companies whose sites “primarily” consist of “news, sports, entertainment, or other information or content that is not user generated.” A person of “ordinary intelligence would have no idea what ‘primarily’ refers to as the relevant denominator.”
(Prelim. Inj. Mot., Dkt. 12, at 38). In this context, “primarily” is too indeterminate to enable
companies with a website, application, or online service to determine whether they are subject to HB
20’s prohibitions and requirements. Plaintiffs also contend that “[o]rdinary people would further
have no idea what makes a chat or comment section ‘incidental to, directly related to, or dependent
on’ a platform’s preselected content.” (Prelim. Inj. Mot., Dkt. 12, at 38) (quoting Tex. Bus. & Com.
Code § 120.001(1)(C)(ii)). Plaintiffs have not established that that terminology is impermissibly
vague.
Third, HB 20 empowers the Texas Attorney General to seek an injunction not just against
violations of the statute but also “potential violations.” Tex. Civ. Prac. & Rem. Code § 143A.008.
Unlike other statutes that specify that the potential violation must be imminent, HB 20 includes no
such qualification. See, e.g., Tex. Occ. Code § 1101.752(a) (authorizing the attorney general to seek
injunctive relief to abate a potential violation “if the commission determines that a person has
violated or is about to violate this chapter”). Subjecting social media platforms to suit for potential
violations, without a qualification, reaches almost all content moderation decisions platforms might
make, further chilling their First Amendment rights. (Prelim. Inj. Mot., Dkt. 12, at 39).
Fourth, Plaintiffs contend that Section 2’s disclosure and operational requirements are
overbroad and vague. “For instance, H.B. 20’s non-exhaustive list of disclosure requirements grants
the Attorney General substantial discretion to sue based on a covered platform’s failure to include
unenumerated information.” (Id.). While the Court agrees that these provisions may suffer from
infirmities, the Court cannot at this time find them unconstitutionally vague on their face.
5. HB 20 Does Not Survive Strict or Intermediate Scrutiny
As a content- and speaker-based restriction on speech, HB 20 is subject to strict scrutiny. Strict scrutiny is satisfied only if a state has adopted ‘“the least restrictive means of
achieving a compelling state interest.”’ Americans for Prosperity Found. v. Bonta, 141 S. Ct. 2373, 2383,
210 L. Ed. 2d 716 (2021) (quoting McCullen v. Coakley, 573 U.S. 464, 478 (2014)). Even under the less
rigorous intermediate scrutiny, the State must prove that HB 20 is ‘“narrowly tailored to serve a
significant government interest.’” Packingham v. North Carolina, 137 S. Ct. 1730, 1736 (2017) (quoting
McCullen, 573 U.S. at 477). The proclaimed government interests here fall short under both
standards.
The State offers two interests served by HB 20: (1) the “free and unobstructed use of public
forums and of the information conduits provided by common carriers” and (2) preventing discriminatory “practices by common carriers.” (Resp. Prelim. Inj. Mot., Dkt. 39, at 33–34) (internal quotation marks
and brackets omitted). The State’s first interest fails on several accounts. First, social media
platforms are privately owned platforms, not public forums. Second, this Court has found that the
covered social media platforms are not common carriers. Even if they were, the State provides no
convincing support for recognizing a governmental interest in the free and unobstructed use of
common carriers’ information conduits.5 Third, the Supreme Court rejected an identical government
interest in Tornillo. In Tornillo, Florida argued that “government has an obligation to ensure that a
wide variety of views reach the public.” Tornillo, 418 U.S. at 247–48. After detailing the “problems
related to government-enforced access,” the Court held that the state could not commandeer private
companies to facilitate that access, even in the name of reducing the “abuses of bias and
manipulative reportage [that] are . . . said to be the result of the vast accumulations of unreviewable
power in the modern media empires.” Id. at 250, 254. The State’s second interest—preventing
“discrimination” by social media platforms—has been rejected by the Supreme Court. Even given a
charitable reading, the State’s interest in leveling the ideological playing field is a “fatal objective” for the First Amendment’s “free speech commands.” Hurley, 515 U.S. at 578–79.
5 In PG&E, the Supreme Court did not recognize such a governmental interest; rather, it held that a private
utility company retained editorial discretion and could not be compelled to disseminate a third-party’s speech.
475 U.S. at 16–18. The Supreme Court’s narrow reasoning in Turner does not alter this Court’s analysis.
There, the Court applied “heightened First Amendment scrutiny” because the law at issue “impose[d] special
obligations upon cable operators and special burdens upon cable programmers.” Turner Broad. Sys., Inc., 512
U.S. at 641. The law, on its face, imposed burdens without reference to the content of speech, yet
“interfere[d] with cable operators’ editorial discretion” by requiring them to carry some broadcast stations. Id.
at 643–44, 662. When it held that cable operators must abide by the law and carry some broadcast channels,
the Court’s rationale turned on preventing “40 percent of Americans without cable” from losing “access to
free television programming.” Id. at 646. The analysis applied to the regulation of broadcast television has no
bearing on the analysis of Internet First Amendment protections. See Reno, 521 U.S. at 870.
Even if the State’s purported interests were compelling and significant, HB 20 is not
narrowly tailored. Sections 2 and 7 contain broad provisions with far-reaching, serious
consequences. When reviewing the similar statute passed in Florida, the Northern District of Florida
found that that statute was not narrowly tailored “like prior First Amendment restrictions.”
NetChoice, 2021 WL 2690876, at *11 (citing Reno, 521 U.S. at 882; Sable Commc’n of Cal., Inc. v. FCC,
492 U.S. 115, 131 (1989)). Rather, the court colorfully described it as “an instance of burning the
house to roast a pig.” Id. This Court could not do better in describing HB 20.
Plaintiffs point out that the State could have created its own unmoderated platform but
likely did not because the State’s true interest is divulged by statements made by legislators and
Governor Abbott: the State is concerned with “West Coast oligarchs” and the “dangerous
movement by social media companies to silence conservative viewpoints and ideas.” Bryan Hughes (@SenBryanHughes), Twitter, https://twitter.com/SenBryanHughes/status/1424846466183487492; Governor Abbott Signs Law Protecting Texans from Wrongful Social Media Censorship, supra; (see also Reply, Dkt. 48, at 27) (“H.B. 20’s true interest is in promoting conservative speech on platforms that
legislators perceive as ‘liberal.’”). In the State’s opposition brief, the State describes the social media
platforms as “skewed,” saying social media platforms’ “current censorship practices” lead to “a
skewed exchange of ideas in the Platforms that prevents such a search for the truth.” (Resp. Prelim. Inj. Mot., Dkt. 39).
HB 20’s severability clause does not save HB 20 from facial invalidation. This Court has
found Sections 2 and 7 to be unlawful. Both sections are replete with constitutional defects, including impermissible content- and speaker-based restrictions on protected speech and onerously burdensome disclosure and operational requirements. Like the Florida statute, “[t]here is
nothing that could be severed and survive.” NetChoice, 2021 WL 2690876, at *11.
The remaining factors all weigh in favor of granting Plaintiffs’ request for a preliminary
injunction. “There can be no question that the challenged restrictions, if enforced, will cause
irreparable harm. ‘The loss of First Amendment freedoms, for even minimal periods of time,
unquestionably constitutes irreparable injury.’” Roman Cath. Diocese of Brooklyn v. Cuomo, 141 S. Ct. 63,
67 (2020) (quoting Elrod v. Burns, 427 U.S. 347, 373 (1976) (plurality op.)). Absent an injunction, HB
20 would “radically upset how platforms work and the core value that they provide to users.”
(Prelim. Inj. Mot., Dkt. 12, at 52). HB 20 prohibits virtually all content moderation, the very tool
that social media platforms employ to make their platforms safe, useful, and enjoyable for users.
(See, e.g., CCIA Decl., Dkt. 12-1, at 9, 13) (“[M]any services would be flooded with abusive,
objectionable, and in some cases unlawful material, drowning out the good content and making their
services far less enjoyable, useful, and safe.”) (“Content moderation serves at least three distinct vital
functions. First, it is an important way that online services express themselves and effectuate their
community standards, thereby delivering on commitments that they have made to their
communities. . . . Second, content moderation is often a matter of ensuring online safety. . . . Third,
content moderation facilitates the organization of content, rendering an online service more
useful.”). In addition, social media platforms would lose users and advertisers, resulting in
irreparable injury. (Prelim. Inj. Mot., Dkt. 12, at 53–54); (see, e.g., NetChoice Decl., Dkt. 12-2, at 5–6
(“Not only does the Bill impose immediate financial harm to online businesses, it risks permanent,
irreparable harm should any of those users or advertisers decide never to return to our members’
sites based on their past experience or the detrimental feedback they have heard from others.”)).
The irreparable harm to Plaintiffs’ members outweighs any harm to the State from a
preliminary injunction. NetChoice, 2021 WL 2690876, at *11. Since the State lacks a compelling state
interest for HB 20, the State will not be harmed. See Texans for Free Enter. v. Texas Ethics Comm’n, 732
F.3d 535, 539 (5th Cir. 2013). Finally, courts have found that ‘“injunctions protecting First
Amendment freedoms are always in the public interest.’” Id. at 539 (quoting Christian Legal Soc’y v.
Walker, 453 F.3d 853, 859 (7th Cir. 2006)). In this case, content moderation and curation will benefit
users and the public by reducing harmful content and providing a safe, useful service. (See, e.g., CCIA
Decl., Dkt. 12-1, at 9, 13). Here, an “injunction will serve, not be adverse to, the public interest.”
IV. CONCLUSION
For these reasons, IT IS ORDERED that the State’s motion to dismiss, (Dkt. 23), is
DENIED.
IT IS FURTHER ORDERED that Plaintiffs’ Motion for Preliminary Injunction, (Dkt. 12), is GRANTED. Until the Court enters judgment in this case, the Texas Attorney General is
ENJOINED from enforcing Section 2 and Section 7 of HB 20 against Plaintiffs and their
members. Pursuant to Federal Rule of Civil Procedure 65(c), Plaintiffs are required to post a
$1,000.00 bond.
_____________________________________
ROBERT PITMAN
UNITED STATES DISTRICT JUDGE