
NetChoice LLC v. Reyes

United States District Court, District of Utah
Sep 10, 2024
2:23-cv-00911-RJS-CMR (D. Utah Sep. 10, 2024)

Opinion

2:23-cv-00911-RJS-CMR; 2:24-cv-00031-RJS-CMR


NETCHOICE, LLC, Plaintiff, v. SEAN D. REYES, in his official capacity as Attorney General of Utah; and KATHERINE HASS, in her official capacity as Director of the Division of Consumer Protection of the Utah Department of Commerce, Defendants. HANNAH PAISLEY ZOULEK, a Utah resident; JESSICA CHRISTENSEN, a Utah resident; LU ANN COOPER, a Utah resident; M.C., a Utah resident, by and through her parent, LU ANN COOPER; VAL SNOW, a Utah resident; and UTAH YOUTH ENVIRONMENTAL SOLUTIONS, a Utah association, Plaintiffs, v. KATHERINE HASS, in her official capacity as Director of the Utah Division of Consumer Protection; SEAN REYES, in his official capacity as Utah Attorney General, Defendants.


Cecilia M. Romero Magistrate Judge

MEMORANDUM DECISION AND ORDER

ROBERT J. SHELBY UNITED STATES CHIEF DISTRICT JUDGE

Earlier this year, the State of Utah enacted the Utah Minor Protection in Social Media Act. The Act, which takes effect on October 1, 2024, seeks to protect young Utahans' mental health and personal privacy by requiring social media platforms to verify users' ages and impose special restrictions on minors' accounts.

Now before the court are two cases challenging the Act, alleging it violates the rights to free expression and due process under the First and Fourteenth Amendments of the United States Constitution. The first is brought by Plaintiff NetChoice, LLC. NetChoice is a trade association comprised of internet companies, including household names such as Google, Meta, Snap, and X. The second is brought by Plaintiffs Hannah Paisley Zoulek, Jessica Christensen, Lu Ann Cooper, M.C., Val Snow, and Utah Youth Environmental Solutions. The Zoulek Plaintiffs are minors, adults, and a youth-led organization who use social media platforms to learn, express themselves, and interact with others. Both NetChoice and the Zoulek Plaintiffs seek orders enjoining Defendants Sean D. Reyes and Katherine Hass from enforcing the Act while the court resolves the parties' constitutional challenges.

Dkt. 52 (case no. 2:23-cv-00911), Plaintiff's Motion for Preliminary Injunction (NetChoice Motion); Dkt. 37 (case no. 2:24-cv-00031), Motion for Preliminary Injunction and Memorandum of Law in Support (Zoulek Motion). In subsequent citations to the respective dockets, the court will note “NetChoice” or “Zoulek” in a parenthetical following the docket number. Throughout this Memorandum Decision and Order, the court will refer to Plaintiff NetChoice, LLC as “NetChoice.” The court will refer to Plaintiffs Hannah Paisley Zoulek, Jessica Christensen, Lu Ann Cooper, M.C., Val Snow, and Utah Youth Environmental Solutions, collectively, as “the Zoulek Plaintiffs.” The court will refer to NetChoice and the Zoulek Plaintiffs, collectively, as “Plaintiffs.”

As explained below, the court finds NetChoice is substantially likely to succeed on its claim the Act violates the First Amendment and grants its request for a preliminary injunction. The court recognizes the State's earnest desire to protect young people from the novel challenges associated with social media use. But owing to the First Amendment's paramount place in our democratic system, even well-intentioned legislation that regulates speech based on content must satisfy a tremendously high level of constitutional scrutiny. And on the record before the court, Defendants have yet to show the Act does.

Separately, the court finds the Zoulek Plaintiffs have not sufficiently alleged their standing to challenge the Act's constitutionality and denies their request for a preliminary injunction. The court dismisses the Zoulek Plaintiffs' claims without prejudice and invites them to move for leave to file an amended complaint if they wish to do so.

BACKGROUND

I. The Parties

A. NetChoice

NetChoice is a nonprofit trade association for internet companies. It seeks to “promote online commerce and speech,” “increase consumer access and options via the [i]nternet,” and “minimiz[e] the burdens that would prevent businesses from making the [i]nternet more accessible and useful.” NetChoice members include many prominent internet companies: Dreamwidth; Google, which owns and operates YouTube; Meta, which owns and operates Facebook and Instagram; Nextdoor; Pinterest; Snap Inc., which owns and operates Snapchat; and X.

Dkt. 51 (NetChoice), Plaintiff's First Amended Complaint (NetChoice FAC) ¶ 8.

Id.

Id. ¶ 11. The Act does not regulate all NetChoice members. Id. For clarity, however, this Order refers to the subset of NetChoice members subject to the Act as “members.”

According to NetChoice, each of these companies operates websites and applications that publish, disseminate, create, or distribute protected speech “by displaying text, audio, images, or video to users-including user-generated content.” And through NetChoice member sites, users “gain access to information and communicate with one another about it on any subject that might come to mind.”

NetChoice Motion at 5 (internal quotations and citations omitted).

NetChoice FAC ¶ 19.

B. The Zoulek Plaintiffs

The Zoulek Plaintiffs are Utah residents and a Utah-based association who use social media to “communicate, express themselves, associate with peers, and learn.” They contend the Act would restrict their ability “to communicate and access information.”

Dkt. 36 (Zoulek), First Amended Complaint for Declaratory and Injunctive Relief (Zoulek FAC) ¶ 5.

Id.

Hannah Paisley Zoulek is a soon-to-be college student who uses social media for educational purposes and to connect with communities of users expressing themselves through creative writing. Jessica Christensen is a prominent advocate for former members of polygamous groups who herself escaped a polygamous family at age fifteen. Minors and adults who have left or are seeking help in leaving abusive homes frequently contact her through social media. Lu Ann Cooper is the co-founder and president of Hope After Polygamy, which provides support to individuals, including minors, who are in or have left polygamist communities. Hope After Polygamy maintains several social media accounts to educate the public about its services and communicate with minors seeking help. M.C. is Cooper's daughter and a high school student who uses social media to connect with friends, explore creative interests, and obtain information about a range of topics. Val Snow is a YouTuber who makes videos about mental health, resilience, and LGBTQ-related issues. Both minors and adults watch his content and contact him through the channel for support. And Utah Youth Environmental Solutions (UYES) is a “youth-led grassroots organization that seeks to educate young people in Utah regarding climate change and environmental issues.” It uses social media to advertise opportunities for involvement, promote other resources and information, and communicate with minors interested in the organization.

Id. ¶ 7.

Id. ¶ 8.

Id.

Id. ¶ 9.

Id.

Id. ¶ 10.

Id. ¶ 11.

Id.

Id. ¶ 12.

Id.

C. Defendants

Defendants are Katherine Hass and Sean Reyes, both sued in their official capacities. Hass is Director of the Division of Consumer Protection of the Utah Department of Commerce (the Division). The Act grants enforcement authority to the Division and its Director. Reyes is the Attorney General of Utah. He has authority to “give legal advice to, and act as counsel for, the [D]ivision in the exercise of the [D]ivision's responsibilities.”

NetChoice FAC ¶¶ 14-15; Zoulek FAC ¶¶ 13-14.

NetChoice FAC ¶ 15; Zoulek FAC ¶ 13.

NetChoice FAC ¶ 15 (citing Utah Code § 13-71-301); Zoulek FAC ¶ 13 (citing Utah Code § 13-71-301).

NetChoice FAC ¶ 14; Zoulek FAC ¶ 14.

NetChoice FAC ¶ 14 (quoting Utah Code § 13-71-301(4)(b)); Zoulek FAC ¶ 14.

II. The Act

In March 2024, the State enacted the Utah Minor Protection in Social Media Act. Scheduled to take effect on October 1, 2024, the Act partially replaces Utah's Social Media Regulation Act of 2023, which the State repealed after NetChoice and the Zoulek Plaintiffs filed separate cases challenging its constitutionality. The Act purports to advance Utah's “compelling interest in safeguarding the well-being and privacy of [Utah] minors” who use social media services.

NetChoice FAC ¶¶ 38-39; Zoulek FAC ¶ 43.

NetChoice FAC ¶¶ 38-39; Zoulek FAC ¶ 42.

Utah Code § 13-71-102(1). The Act contains various legislative findings related to this “compelling interest.” Those findings state:

- the proliferation of social media services has led to the widespread collection and utilization of personal information, exposing minors to potential privacy and identity related harms;
- the addictive design features of certain social media services contribute to excessive use of a social media service by minors, impacting sleep patterns, academic performance, and overall health;
- social media services are designed without sufficient tools to allow adequate parental oversight, exposing minors to risks that could be mitigated with proper parental involvement and control;
- the state has enacted safeguards around products and activities that pose risks to minors, including regulations on motor vehicles, medications, and products and services targeted to children;
- prolonged and unregulated social media use has been linked to adverse effects on the mental health of minors, including increased rates of anxiety, depression, and social isolation;
- existing measures employed by social media companies to protect minors have proven insufficient; and
- the state should ensure that minors' personal data is given special protection, as minors may have less awareness of the risks, consequences, and safeguards related to a social media company's processing of minors' personal data.

A. Central Coverage Definition

The Act regulates “[s]ocial media compan[ies],” defined as “entit[ies] that own[] or operate[] a social media service.” A “social media service” is, in turn, defined as “a public website or application” that:

Id. § 13-71-101(13).

(i) displays content that is primarily generated by account holders and not by the social media company;
(ii) permits an individual to register as an account holder and create a profile that is made visible to the general public or a set of other users defined by the account holder;
(iii) connects account holders to allow users to interact socially with each other within the website or application;
(iv) makes available to each account holder a list or lists of other account holders with whom the account holder shares a connection within the system; and
(v) allows account holders to post content viewable by other users.

Id. § 13-71-101(14)(a).

The Act expressly excludes from the “social media service” definition “email[,]” “cloud storage[,]” and “document viewing, sharing, or collaboration services.”

Id. § 13-71-101(14)(b).

B. The Act's Requirements

The Act's requirements are readily divided into two parts. First, the Act requires social media companies to “implement an age assurance system to determine whether a current or prospective Utah account holder . . . is a minor.” The system must be “reasonably calculated to enable a social media company to identify whether a current or prospective Utah account holder is a minor with an accuracy rate of at least 95%.” And in conjunction with this requirement, social media companies must “implement a review process allowing account holders to appeal the account holder's age designation by submitting documentary evidence to establish the account holder's age range.” The company must “review evidence submitted by the account holder and make a determination within 30 days of submission of the evidence.”

Id. § 13-71-201(1). The Act defines a minor as “an individual under 18 years old” that “has not been emancipated” or “married.” Id. § 13-71-101(8).

Id. § 13-71-101(2).

Id. § 13-71-201(3)(a).

Id. § 13-71-201(3)(b).

Second, the Act subjects social media companies to special rules with respect to Utah minors' accounts. Relevant to this case, the Act requires social media companies to “set default privacy settings to prioritize maximum privacy, including settings” that:

(a) restrict the visibility of a Utah minor account holder's account to only connected accounts;
(b) limit the Utah minor account holder's ability to share content to only connected accounts;
(c) restrict any data collection and sale of data from a Utah minor account holder's account that is not required for core functioning of the social media service;
(d) disable search engine indexing of Utah minor account holder profiles;
(e) restrict a Utah minor account holder's direct messaging capabilities to only allow direct messaging to connected accounts; and
(f) allow a Utah minor account holder to download a file with all information associated with the Utah minor account holder's account[.]

The Act defines a connected account as “an account on the social media service that is directly connected to: (a) the minor account holder's account; or (b) an account that is directly connected to an account directly connected to the minor account holder's account.” Id. § 13-71-101(3). Directly connected “means an account on the social media service that is connected to another account by (a) sending a request to connect to another account holder and having the request to connect accepted by the other account holder; or (b) receiving a request to connect from another account holder and accepting the request to connect.” Id. § 13-71-101(5).

The Act does not define what falls within the scope of “required for core functioning of the social media service.”

Id. § 13-71-202(1).

These default privacy settings may not be changed without a social media company “first obtaining verifiable parental consent.”

Id. § 13-71-204(1). Verifiable parental consent “means authorization from a parent for a social media service to collect, use, and disclose personal information of a Utah minor account holder, that complies with” certain “verifiability requirements.” Id. § 13-71-101(18). To comply, the social media service must “provide advance notice to the parent describing information practices related to the minor account holder's personal information” and “receive confirmation that the parent received the notice ....” Id. § 13-71-101(18)(a), (b).

The Act also requires social media companies to “disable” certain “features that prolong user engagement” on Utah minors' accounts. These features include “autoplay functions that continuously play content without user interaction[,]” “scroll or pagination that loads additional content as long as the user continues scrolling[,]” and “push notifications prompting repeated user engagement.”

Id. § 13-71-202(5).

The court will follow the parties in referring to this feature as seamless pagination.

Id. § 13-71-202(5)(a)-(c). Apart from “push notifications,” the Act does not further define these features. A push notification is “an automatic electronic message displayed on an account holder's device, when the user interface for the social media service is not actively open or visible on the device, that prompts the account holder to repeatedly check and engage with the social media service.” Id. § 13-71-101(11).

Finally, the Act states a covered social media company's “terms of service related to a Utah minor account holder shall be presumed to include an assurance of confidentiality for the Utah minor account holder's personal information.” This presumption “may be overcome if the social media company obtains verifiable parental consent.” And the presumption of confidentiality “does not apply to a social media company's internal use or external sharing of a Utah minor account holder's personal information if the use or sharing is necessary” to:

Id. § 13-71-204(2).

Id. § 13-71-204(3).

(a) maintain or analyze functioning of the social media service;
(b) enable network communications;
(c) personalize the user's experience based on the user's age and location;
(d) display a username chosen by the Utah minor account holder;
(e) obtain age assurance information as required under [Utah Code] Section 13-71-201; or
(f) comply with the requirements of this chapter or other federal or state laws.

Id. § 13-71-204(4).

C. Enforcement

The Act grants the Division enforcement authority and authorizes the Attorney General to “give legal advice to, and act as counsel for the [D]ivision in the exercise of [its] [enforcement] responsibilities.” Specifically, the Division Director “may impose an administrative fine of up to $2,500 for each violation” of the Act, and the Division “may bring an action in court to enforce a provision” of the Act. For actions brought in court, the court may:

Id. § 13-71-301(1), 301(2).

Id. § 13-71-301(3)(a)(i).

Id. § 13-71-301(3)(a)(ii). The Act authorizes both the Division and the “attorney general on behalf of the [D]ivision” to seek civil penalties in a civil action. Id. § 13-71-301(4)(b).

(i) declare that the act or practice violates a provision of [the Act];
(ii) enjoin actions that violate [the Act];
(iii) order disgorgement of any money received in violation of [the Act];
(iv) order payment of disgorged money to an injured purchaser or consumer;
(v) impose a civil penalty of up to $2,500 for each violation of [the Act];
(vi) award actual damages to an injured purchaser or consumer; and
(vii) award any other relief that the court deems reasonable and necessary.

Id. § 13-71-301(3)(b).

“If a court grants judgment or injunctive relief to the [D]ivision,” the Act further provides “the court shall award the [D]ivision” its “reasonable attorney fees[,]” “court costs[,]” and “investigative fees.”

Id. § 13-71-301(3)(c). The Act also subjects “[a] person who violates an administrative or court order issued for a violation of [the Act] . . . to a civil penalty of no more than $5,000 for each violation.” Id. § 13-71-301(4)(a).

The Act provides a “safe harbor” for social media companies who implement “age assurance” and “verifiable parental consent” mechanisms that comport with rules promulgated by the Division. The safe harbor provides that a social media company “is not subject to an enforcement action for a violation of Section 13-71-201 [the age assurance requirement] if the social media company implements and maintains an age assurance system that complies with rules made by the [D]ivision.” And the same provision assures that “[a] social media company is considered to have obtained verifiable parental consent if the social media company obtains parental consent through a mechanism that complies with . . . rules made by the [D]ivision.”

Id. § 13-71-302. At oral argument, Defendants indicated the Division will not publish these rules until after the Act takes effect on October 1, 2024.

Id. § 13-71-302(2).

Id. § 13-71-302(3).

III. Procedural History

NetChoice and the Zoulek Plaintiffs initiated their respective cases in December 2023 and January 2024, challenging the constitutionality of the then-existing Utah Social Media Regulation Act of 2023. Shortly thereafter, however, the court stayed both cases pending the completion of the 2024 Utah legislative session. During the session, the Utah Legislature repealed the 2023 law and partially replaced it with the Act. The parties in both cases agreed to file amended complaints drawn to the Act.

Dkt. 1 (NetChoice), NetChoice's Initial Complaint for Declaratory and Injunctive Relief; Dkt. 2 (Zoulek) Zoulek's Initial Complaint for Declaratory and Injunctive Relief.

Dkt. 41 (NetChoice), Docket Text Order (striking the briefing schedule for NetChoice's Motion for Preliminary Injunction and directing the parties to propose an updated schedule following the legislative session); Dkt. 27 (Zoulek), Order Granting Stipulation and Joint Motion to Stay Pending the Completion of the 2024 Utah General Legislative Session.

NetChoice FAC ¶ 38; Zoulek FAC ¶¶ 42-43.

Dkt. 45 (NetChoice), Docket Text Order (establishing new briefing schedule); Dkt. 31 (Zoulek), Docket Text Order (same).

On May 3, 2024, NetChoice filed its First Amended Complaint and the present Motion for Preliminary Injunction. The Zoulek Plaintiffs followed suit on May 31, 2024. Both Plaintiffs broadly challenge the constitutionality of the Act, arguing it violates the First Amendment and the Due Process Clause of the Fourteenth Amendment by impermissibly regulating the protected speech of social media companies and their users. Each Plaintiff also asserts provisions of the Act are preempted by Section 230 of the Communications Decency Act (CDA), 47 U.S.C. § 230, and the Zoulek Plaintiffs assert the Act violates the Commerce Clause. Both Plaintiffs aver these issues satisfy their burden to obtain a preliminary injunction and ask the court to enjoin Defendants from enforcing the Act before it takes effect on October 1, 2024.

See generally NetChoice FAC; NetChoice Motion.

See generally Zoulek FAC; Zoulek Motion.

NetChoice Motion at 1-5; Zoulek Motion at 1-2. See also NetChoice FAC ¶¶ 1-7; Zoulek FAC ¶¶ 1-6.

NetChoice FAC ¶ 5; Zoulek FAC ¶ 6. See also NetChoice Motion at 4.

Zoulek Motion at 2; Zoulek FAC ¶ 6.

NetChoice Motion at 4-5; Zoulek Motion at 2.

Defendants oppose both Motions for Preliminary Injunction on the grounds that Plaintiffs fail to meet their burden “of establishing a clear and unequivocal right” to injunctive relief. To that end, Defendants primarily contend Plaintiffs “cannot . . . demonstrate that [they are] likely to succeed on the merits” because “the Act is a reasonable and constitutional regulation that is appropriately tailored to the State's important and compelling interests.”

Dkt. 58 (NetChoice), Defendants' Memorandum in Opposition to Plaintiff's Motion for Preliminary Injunction (NetChoice Opposition) at 3; Dkt. 52 (Zoulek), Defendants' Memorandum in Opposition to Plaintiffs' Motion for Preliminary Injunction (Zoulek Opposition) at 3.

NetChoice Opposition at 2-3; Zoulek Opposition at 3.

After Plaintiffs filed their Motions for Preliminary Injunctions, Defendants separately moved to dismiss certain claims in each case. In the NetChoice case, Defendants moved to dismiss NetChoice's Section 230 preemption claim, and in the Zoulek case, Defendants moved to dismiss the Zoulek Plaintiffs' Section 230 preemption and Commerce Clause claims. On July 22, 2024, the court issued a Memorandum Decision and Order granting Defendants' Motion to Dismiss NetChoice's Section 230 preemption claim for failure to state a claim under Federal Rule of Civil Procedure 12(b)(6). Shortly thereafter, the Zoulek Plaintiffs voluntarily dismissed their Section 230 claim. On August 5, 2024, the court issued a separate Memorandum Decision and Order granting Defendants' Motion to Dismiss the Zoulek Plaintiffs' Commerce Clause claim for lack of standing.

Dkt. 59 (NetChoice), Defendants' Motion to Dismiss for Failure to State a Claim and Memorandum in Support.

Dkt. 51 (Zoulek), Defendants' Motion to Dismiss for Lack of Subject Matter Jurisdiction (Count 3) and Failure to State a Claim (Counts 3 and 4).

Dkt. 78 (NetChoice), Memorandum Decision and Order Granting Defendants' Motion to Dismiss (Section 230 Decision).

Dkt. 66 (Zoulek), Plaintiffs' Notice of Voluntary Dismissal of Count IV of the First Amended Complaint.

Dkt. 67 (Zoulek), Memorandum Decision and Order Granting Defendants' Motion to Dismiss.

With those claims dismissed, only Plaintiffs' challenges under the First and Fourteenth Amendments remain, and the court considers only those challenges in its analysis of the present Motions. Those Motions are fully briefed and the court heard oral argument on August 14, 2024.

Dkt. 73 (NetChoice), Plaintiff's Reply Brief in Support of Motion for Preliminary Injunction (NetChoice Reply); Dkt. 62 (Zoulek), Plaintiffs' Combined Reply in Support of Motion for Preliminary Injunction and Opposition to Motion to Dismiss. At the direction of the court, the parties submitted supplemental briefing concerning the Supreme Court's recent decision in Moody v. NetChoice, LLC, 144 S.Ct. 2383 (2024). See Dkt. 75 (NetChoice), Plaintiff's Supplemental Brief on the Supreme Court's Moody Decision (NetChoice Supp. Brief); Dkt. 79 (NetChoice), Defendants' Supplemental Brief Regarding the Supreme Court's Decision in Moody v. NetChoice; Dkt. 65 (Zoulek), Plaintiffs' Response to Order for Supplemental Briefing; Dkt. 68 (Zoulek), Defendants' Supplemental Brief Regarding the Supreme Court's Decision in Moody v. NetChoice.

Dkt. 80 (NetChoice), Minute Entry; Dkt. 69 (Zoulek), Minute Entry.

ANALYSIS

The court takes up the parties' Motions separately, beginning with NetChoice's Motion and then proceeding to the Zoulek Plaintiffs' Motion. Although Defendants do not challenge either Plaintiff's standing, “[t]he standing requirement is an ‘irreducible constitutional minimum' that ‘serv[es] to identify those disputes which are appropriately resolved through the judicial process.'” Accordingly, the court begins its review of each Motion by examining Plaintiffs' standing. Concluding NetChoice has standing, the court proceeds to the merits of its Motion and grants its request for preliminary injunctive relief. Concluding the Zoulek Plaintiffs lack standing, the court denies the Zoulek Plaintiffs' Motion and dismisses each of their claims.

Speech First, Inc. v. Shrum, 92 F.4th 947, 949 (10th Cir. 2024) (quoting Lujan v. Defs. of Wildlife, 504 U.S. 555, 560 (1992)) (second alteration in original); see also Collins v. Daniels, 916 F.3d 1302, 1314 (10th Cir. 2019) (quoting Arbaugh v. Y&H Corp., 546 U.S. 500, 514 (2006)).

I. NetChoice's Motion

A. NetChoice Has Standing to Assert Harms to the First Amendment Interests of Its Members.

The court begins its analysis of NetChoice's Motion by reviewing whether NetChoice has standing to raise constitutional challenges against the Act. “The familiar tripartite test for standing requires a plaintiff to show (1) it has ‘suffered an injury in fact'; (2) the injury is ‘fairly traceable to the challenged action of the defendant'; and (3) it is ‘likely, as opposed to merely speculative, that the injury will be redressed by a favorable decision.'” However, a different standing test applies to organizations asserting claims on behalf of their members. Such an organization must show “(1) at least one of its members would have standing to sue in the member's own right; (2) the interest it seeks to protect is germane to its purpose; and (3) neither the claim asserted nor the relief requested requires the member to participate in the lawsuit.”

Speech First, Inc., 92 F.4th at 949 (quoting Lujan, 504 U.S. at 560-61).

Id. (citing Friends of the Earth v. Laidlaw, 528 U.S. 167, 181 (2000)).

Id.

NetChoice, suing on behalf of its members, satisfies these requirements. First, NetChoice has shown its members have individual standing to sue because they are subject to the Act and will face injury in the form of liability if they violate its operative provisions. This injury is directly traceable to Defendants, whom the Act vests with enforcement authority, and would be redressed by an injunction blocking that authority. Second, NetChoice has shown the interests it aims to protect are central to its organizational purpose of promoting “online commerce and speech” while “minimizing . . . burdens that . . . prevent businesses from making the Internet more accessible and useful.” Finally, NetChoice has shown its claims do not “require[] the participation of individual members in the lawsuit.” Its claims can be proven without fact-intensive, individualized inquiry and the prospective relief it seeks “will inure to the benefit of those members of the association actually injured.”

Courts considering recent challenges NetChoice has brought against similar state laws have reached the same conclusion. See NetChoice, LLC v. Fitch, No. 1:24-cv-170-HSO-BWR, 2024 WL 3276409, at *5-7 (S.D.Miss. July 1, 2024) (holding NetChoice “has demonstrated its associational standing to bring claims on behalf of its member and its members' Mississippi users”); NetChoice, LLC v. Yost, No. 2:24-cv-00047, 2024 WL 555904, at *3-6 (S.D. Ohio Feb. 12, 2024) (finding NetChoice had “standing to bring both its claims on behalf of its member organizations and Ohioan minors”); NetChoice, LLC v. Griffin, No. 5:23-cv-05105, 2023 WL 5660155, at *9-12 (W.D. Ark. Aug. 31, 2023) (concluding NetChoice has associational standing to challenge state law on behalf of its members and its members' users).

Virginia v. Am. Booksellers Ass'n, Inc., 484 U.S. 383, 392 (1988) (citing Craig v. Boren, 429 U.S. 190, 194 (1976)) (holding a trade group had standing when a challenged law was “aimed directly at [the group's members], who . . . would have to take significant and costly compliance measures or risk criminal prosecution”).

NetChoice FAC ¶ 8.

United Food & Com. Workers Union Loc. 751 v. Brown Grp., Inc., 517 U.S. 544, 553 (1996) (quoting Hunt v. Wash. State Apple Advert. Comm'n, 432 U.S. 333, 343 (1977)).

Warth v. Seldin, 422 U.S. 490, 515 (1975).

Thus, the court concludes NetChoice has standing to raise constitutional claims on behalf of its members.

NetChoice contends it has standing to sue on behalf of both its members and its members' users. However, as explained below, the court determines NetChoice is entitled to a preliminary injunction based on the interests of its members alone. The court need not consider whether NetChoice has standing to advance claims on behalf of its members' users.

B. NetChoice Is Entitled to a Preliminary Injunction.

Having reviewed NetChoice's standing, the court turns to NetChoice's Motion for Preliminary Injunction. To obtain a preliminary injunction under Rule 65 of the Federal Rules of Civil Procedure, a moving party must establish four elements: “(1) a substantial likelihood of success on the merits; (2) irreparable harm to the movant if the injunction is denied; (3) the threatened injury outweighs the harm . . . the preliminary injunction may cause the opposing party; and (4) the injunction, if issued, will not adversely affect the public interest.” Because preliminary injunctive relief “is an extraordinary remedy,” the moving party's “right to relief must be clear and unequivocal.”

Leachco, Inc. v. Consumer Prod. Safety Comm'n, 103 F.4th 748, 752 (10th Cir. 2024) (quoting Gen. Motors Corp. v. Urb. Gorilla, LLC, 500 F.3d 1222, 1226 (10th Cir. 2007)).

Id. (quoting Schrier v. Univ. of Co., 427 F.3d 1253, 1258 (10th Cir. 2005)).

As described below, NetChoice clears this hurdle. First, NetChoice has shown it is substantially likely to succeed on the merits of its claim the entire Act violates the United States Constitution. Specifically, NetChoice has shown it is substantially likely to succeed on its first cause of action-that the entire Act, through the Act's Central Coverage Definition, facially violates the First Amendment. Second, NetChoice has demonstrated the remaining preliminary injunction factors support its request for relief.

See NetChoice FAC ¶¶ 71-96. Because the court concludes NetChoice has shown it is substantially likely to succeed on this claim, it does not reach NetChoice's remaining First and Fourteenth Amendment claims. See NetChoice FAC ¶¶ 97-151, 160-173, 179-187; NetChoice Motion at 24-34, 37-39. Neither does the court reach NetChoice's Section 230 claims. See NetChoice FAC ¶¶ 152-159, 174-178; NetChoice Motion at 35-37. The court already determined, as a matter of law, that Section 230 does not preempt the Act's prohibitions on the use of autoplay, seamless pagination, and notifications on minors' accounts and dismissed the corresponding claim from NetChoice's FAC. See generally Section 230 Decision.

1. NetChoice is Substantially Likely to Succeed on the Merits of Its Claim the Entire Act, Through the Central Coverage Definition, Violates the First Amendment.

NetChoice argues the entire Act facially violates the First Amendment because the Act's operative provisions each rely on the Central Coverage Definition, and the Central Coverage Definition imposes unjustified, content-based restrictions on social media companies' speech.

NetChoice Motion at 16-24; id. at 25-26 (quoting Murphy v. N.C.A.A., 584 U.S. 453, 481 (2018)).

NetChoice's argument is persuasive. As a preliminary matter, there is no dispute the Act implicates social media companies' First Amendment rights. The speech at issue in this case- the speech social media companies engage in when they make decisions about how to construct and operate their platforms-is protected speech. The Supreme Court has long held that “[a]n entity ‘exercis[ing] editorial discretion in the selection and presentation' of content is ‘engage[d] in speech activity'” protected by the First Amendment. And this July, in Moody v. NetChoice, LLC, the Court affirmed these First Amendment principles “do not go on leave when social media are involved.” Indeed, the Court reasoned that in “making millions of . . . decisions each day” about “what third-party speech to display and how to display it,” social media companies “produce their own distinctive compilations of expression.”

Defendants concede this issue, acknowledging “social media platforms contain speech, such that there are First Amendment . . . concerns.” NetChoice Opposition at 10. Relatedly, Defendants concede each challenged provision of the Act, other than the age assurance requirement, is subject to heightened First Amendment scrutiny. Id.

NetChoice argues the Act implicates the First Amendment rights of both its members and its members' users. However, as suggested at n. 82, the court concludes NetChoice is entitled to a preliminary injunction without deciding whether the Act implicates and violates the First Amendment rights of social media users.

Moody, 144 S.Ct. at 2402 (quoting Arkansas Ed. Television Comm'n v. Forbes, 523 U.S. 666, 674 (1998)) (second and third alterations in original); see also Hurley v. Irish-American Gay, Lesbian, and Bisexual Group of Boston, 515 U.S. 557, 570 (1995) (“Nor, under our precedent, does First Amendment protection require a speaker to generate, as an original matter, each item featured in the communication.”).

Moody, 144 S.Ct. at 2394.

Id. at 2393. Cf. Yost, 2024 WL 555904, at *7 (“[T]he Act does implicate the First Amendment, at least to some degree ....”); Fitch, 2024 WL 3276409, at *10 (“[T]he [c]ourt is not persuaded that H.B. 1126 merely regulates non-expressive conduct.”); Comput. & Commc'ns Indus. Ass'n et al. v. Paxton, 2024 WL 4051786, at *10 (W.D. Tex. Aug. 30, 2024) (“The [c]ourt agrees with Plaintiffs that HB 18's threshold coverage definition is a content-based regulation.”).

Regarding the more pressing question-whether the Act facially violates social media companies' First Amendment rights-the probable answer is “yes.” As explained below, NetChoice has shown it is substantially likely to succeed on its claim the Act has “no constitutionally permissible application” because it imposes content-based restrictions on social media companies' speech, such restrictions require Defendants to show the Act satisfies strict scrutiny, and Defendants have failed to do so.

NetChoice Supp. Brief at 9; see also NetChoice Motion at 15 (quoting United States v. Stevens, 559 U.S. 460, 472-73 (2010)). In Moody, the Supreme Court affirmed that a district court evaluating a facial challenge under the First Amendment must consider whether a “substantial number of [a law's] applications are unconstitutional, judged in relation to [its] plainly legitimate sweep.” Moody, 144 S.Ct. at 2397. The Court also prescribed a step-by-step framework for courts assessing this question to apply. Id. at 2398. First, the Court directed district courts to assess a challenged law's “scope,” considering “[w]hat activities, by what actors,” the law “prohibit[s] or otherwise regulate[s].” Id. Second, the Court directed district courts to “decide which of the law['s] applications violate the First Amendment, and . . . measure them against the rest.” Id. With respect to the scope-framing Central Coverage Definition, however, the Moody questions are easily answered. There is no dispute about who and what the Act regulates because the parties agree the Act's operative provisions only apply to “social media companies” providing “social media services” to minors. See NetChoice Supp. Brief at 2-4. Likewise, there is no dispute “about how the First Amendment applies to different websites or regulatory requirements.” Id. at 5. Although the parties offer opposing arguments about whether the Act satisfies heightened First Amendment scrutiny, NetChoice urges the Act is uniformly unconstitutional and Defendants urge the Act is uniformly constitutional. See id. at 9.

i. NetChoice Has Shown the Act Imposes Content-Based Restrictions on Social Media Companies' Speech.

NetChoice argues the entire Act is facially content based because the Central Coverage Definition draws distinctions between websites that allow users to interact socially and websites that serve another function or purpose, such as those that allow users to shop, read the news, access entertainment, educate themselves, or conduct business. In brief review, the Central Coverage Definition defines a “social media company” as “an entity that owns or operates a social media service,” and defines a “social media service” by reference to five characteristics, including “a public website or application” that “allow[s] users to interact socially with each other ....” The court agrees with NetChoice.

NetChoice Motion at 16-17 (citing Reed v. Town of Gilbert, Ariz., 576 U.S. 155, 163-64 (2015)).

While a law “is facially content based . . . if it ‘applies to particular speech because of the topic discussed or the idea or message expressed[,]'” not all “facial distinctions . . . are obvious,” and a law “cannot escape classification as facially content based simply by swapping an obvious subject-matter distinction for a ‘function or purpose' proxy that achieves the same result.” Additionally, even “facially content neutral” laws must be considered content based if they “cannot be justified without reference to the content of the regulated speech,” or if they “were adopted . . . because of disagreement with the message the speech conveys.”

City of Austin v. Reagan Nat'l Advert. of Austin, LLC, 596 U.S. 61, 69 (2022) (quoting Reed, 576 U.S. at 163).

City of Austin, 596 U.S. at 74.

Reed, 576 U.S. at 164 (quoting Ward v. Rock Against Racism, 491 U.S. 781, 791 (1989)) (internal quotations and alterations omitted).

For example, in Reed v. Town of Gilbert, Arizona, the Supreme Court held a municipal policy that divided signs into categories based on their “communicative content,” such as “political signs” and “temporary directional signs,” and regulated each category differently, was facially content based. The Court reasoned that “a speech regulation targeted at a specific subject matter is content based even if it does not discriminate among viewpoints within that subject matter.” But in City of Austin v. Reagan National Advertising, the Court held a municipal policy that distinguished between on-premises and off-premises signs (that is, signs advertising products or services located on the site where the sign was installed and signs advertising products or services located somewhere else) was not facially content based. Distinguishing Reed, the Court explained that although the on-premises/off-premises distinction “required a reader to inquire ‘who is the speaker and what is the speaker saying,'” it required that inquiry “only in service of drawing neutral, location-based lines.” This examination was “agnostic as to content,” failing to “single out any topic or subject matter for differential treatment.”

Id.

Id. at 169.

City of Austin, 596 U.S. at 66, 68.

Id. at 68-69.

Id. at 69.

The Central Coverage Definition, like the sign ordinance at issue in Reed, appears to draw facially content-based distinctions between subjects of speech. Just as the Reed ordinance divided the universe of signs into political signs, defined as signs “designed to influence the outcome of an election,” and temporary directional signs, defined as “signs directing the public to a church,” the Act's Central Coverage Definition divides the universe of internet platforms into social media services, defined as websites or applications that “allow users to interact socially with each other,” and other internet platforms, such as platforms for “news, sports, commerce, [and] online video games.”

See Reed, 576 U.S. at 155.

Fitch, 2024 WL 3276409, at *9. Although the Central Coverage Definition doesn't expressly divide the world of internet platforms into social media services, news services, entertainment services, etc., it implicitly does the same by distinguishing between social media services and all other types of internet services.

Defendants respond that the Definition contemplates a social media service's “structure, not subject matter.” However, Defendants' argument emphasizes the elements of the Central Coverage Definition that relate to “registering accounts, connecting accounts, [and] displaying user-generated content” while ignoring the “interact socially” requirement. And unlike the premises-based distinction at issue in City of Austin, the social interaction-based distinction does not appear designed to inform the application of otherwise content-neutral restrictions. It is a distinction that singles out social media companies based on the “social” subject matter “of the material [they] disseminate[].” Or as Defendants put it, companies offering services “where interactive, immersive, social interaction is the whole point.”

NetChoice Opposition at 22.

Id.

Fitch, 2024 WL 3276409, at *9; see also Paxton, 2024 WL 4051786, at *11 (“If there is a difference between the regulated DSP and unregulated DSP, it is the content of the speech on the site, not the medium through which that speech is presented.”).

NetChoice Opposition at 25.

Defendants also respond that the Central Coverage Definition is content neutral because it does not prevent “minor account holders and other users they connect with [from] discuss[ing] any topic they wish.” But in this respect, Defendants appear to misunderstand the essential nature of NetChoice's position. The foundation of NetChoice's First Amendment challenge is not that the Central Coverage Definition restricts minor social media users' ability to, for example, share political opinions. Rather, the focus of NetChoice's challenge is that the Central Coverage Definition restricts social media companies' abilities to collage user-generated speech into their “own distinctive compilation[s] of expression.”

Id.

Moody, 144 S.Ct. at 2393.

Moreover, because NetChoice has shown the Central Coverage Definition facially distinguishes between “social” speech and other forms of speech, it is substantially likely the Definition is content based and the court need not consider whether NetChoice has “point[ed] to any message with which the State has expressed disagreement through enactment of the Act.” Likewise, the court need not consider NetChoice's additional arguments that the Definition is speaker based or viewpoint based.

NetChoice Opposition at 21. See Reed, 576 U.S. at 165-166 (quoting Cincinnati v. Discovery Network, Inc., 507 U.S. 410, 429 (1993) (“A law that is content based on its face is subject to strict scrutiny regardless of the government's benign motive, content-neutral justification, or lack of ‘animus toward the ideas contained' in the regulated speech.”)). Defendants also cite Turner Broadcasting System v. F.C.C. in support of their position that the Central Coverage Definition is content-neutral. See NetChoice Opposition at 22. In Turner, the Supreme Court employed a two-part analysis to hold rules requiring cable television systems to carry local broadcast television stations were not content based. Turner Broad. Sys., Inc. v. F.C.C., 512 U.S. 622, 645 (1994) (“Our cases have recognized that even a regulation neutral on its face may be content based if its manifest purpose is to regulate speech because of the message it conveys.”). First, the Court held the rules were not facially content based because they were “based only upon the manner in which speakers transmit their messages to viewers, and not upon the messages they carry.” Id. at 637-45. Second, the Court held that content-based purposes did not underlie the facially content neutral rules. Id. at 645-49. Defendants' argument relies on the second piece of this analysis. See NetChoice Opposition at 22. But as outlined above, the court need not conduct this analysis because the Central Coverage Definition is facially content based.

See NetChoice Motion at 17-18.

ii. Defendants Have Not Shown the Act Satisfies Strict Scrutiny.

Accepting that the entire Act, through the Central Coverage Definition, is facially content based, strict scrutiny applies. The Act is “presumptively unconstitutional and may be justified only if the government proves that [it is] narrowly tailored to serve compelling state interests.” Defendants have not met their burden to satisfy this “demanding standard.”

Id. at 163.

Brown v. Ent. Merchants Ass'n, 564 U.S. 786, 799 (2011) (quoting U.S. v. Playboy Ent. Group, 529 U.S. 803, 817 (2000)) (“It is rare that a regulation restricting speech because of its content will ever be permissible.”). Recall that it is NetChoice's burden to show it is likely to succeed on the merits of its First Amendment challenge. See Leachco, Inc., 103 F.4th at 752; Awad v. Ziriax, 670 F.3d 1111, 1129 (10th Cir. 2012). That being so, the “burdens at the preliminary injunction stage track the burdens at trial,” and Defendants bear “the burden of proof on the ultimate question of the challenged Act's constitutionality.” Awad, 670 F.3d at 1129 (quoting Gonzales v. O Centro Espirita Beneficente Uniao de Vegetal, 546 U.S. 418, 429 (2006)); see also Playboy Ent. Group, 529 U.S. at 818 (“When First Amendment compliance is the point to be proved, the risk of nonpersuasion-operative in all trials-must rest with the Government, not with the citizen.”).

a. Defendants Have Not Shown the Act Serves a Compelling State Interest.

Although the Act's statutory language asserts “the state [of Utah] has a compelling interest in safeguarding the well-being and privacy of minors in the state[,]” Defendants have not met their burden to articulate a compelling government interest warranting the Act's intrusion on social media companies' First Amendment rights.

To satisfy this exacting standard, Defendants must “specifically identify an ‘actual problem' in need of solving.” In Brown v. Entertainment Merchants Association, for example, the Supreme Court held California failed to demonstrate a compelling government interest in protecting minors from violent video games because it lacked evidence showing a causal “connection between exposure to violent video games and harmful effects on children.” Reviewing psychological studies California cited in defense of its position, the Court reasoned research “show[ed] at best some correlation between exposure to violent entertainment” and “real-world effects.” This “ambiguous proof” did not establish violent videogames were such a problem that it was appropriate for California to infringe on its citizens' First Amendment rights. Likewise, the Court rejected the notion that California had a compelling interest in “aiding parental authority.” The Court reasoned the state's assertion ran contrary to the “rule that ‘only in relatively narrow and well-defined circumstances may government bar public dissemination of protected materials to [minors].'”

Brown, 564 U.S. at 799 (quoting Playboy Ent. Grp., 529 U.S. at 822-23); see also Awad, 670 F.3d at 1130 (“[O]verly general statements of abstract principles do not satisfy the government's burden to articulate a compelling interest.”).

Brown, 564 U.S. at 800.

Id.

Id. at 799.

Id. at 802.

Id. (quoting Erznoznik v. City of Jacksonville, 422 U.S. 205, 212-13 (1975)) (alteration in original).

Viewing Defendants' argument through a wide lens, the court understands Defendants' position to be that the State has compelling interests in protecting minors from the mental health- and personal privacy-related harms associated with excessive social media use. But these interests, like California's interests in protecting minors from the harms associated with violent videogames and aiding parental authority, fall short of the First Amendment's demanding standards.

Defendants' analysis does not clearly identify what compelling government interest the Act seeks to advance, moving straight from the question of whether the Central Coverage Definition is content based to a discussion of whether the Act is appropriately tailored. See NetChoice Opposition at 27-30. Thus, the court infers these interests from the Act's factual findings and Defendants' general opposition to NetChoice's Motion. See Utah Code § 13-71-102; NetChoice Opposition at 4-5, 12-18, 27-30.

First, though the court is sensitive to the mental health challenges many young people face, Defendants have not provided evidence establishing a clear, causal relationship between minors' social media use and negative mental health impacts. It may very well be the case, as Defendants allege, that social media use is associated with serious mental health concerns including depression, anxiety, eating disorders, poor sleep, online harassment, low self-esteem, feelings of exclusion, and attention issues. But the record before the court contains only one report to that effect, and that report-a 2023 United States Surgeon General Advisory titled Social Media and Youth Mental Health-offers a much more nuanced view of the link between social media use and negative mental health impacts than that advanced by Defendants. For example, the Advisory affirms there are “ample indicators that social media can . . . have a profound risk of harm to the mental health and well-being of children and adolescents,” while emphasizing “robust independent safety analyses of the impact of social media on youth have not yet been conducted.” Likewise, the Advisory observes there is “broad agreement among the scientific community that social media has the potential to both benefit and harm children and adolescents,” depending on “their individual strengths and vulnerabilities, and . . . cultural, historical, and socio-economic factors.” The Advisory suggests social media can benefit minors by “providing positive community and connection with others who share identities, abilities, and interest,” “provid[ing] access to important information and creat[ing] a space for self-expression,” “promoting help-seeking behaviors[,] and serving as a gateway to initiating mental health care.”

NetChoice Opposition at 12-15.

See Dkt. 58-4 (NetChoice), Social Media and Youth Mental Health. Nor do Defendants provide any evidence the Utah Legislature considered as it deliberated whether to pass the Act. Although courts usually consider such evidence in deciding whether a law satisfies intermediate scrutiny-that is, whether a law advances a substantial government interest-it is telling that Defendants do not satisfy this reduced burden. See Turner Broad. Sys., Inc. v. F.C.C., 520 U.S. 180, 195, 198-216 (1997) (finding Congress reasonably relied on “substantial evidence” in concluding a “real threat justified enactment” of a federal law subject to intermediate scrutiny).

Id. at 4, 11.

Id. at 5.

Id. at 6. The Advisory suggests these benefits are especially prominent among minors “who are often marginalized, including racial, ethnic, and sexual and gender minorities.” Id. For example, the Advisory cites evidence that “[s]even out of ten adolescent girls of color report encountering positive or identity-affirming content related to race across social media platforms.” Id.

The record also contains a Declaration by Dr. Jean Twenge, a psychology professor at San Diego State University, describing various reports linking social media use to negative mental health impacts. But these reports are not themselves a part of the record, and the court is unable to assess their results or methodologies. Moreover, a review of Dr. Twenge's Declaration suggests the majority of the reports she cites show only a correlative relationship between social media use and negative mental health impacts. Insofar as those reports support a causal relationship, Dr. Twenge's Declaration suggests the nature of that relationship is limited to certain populations, such as teen girls, or certain mental health concerns, such as body image.

See Dkt. 58-2 (NetChoice), Declaration of Dr. Jean Twenge ¶¶ 30-53 (Twenge Declaration).

See, e.g., id. ¶ 31 (“Correlational studies consistently show a link between heavy social media use and mood disorders.”); id. ¶ 33(e) (“These studies are consistent with a 2022 meta-analysis finding a strong connection between social media use and depression/anxiety.”); id. ¶ 42 (“A longitudinal prospective study of adolescents without ADHD symptoms found that . . . high-frequency use of digital media . . . was associated with a modest yet statistically significant increased odds of developing ADHD symptoms.”).

See, e.g., id. ¶ 47 (“Even the social media companies themselves have internal research establishing the causal link between their product and poor mental health outcomes for youth, with Facebook finding that ‘we make body issues worse for 1 in 3 teen girls.'”); id. ¶ 52 (“A random assignment experiment found that reducing social media use to one hour a day improved youths' confidence in their appearance.”).

Second, Defendants' position that the Act serves to protect uninformed minors from the “risks involved in providing personal information to social media companies and other users” ignores the basic First Amendment principle that “minors are entitled to a significant measure of First Amendment protection.” The personal information a minor might choose to share on a social media service-the content they generate-is fundamentally their speech. And Defendants may not justify an intrusion on the First Amendment rights of NetChoice's members with what amounts to an intrusion on the constitutional rights of its members' users.

NetChoice Opposition at 17.

Erznoznik, 422 U.S. at 212-13.

Yost, 2024 WL 555904, at *12 (“[L]ike content-based regulations, laws that require parental consent for children to access constitutionally protected, non-obscene content, are subject to strict scrutiny.”); see also Brown, 564 U.S. at 794-95 (citing Erznoznik, 422 U.S. at 213-14).

Third, with respect to both the State's mental health and personal privacy concerns, Defendants generally argue parents are caught “in a losing battle against social media companies for the attention and well-being of their own children.” However, Defendants' evidence is far from clear that “the Act's restrictions meet a substantial need of parents who wish to restrict their children's access to” social media services and “cannot do so” otherwise. To the contrary, Defendants' evidence generally indicates “[o]ther methods exist to advance the goal of protecting children on the internet, including parental controls and web filtering technology.”

NetChoice Opposition at 35.

Brown, 564 U.S. at 803.

Dkt. 58-3 (NetChoice), Declaration of Tony Allen (Allen Declaration) ¶ 37. The Declaration of Carl Szabo in Support of Plaintiff's Motion for Preliminary Injunction provides a useful explanation of the utility of these alternative methods. See Dkt. 52-1 (NetChoice), Declaration of Carl Szabo in Support of Plaintiff's Motion for Preliminary Injunction (Szabo Declaration) ¶ 8-9. Szabo describes the lengths NetChoice's members go to protect children on their social media platforms, as well as the network-, device-, browser-, and app-level restrictions parents may implement to control their children's access to various social media services. Id.

b. Defendants Have Not Shown the Act Is Narrowly Tailored.

Even assuming Defendants have established the State's mental health and personal privacy concerns are “‘actual problem[s]' in need of solving,” the Act fails strict scrutiny because Defendants have not shown it is “carefully tailored to achieve those ends.” In the strict scrutiny context, narrow tailoring requires a law to be the “least restrictive means” of satisfying a government interest. That is, “the curtailment of free speech must be actually necessary ....” Additionally, the government may not pursue its interests by means that are either “seriously underinclusive” or “seriously overinclusive.”

Brown, 564 U.S. at 799.

Sable Commc'ns of Cal., Inc. v. F.C.C., 492 U.S. 115, 126 (1989).

Ams. for Prosperity Found. v. Bonta, 594 U.S. 595, 607 (2021) (quoting McCullen v. Coakley, 573 U.S. 464, 478 (2014)).

Brown, 564 U.S. at 799; R.A.V. v. City of St. Paul, 505 U.S. 377, 395 (1992) (“The dispositive question in this case . . . is whether content discrimination is reasonably necessary to achieve [a defendant's] compelling interests.”).

Brown, 564 U.S. at 805.

To begin, Defendants have not shown the Act is the least restrictive option for the State to accomplish its goals because they have not shown existing parental controls are an inadequate alternative to the Act. While Defendants present evidence suggesting parental controls are not in widespread use, their evidence does not establish those tools are deficient. It demonstrates only that parents are unaware of parental controls, do not know how to use them, or simply do not care to use them. Moreover, Defendants do not indicate the State has tried, or even considered, promoting “the diverse supervisory technologies that are widely available” as an alternative to the Act. The court is not unaware of young people's technological prowess and potential to circumvent parental controls. But parents “control[] whether their minor children have access to Internet-connected devices in the first place,” and Defendants have not shown minors are so capable of evading parental controls that those controls are an insufficient alternative to the State infringing on protected speech.

See Ashcroft v. A.C.L.U., 542 U.S. 656, 666 (2004) (upholding a district court's decision to issue a preliminary injunction enjoining enforcement of a federal law restricting minors' access to certain internet content when the government failed its burden to show the plaintiffs' proposed alternative-blocking and filtering software-was a less effective solution).

See, e.g., NetChoice Opposition at 17 (citing evidence that “just 50 percent of parents use any kind of parental controls,” and “just 16% of parents use blocking or filtering controls to restrict their teens use of [their] cell phone”).

Allen Declaration ¶¶ 38-41.

NetChoice Motion at 21. In any case, the State should be mindful of parents who choose not to use parental controls because they are not concerned about their children's social media use. “It is cardinal with us that the custody, care, and nurture of the child reside first in the parents, whose primary function and freedom include preparation for obligations the state can neither supply nor hinder.” Ginsberg v. State of N.Y., 390 U.S. 629, 639 (1968) (quoting Prince v. Massachusetts, 321 U.S. 158, 166 (1944)).

Allen Declaration ¶¶ 39-41.

Szabo Declaration ¶ 9.

If minors are capable of circumventing parental controls at all levels, there is reason to believe children could also circumvent the controls the Act requires social media companies to impose. See Brown, 564 U.S. at 802.

Defendants also suggest the Act is essential to solving social media-related problems because social media platforms contain “nicotine-like additives”-namely, seamless pagination, autoplay, and push notification systems-designed to foster “over-indulgence” and “user addiction.” But Defendants do not offer any evidence that requiring social media companies to compel minors to push “play,” hit “next,” and log in for updates will meaningfully reduce the amount of time they spend on social media platforms. Nor do Defendants offer any evidence that these specific measures will alter the status quo to such an extent that mental health outcomes will improve and personal privacy risks will decrease.

NetChoice Opposition at 4, 16, 30.

Next, Defendants have not shown the Act is not seriously “underinclusive when judged against its asserted justification[s].” Brown is illustrative on this point: there, the Supreme Court held California's restrictions on minors' access to violent videogames were underinclusive insofar as they did not restrict minors' access to other media, including “Saturday morning cartoons” or videogames “rated for young children.” The Court reasoned California's failure to regulate cartoons like Bugs Bunny and non-violent videogames like Sonic the Hedgehog was problematic because research showed they produced the same effect in children as violent videogames. This result “raise[d] serious doubts about whether the government [was] in fact pursuing the interest it invoke[d], rather than disfavoring a particular speaker or viewpoint.”

Brown, 564 U.S. at 802.

Id. at 800-02.

Id.

Id. at 802.

As in Brown, the Act appears underinclusive when judged against the State's interests in protecting minors from the harms associated with social media use because the Act ultimately preserves minors' ability to spend as much time as they want on social media platforms. This outcome does not comport with a core underpinning of Defendants' argument-that excessive social media use harms minors. Similarly, the Act preserves minors' access to the addictive features Defendants express particular concern with on all internet platforms other than social media services. As NetChoice explains, “a teenager can receive notifications about their favorite sports team from ESPN but not from X-even if the notification is word-for-word the same.” They can “seamlessly scroll through image searches on Bing or through college rankings on U.S. News and World Report but cannot use such seamless pagination for searching recipes on Pinterest.” And they “can autoplay videos on Disney+ and Hulu,” but not YouTube.

See NetChoice Opposition at 4, 13, 23, 58, 60.

“Essentially all applications, including services like Apple News, Disney+, Duolingo, ESPN, and The Wall Street Journal” send users push notifications; “services like Disney+, Hulu, Spotify, and Buzzfeed use autoplay to present content[;]” and “services like U.S. News and World Report College Rankings, The New York Times, Bing, and Apple News use seamless pagination to present content[.]” Szabo Declaration ¶¶ 17-19.

NetChoice FAC ¶ 94.

Id.

Id. ¶¶ 78, 94.

Defendants generally respond to these underinclusivity concerns by suggesting a social media-specific problem arises when social media companies use “addictive design features” in combination with “user-generated [content] and user-to-user interface.” But Defendants simply do not offer any evidence to support this distinction, and they only compare social media services to “entertainment services.” They do not account for the wider universe of platforms that utilize the features they take issue with, such as news sites and search engines. Accordingly, the Act's regulatory scope “raises serious doubts” about whether the Act actually advances the State's purported interests.

NetChoice Opposition at 28.

In support of their position, Defendants cite a paragraph from Dr. Twenge's Declaration stating “[s]ocial media, more than TV or gaming, is linked to shorter sleep, more mid-sleep awakenings, and longer time to fall asleep.” Twenge Declaration ¶ 39. See also Brown, 564 U.S. at 798 (rejecting the notion that videogames present special First Amendment problems because they are “interactive”).

NetChoice Opposition at 28.

Brown, 564 U.S. at 802; see also Williams-Yulee v. Florida Bar, 575 U.S. 433, 448-49 (2015) (explaining underinclusivity “raises a red flag” when it suggests the government is disfavoring a particular speaker or viewpoint, or it “reveal[s] that a law does not actually advance a compelling interest”).

Finally, Defendants have not shown the Act is not seriously overinclusive, restricting more constitutionally protected speech than necessary to achieve the State's goals. Specifically, Defendants have not identified why the Act's scope is not constrained to social media platforms with significant populations of minor users, or social media platforms that use the addictive features fundamental to Defendants' well-being and privacy concerns. NetChoice member Dreamwidth, “an open source social networking, content management, and personal publishing website,” provides a useful illustration of this disconnect. Although Dreamwidth fits the Central Coverage Definition's concept of a “social media service,” Dreamwidth is distinguishable in form and purpose from the likes of traditional social media platforms-say, Facebook and X. Additionally, Dreamwidth does not actively promote its service to minors and does not use features such as seamless pagination and push notifications.

NetChoice Motion at 24; Szabo Declaration ¶¶ 10-12.

See Dkt. 52-5 (NetChoice), Declaration of Denise Paolucci in Support of Plaintiff's Motion for Preliminary Injunction ¶ 1.

Id. ¶ 7.

Id. ¶¶ 3, 5, 11, 13-14.

In combination, these shortcomings demonstrate Defendants have not met their burden to show the Act, through the Central Coverage Definition, is narrowly tailored to advance a compelling government interest. As a result, the court concludes Defendants have not met their “burden of proof on the ultimate question of the . . . Act's constitutionality,” and NetChoice is substantially likely to prevail on the merits of its claim the entire Act, through the Central Coverage Definition, facially violates the First Amendment.

The court's focus with respect to under- and over-inclusivity is how the Act, through the Central Coverage Definition, restricts social media companies' speech. However, the Act also appears overinclusive insofar as it affects users' speech. For example, the Act's age assurance provision, which applies to all users, broadly burdens adult users' ability to “access a broad range of protected speech on a broad range of covered websites.” Fitch, 2024 WL 3276409, at *12. As another example, the Act's restrictions on minors' ability to connect with those outside their immediate networks broadly burdens minors' ability to share and receive protected speech. See Erznoznik, 422 U.S. at 212-14 (explaining “minors are entitled to a significant measure of First Amendment protection” and “speech that is neither obscene as to youths nor subject to some other legitimate proscription cannot be suppressed solely to protect the youth from ideas or images a legislative body thinks unsuitable for them”).

Gonzales, 546 U.S. at 429. Notably, this conclusion generally comports with the recent decisions of other district courts applying strict scrutiny to similar laws. See, e.g., Fitch, 2024 WL 3276409, at *14 (“In summary, NetChoice has demonstrated a substantial likelihood of success on its claim that H.B. 1126 is either overinclusive or underinclusive, or both, for achieving the asserted governmental interest-protecting minors from predatory behavior online ....”); Yost, 2024 WL 555904, at *13 (“In other words, the Act is either underinclusive or overinclusive, or both, for all the purported government interests at stake.”).

Defendants do not specifically dispute NetChoice's argument that if the Central Coverage Definition is unconstitutional, the entire Act, through the Central Coverage Definition, is unconstitutional.

2. The Remaining Preliminary Injunction Factors Support NetChoice's Request for Injunctive Relief.

Having determined NetChoice is substantially likely to succeed on the merits of its First Amendment challenge, the court next considers the remaining preliminary injunction factors. “When a movant establishes the first prong of a preliminary injunction based on a First Amendment claim, the remaining prongs generally also weigh in [its] favor.” Such is the case here.

Pryor v. Sch. Dist. No. 1, 99 F.4th 1243, 1254 (10th Cir. 2024) (citing Hobby Lobby Stores, Inc. v. Sebelius, 723 F.3d 1114, 1145 (10th Cir. 2013)).

i. NetChoice Has Shown Its Members Will Suffer Irreparable Injury Absent a Preliminary Injunction.

Under the second preliminary injunction factor, NetChoice must demonstrate it will suffer irreparable injury in the absence of an injunction. Within the First Amendment context, however, “[t]he Supreme Court has made clear that ‘the loss of First Amendment freedoms, for even minimal periods of time, unquestionably constitutes irreparable injury.'” This principle comports with the wider view that “the infringement of a constitutional right . . . require[s] no further showing of irreparable injury.” Separately, courts hold that a plaintiff suffers irreparable injury when they face “monetary damages that cannot later be recovered for reasons such as sovereign immunity.”

Heideman v. S. Salt Lake City, 348 F.3d 1182, 1190 (10th Cir. 2003) (quoting Elrod v. Burns, 427 U.S. 347, 373 (1976)).

Free the Nipple, 916 F.3d at 805 (citing Awad, 670 F.3d at 1131).

Chamber of Com. of U.S. v. Edmondson, 594 F.3d 742, 770-71 (10th Cir. 2010) (citing Kan. Health Care Ass'n, Inc. v. Kan. Dep't of Soc. & Rehab. Servs., 31 F.3d 1536, 1543 (10th Cir. 1994)) (“Imposition of monetary damages that cannot later be recovered for reasons such as sovereign immunity constitutes irreparable injury.”).

NetChoice has shown its members face irreparable injury absent a preliminary injunction for both these reasons. First, as explained above, NetChoice has shown it is substantially likely to succeed on the merits of its claim the Act, through the Central Coverage Definition, violates its members' First Amendment rights. Because even brief First Amendment violations “unquestionably constitute[] irreparable injury,” NetChoice has shown it will suffer irreparable harm absent a preliminary injunction. Second, NetChoice members stand to incur substantial unrecoverable expenses in the form of either civil penalties or compliance costs absent a preliminary injunction because Defendants, sued in their official capacities as government employees, are immune from suit for monetary damages. This harm is particularly concerning given the high cost of violating the Act-$2,500 per offense-and the State's failure to promulgate administrative rules enabling social media companies to avail themselves of the Act's safe harbor provision before it takes effect on October 1, 2024.

Id.

NetChoice Motion at 39-40.

See Utah Code § 13-71-302. Defendants' Opposition asserts that state law does not allow the Division to promulgate these rules until the Act's effective date-October 1, 2024. See Dkt. 58-7 (NetChoice), Declaration of Katherine Hass ¶ 5 (citing Utah Code § 13-2-1(2)). But in discussing this issue at oral argument, Defendants did not recognize the harm this delay poses to social media companies who wish to comply with the Act on day one. Neither did the Defendants suggest the State would make any effort to mitigate this harm, such as delaying enforcement of the law or providing draft rules to NetChoice members or the public.

ii. The Balance of Equities and the Public Interest Weigh in Favor of a Preliminary Injunction.

When the government is the opposing party to a lawsuit, the third and fourth preliminary injunction factors “merge.” A moving party must show avoiding the harm a threatened injury poses is consistent with the public interest. Still, the Tenth Circuit holds “it is always in the public interest to prevent the violation of a party's constitutional rights.”

Aposhian v. Barr, 958 F.3d 969, 978 (10th Cir. 2020) (citing Nken v. Holder, 556 U.S. 418, 435 (2009)), abrogated on other grounds by Garland v. Cargill, 602 U.S. 406 (2024).

Awad, 670 F.3d at 1132 (quoting G&V Lounge, Inc. v. Mich. Liquor Control Comm'n, 23 F.3d 1071, 1079 (6th Cir. 1994)).

Because NetChoice has shown it is substantially likely the Act violates social media companies' First Amendment rights, it follows that the balance of equities and the public interest lean in NetChoice's favor. Defendants counter that the public interest favors “protecting children and adolescents from the harmful effects of social media” and preserving the State's ability to “enact and enforce” state laws. But as discussed above, Defendants have not shown the State's desire to protect minors eclipses the First Amendment. Indeed, the public interest in protecting constitutional rights is “more profound” than the public interest in carrying out “the will of the voters” through the implementation of state laws. Accordingly, the court concludes the final two preliminary injunction factors, in combination with the first two factors, support granting NetChoice's request for a preliminary injunction.

NetChoice Opposition at 58-59.

Awad, 670 F.3d at 1132 (citation omitted).

iii. No Bond is Required.

Rule 65(c) provides a “court may issue a preliminary injunction . . . only if the movant gives security in an amount that the court considers proper to pay the costs and damages sustained by any party found to have been wrongfully enjoined or restrained.” Although the parties do not address this issue, the court must. Trial courts in the Tenth Circuit “have ‘wide discretion under Rule 65(c) in determining whether to require security.'”

Coquina Oil Corp. v. Transwestern Pipeline Co., 825 F.2d 1461, 1462 (10th Cir. 1987) (holding that while a district court judge has discretion to “determine a bond is unnecessary to secure a preliminary injunction . . . when a trial court fails to contemplate the imposition of the bond,” as required by Rule 65(c), “its order granting a preliminary injunction is unsupportable”).

RoDa Drilling Co. v. Siegal, 552 F.3d 1203, 1215 (10th Cir. 2009) (quoting Winnebago Tribe of Neb. v. Stovall, 341 F.3d 1202, 1206 (10th Cir. 2003)).

Because this preliminary injunction “enforces fundamental constitutional rights against the government,” the court determines “[w]aiving the security requirement best accomplishes the purposes of Rule 65(c).” No bond is required.

S. Utah Drag Stars v. City of St. George, 677 F.Supp.3d 1252, 1294 (D. Utah 2023) (quoting United Utah Party v. Cox, 268 F.Supp.3d 1227, 1260 (D. Utah 2017)); Rocky Mountain Gun Owners v. Polis, 685 F.Supp.3d 1033, 1061 (D. Colo. 2023) (“The Court finds a bond unnecessary as this case seeks to enforce a constitutional right against the government.”).

II. The Zoulek Plaintiffs' Motion

The court now turns to the Zoulek Plaintiffs' Motion, which argues, among other things, that the court should enjoin enforcement of the Act because the Zoulek Plaintiffs are likely to prevail on the merits of their First Amendment claims. Although neither party addresses the Zoulek Plaintiffs' standing to bring these claims, the court begins-and ends-its analysis with this threshold inquiry.

Zoulek Motion at 1. Zoulek also argues they are likely to prevail on their claim under the Commerce Clause. Id. at 2. However, the court previously dismissed this claim. See Memorandum Decision and Order Granting Defendants' Motion to Dismiss. Only Zoulek's First Amendment claims remain for consideration. See Zoulek FAC ¶¶ 70-87, 88-90.

Collins, 916 F.3d at 1314 (quoting Arbaugh, 546 U.S. at 514).

A. The Zoulek Plaintiffs Lack Standing to Challenge the Act Under the First Amendment.

Recall that to establish standing, a plaintiff must show they have suffered an injury in fact, that their injury is fairly traceable to the challenged actions of defendants, and that it is likely, as opposed to merely speculative, that their injury will be redressed by a favorable decision of the court. This showing is “substantially more difficult to establish” when “a plaintiff's asserted injury arises from the government's allegedly unlawful regulation . . . of someone else.” Because courts are reluctant to engage in “guesswork as to how independent decisionmakers will exercise their judgment,” a plaintiff must “adduc[e] facts showing that . . . third-party choices have been or will be made in such a manner as to . . . permit redressability of injury.”

Speech First, 92 F.4th at 949 (quoting Lujan, 504 U.S. at 560-61).

State v. United States Env't Prot. Agency, 989 F.3d 874, 889 (10th Cir. 2021) (quoting Lujan, 504 U.S. at 562).

Clapper v. Amnesty Int'l USA, 568 U.S. 398, 413 (2013).

State, 989 F.3d at 889 (quoting Ctr. for Biological Diversity v. United States Dep't of Interior, 563 F.3d 446, 478 (D.C. Cir. 2009)) (emphasis in original).

Applying this standard to the present case, the Zoulek Plaintiffs suffer from a redressability problem. The Act regulates social media companies-not social media users. And any injuries to the Zoulek Plaintiffs' First Amendment rights would arise as the second-order effects of social media companies' responses to the Act. Nonetheless, the Zoulek Plaintiffs have not identified how an injunction will ensure redress of their purported injuries. They have not shown that social media companies-which are free to restrict user access, remove content, and otherwise moderate their platforms as they see fit-will maintain minors' access to their platforms absent an injunction.

Courts have uniformly held that social media companies are private entities, and the public does not have a First Amendment right to use their platforms. See, e.g., O'Handley v. Weber, 62 F.4th 1145, 1155-57 (9th Cir. 2023) (noting that “[a]s a private company, Twitter is not ordinarily subject to the Constitution's constraints” and holding it “did not violate the Constitution” when moderating a user's posts or suspending an account); Prager Univ. v. Google LLC, 951 F.3d 991, 996-99 (9th Cir. 2020) (explaining that YouTube may be a “public square on the Internet” but that does not turn it into a state actor or convert the platform to a public forum subject to the First Amendment); DeLima v. Google, Inc., 561 F.Supp.3d 123, 134-35 (D.N.H. 2021) (dismissing a claim that social media companies' removal of posts and deleting of accounts violated the First Amendment because companies are private entities, not state actors); Fed. Agency of News LLC v. Facebook, Inc., 432 F.Supp.3d 1107, 1121-27 (N.D. Cal. 2020) (holding Facebook's removal of a plaintiff's content and profiles did not violate the First Amendment because company was a private actor and social media platforms are not public forums); Davison v. Facebook, Inc., 370 F.Supp.3d 621, 628-29 (E.D. Va. 2019) (dismissing constitutional claims against Facebook because, “as a private entity, [it has] the right to regulate the content of its platforms as it sees fit”); Nyabwa v. FaceBook, No. 2:17-cv-24, 2018 WL 585467, at *1 (S.D. Tex. Jan. 26, 2018) (explaining that notwithstanding the Supreme Court's recognition “that social media sites like FaceBook and Twitter have become the equivalent of a public forum for sharing ideas and commentary, the Court did not declare a cause of action against a private entity such as FaceBook for a violation of the free speech rights protected by the First Amendment”).

At oral argument, the Zoulek Plaintiffs suggested the existence of NetChoice's lawsuit resolved their redressability problem. They identified references to the parallel lawsuit in their Motion and First Amended Complaint and suggested the court could take judicial notice of its existence as evidence social media companies intended to maintain the status quo unless otherwise required to act. However, the Zoulek Plaintiffs did not cite legal authority in support of this contention, and NetChoice's lawsuit offers no clear indication its members will maintain minors' access to their platforms absent an injunction.

The Zoulek Plaintiffs directed the court to a portion of United States v. Rodriguez-Aguirre, 264 F.3d 1195, 1203 (10th Cir. 2001) describing the standard the court should apply to review a motion to dismiss for lack of subject matter jurisdiction under Federal Rule of Civil Procedure 12(b)(1). However, the case is inapt because it addressed the proper handling of disputed jurisdictional facts. Id. The issue here is that the Zoulek Plaintiffs simply fail to plead essential jurisdictional facts. It is also noteworthy that the Circuit was not dealing with a third-party standing issue and made no comments about the heightened bar applied to third-party standing. See id. at 1204-05.

Perhaps social media companies would maintain minors' access, recognizing the ways minors use their platforms to communicate and learn. Or perhaps social media companies would see value in the State's mental health and data privacy concerns and voluntarily reduce minors' access. The issue is that we do not know. The Zoulek Plaintiffs have not pled facts demonstrating social media companies “will likely react in predictable ways” if the court enjoins the Act.

Murthy v. Missouri, 144 S.Ct. 1972, 1986 (2024) (quoting Dept. of Com. v. New York, 588 U.S. 752, 768 (2019)).

Under these circumstances, the court must conclude the Zoulek Plaintiffs' injuries are not redressable, and the Zoulek Plaintiffs lack standing to challenge the Act under the First Amendment.

B. The Court Dismisses Counts I and II of the Zoulek Plaintiffs' First Amended Complaint Without Prejudice.

The effect of the Zoulek Plaintiffs' lack of standing extends beyond their Motion for Preliminary Injunction. “Because Article III standing is a jurisdictional issue,” the court's conclusion that the Zoulek Plaintiffs lack standing means the court does not have subject matter jurisdiction to hear their claims. And “once a federal court determines that it is without subject matter jurisdiction, the court is powerless to continue.” It “must dismiss the action.” Accordingly, the court dismisses the Zoulek Plaintiffs' First Amendment claims, Counts I and II of their First Amended Complaint, without prejudice.

Felix v. City of Bloomfield, 841 F.3d 848, 854 (10th Cir. 2016) (citing Lujan, 504 U.S. at 559-60).

Cunningham v. BHP Petroleum Great Britain PLC, 427 F.3d 1238, 1245 (10th Cir. 2005) (quoting Univ. of S. Ala. v. Am. Tobacco Co., 168 F.3d 405, 410 (11th Cir. 1999)); see also Brereton v. Bountiful City Corp., 434 F.3d 1213, 1217 (10th Cir. 2006) (citing Gold v. Loc. 7 United Food & Com. Workers, 159 F.3d 1307, 1311 (10th Cir. 1998)) (“[O]nce a court determines it lacks jurisdiction over a claim, it perforce lacks jurisdiction to make any determination of the merits of the underlying claim.”).

Fed.R.Civ.P. 12(h)(3) (“If the court determines at any time that it lacks subject-matter jurisdiction, the court must dismiss the action.”).

CONCLUSION

For the foregoing reasons, NetChoice's Motion is GRANTED. Defendants, their agents, their employees, and all other persons acting under their direction or control are PRELIMINARILY ENJOINED under Federal Rule of Civil Procedure 65 from enforcing any part of the Utah Minor Protection in Social Media Act, Utah Code §§ 13-71-101 to 401, pending final disposition of the issues in this case.

The Zoulek Plaintiffs' Motion is DENIED and Counts I and II of the Zoulek Plaintiffs' First Amended Complaint are dismissed without prejudice. The Zoulek Plaintiffs may seek leave to file an amended complaint within thirty (30) days.

SO ORDERED

