
NetChoice, LLC v. Bonta

United States District Court, Northern District of California
Sep 18, 2023
692 F. Supp. 3d 924 (N.D. Cal. 2023)

Opinion

Case No. 22-cv-08861-BLF


NETCHOICE, LLC, d/b/a NetChoice Plaintiff, v. Rob BONTA, Attorney General of the State of California, in his official capacity, Defendant.

Ambika Kumar, Pro Hac Vice, Davis Wright Tremaine LLP, Seattle, WA, David Morris Gossett, Pro Hac Vice, Meenakshi Krishnan, Pro Hac Vice, Davis Wright Tremaine LLP, Washington, DC, Robert Corn-Revere, Pro Hac Vice, Foundation for Individual Rights and Expression, Washington, DC, Adam Stanley Sieff, Davis Wright Tremaine LLP, Los Angeles, CA, for Plaintiff. Elizabeth K. Watson, California Attorney General's Office California Department of Justice, San Francisco, CA, Nicole Juliet Kau, Office of the California Attorney General Department of Justice, Los Angeles, CA, for Defendant. Megan Leef Brown, Pro Hac Vice, Boyd Garriott, Pro Hac Vice, Wiley Rein LLP, Washington, DC, Kathleen Scott, Pro Hac Vice, Washington, DC, Robert Edward Dunn, Eimer Stahl LLP, San Jose, CA, for Amicus Chamber of Commerce of the United States of America. Lindsay C. Harrison, Jenner and Block LLP, Washington, DC, for Amicus Eric Goldman. Christopher Patrick Eby, Pro Hac Vice, King & Spalding, Atlanta, GA, Tamra Moore, King & Spalding, Washington, DC, for Amici Chamber of Progress, IP Justice, LGBT Tech Institute. Lauren Gallo White, Wilson Sonsini Goodrich and Rosati, PC, San Francisco, CA, for Amicus Computer & Communications Industry Association. Alan Jay Butler, Electronic Privacy Information Center, Washington, DC, for Amici Electronic Privacy Information Center, Reset Tech. Laura R. Garrett, Social Media Victims Law Center, Seattle, WA, for Amici Fairplay, The Public Health Advocacy Institute, Childrens Advocacy Institute, Center for Humane Technology, Spark and Stitch, Accountable Tech, Center for Digital Democracy, Design It For US, North Carolina Young Peoples Alliance, Encode Justice, Civics Unplugged, Archewell Foundation, 5 Rights Foundation, EK, Common Sense Media, Ultraviolet. Dana R. 
Green, Pro Hac Vice, Legal Department, New York, NY, Samantha Chariz Hamilton, Pro Hac Vice, University of Georgia School of Law, Athens, GA, for Amici The New York Times Company, Student Press Law Center.



ORDER GRANTING MOTION FOR PRELIMINARY INJUNCTION

[Re: ECF 29] BETH LABSON FREEMAN, United States District Judge

This suit challenges the enforceability of the California Age-Appropriate Design Code Act ("the CAADCA" or "the Act"), which was recently enacted for the stated purpose of affording protections to children when they access the internet. See Cal. Civ. Code § 1798.99.29. The Act applies to for-profit businesses that collect consumers' personal information and satisfy other criteria relating to business size and revenue. See CAADCA § 30; Cal. Civ. Code § 1798.140. Effective July 1, 2024, the Act imposes a number of requirements on any covered business that "provides an online service, product, or feature likely to be accessed by children." CAADCA § 31.

The CAADCA is codified at California Civil Code §§ 1798.99.28-1798.99.40. When citing to the Act, the Court will cite to the statute's abbreviated title and last two digits. For example, the Court will cite to Cal. Civil Code § 1798.99.31 as "CAADCA § 31."

Plaintiff NetChoice, LLC ("NetChoice") "is a national trade association of online businesses that share the goal of promoting free speech and free enterprise on the Internet." Compl. ¶ 5, ECF 1. NetChoice's members include Google, Amazon, Meta, TikTok and many other companies with strong online presences. NetChoice sues Defendant Rob Bonta, Attorney General of the State of California ("the State"), for declaratory and injunctive relief related to the CAADCA, which it asserts is both facially unconstitutional and preempted by federal statute.

NetChoice moves for preliminary injunction based on its claims that the CAADCA violates the First Amendment and the dormant Commerce Clause of the United States Constitution, and is preempted by both the Children's Online Privacy Protection Act ("COPPA"), 15 U.S.C. §§ 6501-6506, and Section 230 of the Communications Decency Act, 47 U.S.C. § 230. See Mot., ECF 29. The State opposes the motion, arguing that the CAADCA regulates conduct—the collection and use of children's personal information—that does not implicate the First Amendment. See Opp'n, ECF 51. The State also contends that the CAADCA does not violate the dormant Commerce Clause and is not preempted by either COPPA or Section 230. See id.

Mindful that the CAADCA was enacted with the unanimous support of California's Legislature and Governor, the Court has given careful consideration to the motion, the State's opposition, NetChoice's reply, the supplemental briefs filed by both parties, the briefs filed by seven sets of amici curiae, and the oral arguments presented at the hearing on July 27, 2023. The Court finds that although the stated purpose of the Act—protecting children when they are online—clearly is important, NetChoice has shown that it is likely to succeed on the merits of its argument that the provisions of the CAADCA intended to achieve that purpose do not pass constitutional muster. Specifically, the Court finds that the CAADCA likely violates the First Amendment. The motion for preliminary injunction is GRANTED on that basis.

I. BACKGROUND

The internet has become indispensable to the exchange of information. Many online providers allow users to view content and access services without creating an account, while others require the creation of a free account to access services, and still others require users to pay fees. See Cairella Decl. ¶¶ 4-8, ECF 22; Masnick Decl. ¶¶ 5-6, ECF 29; Roin Decl. ¶¶ 7-9, ECF 25; Paolucci Decl. ¶ 2, ECF 28. Online providers generally rely on advertising to earn revenue that supports the content and services they offer. See Cairella Decl. ¶¶ 4, 21; Roin Decl. ¶ 10. Advertisements are targeted to users based on their interests, which are gleaned from data collected from the users while they are online. See Egelman Decl. ¶¶ 13-14, ECF 51-1. Such data also is used by online providers to tailor content to individual users. See Cairella Decl. ¶ 8; Roin Decl. ¶¶ 2-6. In addition, online providers may sell user data to third parties. See Egelman Decl. ¶ 11.

Users can manage their online privacy by reading privacy policies before engaging with the provider's services. See Egelman Decl. ¶ 24. Users also may change their privacy settings to block or delete "cookies," which are data that websites store in consumers' web browsers and transmit back to the websites on subsequent visits. See id. ¶ 29. However, privacy policies can be difficult to understand and privacy settings are not always user friendly. See id. ¶¶ 24-30.

These privacy concerns have become increasingly relevant to children, because their internet use has grown dramatically in recent years. See Radesky Decl. ¶¶ 21-25, ECF 51-5. During the COVID-19 pandemic, children's access to digital technology and time online went up significantly. See id. ¶ 26. Children's time online increased approximately 52% during the pandemic, and heavier technology use habits have persisted. See id. Children depend on the internet for both educational and entertainment purposes. See id. ¶¶ 26-29. Unplugging is not a viable option. See id. ¶ 29.

A federal child privacy law, COPPA, limits the ability of online providers to collect personal information from children. See 15 U.S.C.A. §§ 6501-06. COPPA makes it "unlawful for an operator of a website or online service directed to children, or any operator that has actual knowledge that it is collecting personal information from a child, to collect personal information from a child in a manner that violates the regulations prescribed" under the statute. 15 U.S.C. § 6502(a)(1). "Child" is defined as an individual under the age of 13. 15 U.S.C. § 6501(1). The applicable regulations require the operator to obtain parental consent prior to any collection, use, or disclosure of personal information from children. See 16 C.F.R. § 312.3(b).

The California Consumer Privacy Act ("CCPA") imposes limits on the collection of personal information from users generally, requiring among other things that online providers inform users of the categories of personal information to be collected and the purposes of such collection. See Cal. Civ. Code § 1798.100(a)(1). The CCPA defines "personal information" to include any information that "relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household." Cal. Civ. Code § 1798.140(v).

It is against this backdrop that the CAADCA was enacted. The CAADCA goes far beyond the scope of protections offered by COPPA and the CCPA. Whereas COPPA limits the collection of user data by operators of websites and services "directed to children," 15 U.S.C. § 6502(a)(1), the CAADCA "declares that children should be afforded protections not only by online products and services specifically directed at them but by all online products and services they are likely to access," CAADCA § 29. COPPA protects children under the age of 13, see 15 U.S.C. § 6501(1), while the CAADCA protects children under the age of 18, see CAADCA § 30(b)(1). COPPA gives parents authority to make decisions about use of their children's personal information, see 16 C.F.R. § 312.3(b), and the CCPA gives users authority to make decisions about their own personal information, see Cal. Civ. Code § 1798.135. In contrast, the CAADCA requires online providers to create a Data Protection Impact Assessment ("DPIA") report identifying, for each offered online service, product, or feature likely to be accessed by children, any risk of material detriment to children arising from the provider's data management practices. See CAADCA § 30(a)(1). Providers must create a "timed plan to mitigate or eliminate" the risks identified in the DPIA "before the online service, product, or feature is accessed by children," id. § 30(a)(2), and must provide the DPIA reports to the California Attorney General upon written request, see id. § 30(a)(2). The CAADCA also requires that online providers comply with a list of enumerated mandates and prohibitions, discussed in detail below. See id. § 31(a)-(b).

Covered businesses must complete the required DPIA reports and satisfy related requirements by July 1, 2024, and continue to do so on an ongoing basis. See CAADCA §§ 31, 33. The CAADCA authorizes the California Attorney General to bring a civil enforcement action against any business that fails to comply with the Act's requirements. See id. § 35. Violators are subject to civil penalties of $2,500 per child for each negligent violation and $7,500 for each intentional violation. See id.

NetChoice filed this suit on December 14, 2022, challenging the CAADCA as facially unconstitutional and preempted by federal statute. The complaint asserts the following claims: (1) violation of the First and Fourteenth Amendments to the U.S. Constitution, and Article I, Section 2(a) of the California Constitution; (2) violation of the Fourth Amendment to the U.S. Constitution; (3) void for vagueness under the First Amendment and Due Process Clause of the U.S. Constitution, and Article I, Section 7(a) of the California Constitution; (4) violation of the dormant Commerce Clause of the U.S. Constitution; (5) preemption by COPPA; and (6) preemption by Section 230. Compl. ¶¶ 76-122. The complaint requests declaratory and injunctive relief prohibiting enforcement of the CAADCA.

NetChoice now seeks a preliminary injunction enjoining enforcement of the CAADCA pending disposition of the suit.

II. LEGAL STANDARD

"Courts consider four factors in deciding whether to grant a preliminary injunction: the plaintiff's likelihood of success on the merits; her likelihood of suffering irreparable harm in the absence of preliminary relief; whether the balance of equities tips in her favor; and whether an injunction is in the public interest." Garcia v. City of Los Angeles, 11 F.4th 1113, 1118 (9th Cir. 2021) (citing Winter v. Nat. Res. Def. Council, Inc., 555 U.S. 7, 20, 129 S.Ct. 365, 172 L.Ed.2d 249 (2008)).

In this circuit, "[l]ikelihood of success on the merits is the most important factor." Apartment Ass'n of L.A. Cnty., Inc. v. City of Los Angeles, 10 F.4th 905, 912 (9th Cir. 2021) (quoting California v. Azar, 911 F.3d 558, 575 (9th Cir. 2018)). "It is well-established that the first factor is especially important when a plaintiff alleges a constitutional violation and injury." Baird v. Bonta, 81 F.4th 1036, 1040 (9th Cir. Sept. 7, 2023). "If a plaintiff in such a case shows he is likely to prevail on the merits, that showing usually demonstrates he is suffering irreparable harm no matter how brief the violation." Id. Finally, "[w]hen, like here, the nonmovant is the government, the last two Winter factors merge." Id. at 1040 (quotation marks and citation omitted).

Where the plaintiff cannot show a likelihood of success on the merits, " 'serious questions going to the merits' and a hardship balance that tips sharply toward the plaintiff can support issuance of an injunction, assuming the other two elements of the Winter test are also met." All. for the Wild Rockies v. Cottrell, 632 F.3d 1127, 1132 (9th Cir. 2011). The Court need not apply this alternative formulation of the Winter test here because, as discussed below, NetChoice makes a strong showing on likelihood of success and on the other Winter factors.

III. DISCUSSION

A. Likelihood of Success on the Merits

NetChoice argues that it is likely to succeed on the merits of its claims that the Act violates free speech rights under the First Amendment (Claims 1 and 3), violates the dormant Commerce Clause (Claim 4), and is preempted by both COPPA (Claim 5) and Section 230 (Claim 6). See Mot. 1; Compl. ¶¶ 76-122.

1. First Amendment (Claims 1 and 3)

Claim 1 asserts that the CAADCA violates the First Amendment because it is an unlawful prior restraint on protected speech, is unconstitutionally overbroad, and regulates protected expression but fails strict scrutiny or any lesser standard of scrutiny that may apply. See Compl. ¶¶ 76-88. Claim 3 asserts that the CAADCA is void for vagueness under the First Amendment. See id. ¶¶ 93-103. NetChoice argues that it is likely to succeed on its First Amendment claims because the CAADCA: (1) is an unlawful prior restraint; (2) is unconstitutionally overbroad; (3) is void for vagueness; and (4) is subject to and fails strict scrutiny. Mot. 7-22.

Before taking up these arguments, the Court notes that both parties appear to have accepted the relaxed standard for standing in a First Amendment facial challenge. That is, although the general rule of standing is that a party may not challenge a statute's constitutionality "on the ground that it may conceivably be applied unconstitutionally to others," Broadrick v. Oklahoma, 413 U.S. 601, 610, 93 S.Ct. 2908, 37 L.Ed.2d 830 (1973), a party making a First Amendment claim has standing to challenge the impact of a regulation on both "its own expressive activities, as well as those of others," S.O.C. Inc. v. County of Clark, 152 F.3d 1136, 1142 (9th Cir. 1998). Accordingly, the parties have made—and the Court will consider—arguments about the CAADCA's alleged impact on the expressive activities of individuals and entities who are not NetChoice members.

Turning to NetChoice's four First Amendment arguments on likelihood of success, the Court first addresses the argument that the Act regulates protected expression and fails the applicable level of scrutiny. Because the argument is dispositive, the Court need not address NetChoice's additional First Amendment arguments based on prior restraint, overbreadth, and vagueness.

a. Legal Framework re Scrutiny for Regulations of Speech

"The First Amendment generally prevents government from proscribing speech, [ ] or even expressive conduct, [ ] because of disapproval of the ideas expressed." R.A.V. v. City of St. Paul, 505 U.S. 377, 382, 112 S.Ct. 2538, 120 L.Ed.2d 305 (1992) (internal citations omitted). A law compelling speech is no less subject to First Amendment scrutiny than a law prohibiting speech. Frudden v. Pilling, 742 F.3d 1199, 1203 (9th Cir. 2014) (citing W. Va. State Bd. of Educ. v. Barnette, 319 U.S. 624, 633-34, 63 S.Ct. 1178, 87 L.Ed. 1628 (1943)).

The threshold question in a free speech analysis is whether the challenged law invokes the First Amendment at all. See Int'l Franchise Ass'n v. City of Seattle, 803 F.3d 389, 408 (9th Cir. 2015). "All manner of speech—from 'pictures, films, paintings, drawings, and engravings,' to 'oral utterance and the printed word'—qualify for the First Amendment's protections; no less can hold true when it comes to speech . . . conveyed over the Internet." 303 Creative LLC v. Elenis, 600 U.S. 570, 143 S. Ct. 2298, 2312, 216 L.Ed.2d 1131 (2023) (citations omitted). That is, the First Amendment's protections apply not only to written or verbal speech, but to any expressive conduct. See, e.g., Ward v. Rock Against Racism, 491 U.S. 781, 790, 109 S.Ct. 2746, 105 L.Ed.2d 661 (1989) ("Music, as a form of expression and communication, is protected under the First Amendment."). In determining whether a law regulates protected expression, courts evaluate "whether [activity] with a 'significant expressive element' drew the legal remedy or the ordinance has the inevitable effect of 'singling out those engaged in expressive activity.' " Int'l Franchise, 803 F.3d at 408 (quoting Arcara v. Cloud Books, Inc., 478 U.S. 697, 706-07, 106 S.Ct. 3172, 92 L.Ed.2d 568 (1986)). For example, a tax on paper and ink that in effect "single[s] out the press for special treatment" regulates protected expression, although the application of a general sales tax to newspapers does not. See Minneapolis Star & Tribune Co. v. Minn. Comm'r of Revenue, 460 U.S. 575, 581-82, 103 S.Ct. 1365, 75 L.Ed.2d 295 (1983). A regulation that restricts conduct without a "significant expressive element" is not subject to any level of First Amendment scrutiny. See HomeAway.com, Inc. v. City of Santa Monica, 918 F.3d 676, 684 (9th Cir. 2019); see also Sorrell v. IMS Health Inc., 564 U.S. 552, 567, 131 S.Ct. 
2653, 180 L.Ed.2d 544 (2011) ("[T]he First Amendment does not prevent restrictions directed at commerce or conduct from imposing incidental burdens on speech.").

If a court finds that a challenged law regulates some manner of protected expression, it must then "determine the scope of the [regulated] speech" in order to apply the appropriate level of scrutiny. Yim v. City of Seattle, 63 F.4th 783, 791 (9th Cir. 2023). There are several levels of scrutiny that may apply, depending on the type of expression at issue.

i. Strict Scrutiny

If the challenged regulation restricts only non-commercial speech, the level of scrutiny depends on whether the law is content based or content neutral. "Government regulation of speech is content based if a law applies to particular speech because of the topic discussed or the idea or message expressed," that is, if the regulation "draws distinctions based on the message a speaker conveys." Reed v. Town of Gilbert, 576 U.S. 155, 163, 135 S.Ct. 2218, 192 L.Ed.2d 236 (2015) (citations omitted). A law is also content based if, even though facially neutral, it "cannot be justified without reference to the content of the regulated speech, or . . . were adopted by the government because of disagreement with the message the speech conveys." Id. at 164, 135 S.Ct. 2218 (internal punctuation marks and citation omitted). If the court determines a law is content based, it applies strict scrutiny, "regardless of the government's benign motive, content-neutral justification, or lack of 'animus toward the ideas contained' in the regulated speech." Porter v. Martinez, 68 F.4th 429, 439 (9th Cir. 2023) (citations omitted). Strict scrutiny "requires the Government to prove that the restriction furthers a compelling interest and is narrowly tailored to achieve that interest." Reed, 576 U.S. at 171, 135 S.Ct. 2218; see also Berger v. City of Seattle, 569 F.3d 1029, 1050 (9th Cir. 2009) ("Under that standard [of strict scrutiny], the regulation is valid only if it is the least restrictive means available to further a compelling government interest.") (citing United States v. Playboy Ent. Grp., Inc., 529 U.S. 803, 813, 120 S.Ct. 1878, 146 L.Ed.2d 865 (2000)).

ii. Intermediate Scrutiny

"By contrast, a content-neutral regulation of [non-commercial] expression must meet the less exacting standard of intermediate scrutiny." Porter, 68 F.4th at 439 (citation omitted). Under this lower standard, "a regulation is constitutional 'if it furthers an important or substantial governmental interest; if the governmental interest is unrelated to the suppression of free expression; and if the incidental restriction on alleged First Amendment freedoms is no greater than is essential to the furtherance of that interest.' " Id. (quoting United States v. O'Brien, 391 U.S. 367, 377, 88 S.Ct. 1673, 20 L.Ed.2d 672 (1968)).

iii. Commercial Speech Scrutiny

If a statute regulates only commercial speech—i.e., " 'expression related solely to the economic interests of the speaker and its audience' " that "does no more than propose a commercial transaction," Am. Acad. of Pain Mgmt. v. Joseph, 353 F.3d 1099, 1106 (9th Cir. 2004) (citations omitted)—the court applies commercial speech scrutiny as established by Central Hudson Gas & Electric Corp. v. Public Service Commission of New York, 447 U.S. 557, 100 S.Ct. 2343, 65 L.Ed.2d 341 (1980). First, commercial speech is not entitled to any First Amendment protection if it is misleading or related to illegal activity. Cent. Hudson, 447 U.S. at 563-64, 100 S.Ct. 2343; see also, e.g., Thompson v. W. States Med. Ctr., 535 U.S. 357, 367, 122 S.Ct. 1497, 152 L.Ed.2d 563 (2002). For all other commercial speech, the court asks "whether the asserted governmental interest is substantial," "whether the regulation directly advances the governmental interest," and "whether [the regulation] is not more extensive than is necessary to serve that interest." Retail Digital Network, LLC v. Prieto, 861 F.3d 839, 844 (9th Cir. 2017) (quoting Cent. Hudson, 447 U.S. at 566, 100 S.Ct. 2343). The regulation is constitutional only if the answer to all three questions is "yes." See id. This analysis applies to commercial speech regardless of whether the regulation is content based or content neutral. Yim, 63 F.4th at 793 n.14 (citing Valle Del Sol, Inc. v. Whiting, 709 F.3d 808, 820 (9th Cir. 2013)).

The Court will use the phrase "commercial speech scrutiny" in this order to refer to the "intermediate scrutiny standard codified in Central Hudson." Yim, 63 F.4th at 793.

iv. Scrutiny where Commercial and Non-Commercial Speech is Inextricably Intertwined

Finally, if a law regulates expression that "inextricably intertwines" commercial and non-commercial components, the court does not "apply[ ] one test to one phrase and another test to another phrase," but instead treats the entire expression as non-commercial speech and applies the appropriate level of scrutiny. Riley v. Nat'l Fed'n of the Blind of N.C., Inc., 487 U.S. 781, 796, 108 S.Ct. 2667, 101 L.Ed.2d 669 (1988) (applying strict scrutiny to content-based regulation of solicitation of charitable contributions by professional fundraisers while assuming professional fundraiser's financial motivation for solicitation intertwined commercial interest with non-commercial advocacy).

With these principles in mind, the Court now assesses whether NetChoice has shown that it is likely to succeed both in establishing that the CAADCA regulates protected expression, and in establishing that the CAADCA fails the applicable level of scrutiny.

b. Protected Expression or Non-Expressive Conduct

NetChoice argues that the CAADCA regulates speech by requiring internet content providers to take various actions to protect minors from harmful messages, such as making content-based assessments about potential harm to minors in order to comply with the DPIA requirement, and necessarily reviewing content to adhere to the Act's content policy enforcement provision. See Mot. 19-21. The State argues that the Act merely regulates business practices regarding the collection and use of children's data, so that its restrictions are only of nonexpressive conduct that is not entitled to First Amendment protection. See Opp'n 10-12. The State further contends that the Act does not restrict speech because it does not prevent any particular content from being shown to a minor—even if the content provider knows it would be harmful—as long as the content provider does not use the minor's personal information to do so. See id. at 12.

In evaluating whether the CAADCA regulates protected expression, the Court first notes that determining whether the statute applies to a business will often require viewing the content of the online service, product, or feature to evaluate whether it is "likely to be accessed by children" because, for example, it contains "advertisements marketed to children." CAADCA §§ 29(b)(4)(C), 31(a). But having to view content to determine whether the statute applies does not by itself mean that the statute regulates speech. See, e.g., Am. Soc'y of Journalists & Authors, Inc. v. Bonta, 15 F.4th 954, 960-61 (9th Cir. 2021) (finding law classifying workers as employees or independent contractors based on criteria including whether worker's output was "to be appreciated primarily or solely for its imaginative, aesthetic, or intellectual content" did not regulate speech) (citing Cal. Labor Code § 2778(b)(2)(F)(ii)). The question is whether the law at issue regulates expression "because of its message, its ideas, its subject matter, or its content." Id. at 960 (quoting Reed, 576 U.S. at 163, 135 S.Ct. 2218). The Court will evaluate this question first with respect to those portions of the statute that prohibit certain actions, see CAADCA § 31(b), and then turn to the sections of the statute mandating specific acts, see id. § 31(a).

i. The Act's Prohibitions (CAADCA § 31(b))

The CAADCA's prohibitions forbid the for-profit entities covered by the Act from engaging—with some exceptions—in the collection, sale, sharing, or retention of children's personal information, including precise geolocation information, for profiling or other purposes. See generally id. § 31(b). The State argues that the CAADCA's regulation of "collection and use of children's personal information" is akin to laws that courts have upheld as regulating economic activity, business practices, or other conduct without a significant expressive element. Opp'n 11-12 (citations omitted). There are two problems with the State's argument. First, none of the decisions cited by the State for this proposition involved laws that, like the CAADCA, restricted the collection and sharing of information. See id.; Rumsfeld v. Forum for Acad. & Inst. Rights, Inc., 547 U.S. 47, 66, 126 S.Ct. 1297, 164 L.Ed.2d 156 (2006) (statute denying federal funding to educational institutions restricting military recruiting did not regulate "inherently expressive" conduct because expressive nature of act of preventing military recruitment necessitated explanatory speech); Roulette v. City of Seattle, 97 F.3d 300, 305 (9th Cir. 1996) (ordinance prohibiting sitting or lying on sidewalk did not regulate "forms of conduct integral to, or commonly associated with, expression"); Int'l Franchise, 803 F.3d at 397-98, 408 (minimum wage increase ordinance classifying franchisees as large employers "exhibit[ed] nothing that even the most vivid imagination might deem uniquely expressive") (citation omitted); HomeAway.com, 918 F.3d at 680, 685 (ordinance regulating forms of short-term rentals was "plainly a housing and rental regulation" that "regulate[d] nonexpressive conduct—namely, booking transactions"); Am. Soc'y of Journalists & Authors, 15 F.4th at 961-62 (law governing classification of workers as employees or independent contractors "regulate[d] economic activity rather than speech").

Second, in a decision evaluating a Vermont law restricting the sale, disclosure, and use of information about the prescribing practices of individual doctors—which pharmaceutical manufacturers used to better target their drug promotions to doctors—the Supreme Court held the law to be an unconstitutional regulation of speech, rather than conduct. Sorrell, 564 U.S. at 557, 562, 570-71, 131 S.Ct. 2653. The Supreme Court noted that it had previously held the "creation and dissemination of information are speech within the meaning of the First Amendment," 564 U.S. at 570, 131 S.Ct. 2653 (citing Bartnicki v. Vopper, 532 U.S. 514, 527, 121 S.Ct. 1753, 149 L.Ed.2d 787 (2001); Rubin v. Coors Brewing Co., 514 U.S. 476, 481, 115 S.Ct. 1585, 131 L.Ed.2d 532 (1995); Dun & Bradstreet, Inc. v. Greenmoss Builders, Inc., 472 U.S. 749, 759, 105 S.Ct. 2939, 86 L.Ed.2d 593 (1985) (plurality opinion)), and further held that even if the prescriber information at issue was a commodity, rather than speech, the law's "content- and speaker-based restrictions on the availability and use of . . . identifying information" constituted a regulation of speech, id. at 570-71, 131 S.Ct. 2653; see also id. at 568, 131 S.Ct. 2653 ("An individual's right to speak is implicated when information he or she possesses is subject to 'restraints on the way in which the information might be used' or disseminated.") (quoting Seattle Times Co. v. Rhinehart, 467 U.S. 20, 32, 104 S.Ct. 2199, 81 L.Ed.2d 17 (1984)).

The State argues that Sorrell does not necessitate the conclusion that the CAADCA's prohibitions regulate speech because Sorrell (1) does not hold that a business has a right to collect data from individuals, and (2) is generally distinguishable on the facts because the physicians described in Sorrell, whose information was collected, were willing participants in the data generation who had the power to restrict the use of their information. See July 27, 2023 Hr'g Tr. ("Tr.") 27:16-31:13; Opp'n 11-12; see also id. 1 ("Plaintiff's members do not have a First Amendment right to children's personal information."). As for the first point, the State is correct that Sorrell does not address any general right to collect data from individuals. In fact, the Supreme Court noted that the "capacity of technology to find and publish personal information . . . presents serious and unresolved issues with respect to personal privacy and the dignity it seeks to secure." Sorrell, 564 U.S. at 579-80, 131 S.Ct. 2653. But whether there is a general right to collect data is independent from the question of whether a law restricting the collection and sale of data regulates conduct or speech. Under Sorrell, the unequivocal answer to the latter question is that a law that—like the CAADCA—restricts the "availability and use" of information by some speakers but not others, and for some purposes but not others, is a regulation of protected expression. Id. at 570-71, 131 S.Ct. 2653. The State's attempt to distinguish Sorrell based on the physicians' ability to prevent their information from being collected, see Tr. 31:7-10, is not persuasive because the Supreme Court concluded that the law at issue regulated speech based on its restrictions on the use of the information after it was collected, without including any reasoning about the nature of the source of the information. See Sorrell, 564 U.S. at 570-71, 131 S.Ct. 2653.

Accordingly, the Court finds that NetChoice is likely to succeed in showing that the Act's prohibitions—which restrict covered businesses from "[c]ollect[ing], sell[ing], shar[ing], or retain[ing] any personal information" for most purposes, see, e.g., CAADCA § 31(b)(3)—limit the "availability and use" of information by certain speakers and for certain purposes and thus regulate protected speech.

ii. The Act's Mandates (CAADCA § 31(a))

The Act's ten statutory mandates are more varied than the prohibitions. See generally CAADCA §§ 31(a)(1)-(10). One of the main requirements of the Act is that companies create DPIA reports identifying, for each offered online service, product, or feature likely to be accessed by children, any risk of material detriment to children arising from the business's data management practices. Id. §§ 31(a)(1)-(4). For example, a DPIA report must assess whether the "design of the online service, product, or feature could harm children, including by exposing children to harmful, or potentially harmful, content on the online service, product, or feature." Id. § 31(a)(1)(B). Each business must then create a "timed plan to mitigate or eliminate" the risks identified in the DPIA "before the online service, product, or feature is accessed by children," id. § 31(a)(2), and provide a list of all DPIA reports and the reports themselves to the state Attorney General upon written request, id. § 31(a)(3)-(4).

The State contended at oral argument that the DPIA report requirement merely "requires businesses to consider how the product's use design features, like nudging to keep a child engaged to extend the time the child is using the product" might harm children, and that the consideration of such features "has nothing to do with speech." Tr. 19:14-20:5; see also id. at 23:5-6 ("[T]his is only assessing how your business models . . . might harm children."). The Court is not persuaded by the State's argument because "assessing how [a] business model[ ] . . . might harm children" facially requires a business to express its ideas and analysis about likely harm. It therefore appears to the Court that NetChoice is likely to succeed in its argument that the DPIA provisions, which require covered businesses to identify and disclose to the government potential risks to minors and to develop a timed plan to mitigate or eliminate the identified risks, regulate the distribution of speech and therefore trigger First Amendment scrutiny. See Reply 2, ECF 60; Sorrell, 564 U.S. at 570, 131 S.Ct. 2653 ("This Court has held that the creation and dissemination of information are speech within the meaning of the First Amendment.") (citations omitted).

Several sections require businesses to affirmatively provide information to users, and, by requiring speech, necessarily regulate it. See CAADCA § 31(a)(7) (requiring businesses to "[p]rovide any privacy information . . . concisely, prominently, and using clear language suited to the age of children likely to access that online service, product, or feature"); id. § 31(a)(8) (requiring that businesses "provide an obvious signal to [a] child" if the child is being tracked or monitored by a parent or guardian via an online service, product, or feature); id. § 31(a)(10) ("Provide prominent, accessible, and responsive tools to help children . . . exercise their privacy rights and report concerns."); see also, e.g., Rubin, 514 U.S. at 481, 115 S.Ct. 1585 (holding "information on beer labels" constitutes speech). The CAADCA also requires a covered business to enforce its "published terms, policies, and community standards"—i.e., its content moderation policies. CAADCA § 31(a)(9). Although the State argues that the policy enforcement provision does not regulate speech because businesses are free to create their own policies, it appears to the Court that NetChoice's position that the State has no right to enforce obligations that would essentially press private companies into service as government censors, thus violating the First Amendment by proxy, is better grounded in the relevant binding and persuasive precedent. See Mot. 11; Playboy Ent. Grp., 529 U.S. at 806, 120 S.Ct. 1878 (finding statute requiring cable television operators providing channels with content deemed inappropriate for children to take measures to prevent children from viewing content was unconstitutional regulation of speech); NetChoice, LLC v. Att'y Gen., Fla. ("NetChoice v. Fla."), 34 F.4th 1196, 1213 (11th Cir. 2022) ("When platforms choose to remove users or posts, deprioritize content in viewers' feeds or search results, or sanction breaches of their community standards, they engage in First-Amendment-protected activity."); Engdahl v. City of Kenosha, 317 F. Supp. 1133, 1135-36 (E.D. Wis. 1970) (holding ordinance restricting minors from viewing certain movies based on ratings provided by Motion Picture Association of America impermissibly regulated speech).

The remaining two sections of the CAADCA require businesses to estimate the age of child users and provide them with a high default privacy setting, or forgo age estimation and provide the high default privacy setting to all users. CAADCA §§ 31(a)(5)-(6). The State argues that "[r]equiring businesses to protect children's privacy and data implicates neither protected speech nor expressive conduct," and notes that the provisions "say[ ] nothing about content and do[ ] not require businesses to block any content for users of any age." Opp'n 15. However, the materials before the Court indicate that the steps a business would need to take to sufficiently estimate the age of child users would likely prevent both children and adults from accessing certain content. See Amicus Curiae Br. of Prof. Eric Goldman ("Goldman Am. Br.") 4-7 (explaining that age assurance methods create time delays and other barriers to entry that studies show cause users to navigate away from pages), ECF 34-1; Amicus Curiae Br. of New York Times Co. & Student Press Law Ctr. ("NYT Am. Br.") 6 (stating age-based regulations would "almost certain[ly] [cause] news organizations and others [to] take steps to prevent those under the age of 18 from accessing online news content, features, or services"), ECF 56-1. The age estimation and privacy provisions thus appear likely to impede the "availability and use" of information and accordingly to regulate speech. Sorrell, 564 U.S. at 570-71, 131 S.Ct. 2653.

The Court is keenly aware of the myriad harms that may befall children on the internet, and it does not seek to undermine the government's efforts to resolve internet-based "issues with respect to personal privacy and . . . dignity." See Sorrell, 564 U.S. at 579, 131 S.Ct. 2653; Def.'s Suppl. Br. 1 ("[T]he 'serious and unresolved issues' raised by increased data collection capacity due to technological advances remained largely unaddressed [in Sorrell]."). However, the Court is troubled by the CAADCA's clear targeting of certain speakers—i.e., a segment of for-profit entities, but not governmental or non-profit entities—that the Act would prevent from collecting and using the information at issue. As the Supreme Court noted in Sorrell, the State's arguments about the broad protections engendered by a challenged law are weakened by the law's application to a narrow set of speakers. See Sorrell, 564 U.S. at 580, 131 S.Ct. 2653 ("Privacy is a concept too integral to the person and a right too essential to freedom to allow its manipulation to support just those ideas the government prefers").

For the foregoing reasons, the Court finds that NetChoice is likely to succeed in showing that the CAADCA's prohibitions and mandates regulate speech, so that the Act triggers First Amendment scrutiny.

c. The Type of Speech Regulated by the CAADCA

Because the Court has found the CAADCA likely regulates protected speech, it must now determine what type of speech is at issue in order to apply the appropriate level of scrutiny. As described above, see Part III(A)(1)(a), strict scrutiny applies to a law regulating non-commercial speech in a content-based manner, meaning the law "target[s] speech based on its communicative content." Reed, 576 U.S. at 163, 135 S.Ct. 2218. To survive strict scrutiny, "the Government [must] prove that the restriction furthers a compelling interest and is narrowly tailored to achieve that interest." Id. at 171, 135 S.Ct. 2218. A content-neutral regulation of non-commercial speech, on the other hand, "is constitutional as long as it withstands intermediate scrutiny—i.e., if: (1) 'it furthers an important or substantial government interest'; (2) 'the governmental interest is unrelated to the suppression of free expression'; and (3) 'the incidental restriction on alleged First Amendment freedoms is no greater than is essential to the furtherance of that interest.' " Jacobs v. Clark Cnty. Sch. Dist., 526 F.3d 419, 434 (9th Cir. 2008) (quoting Turner Broad. Sys., Inc. v. FCC, 512 U.S. 622, 661-62, 114 S.Ct. 2445, 129 L.Ed.2d 497 (1994)). And if the speech at issue is commercial, courts apply intermediate scrutiny under the four-part test articulated by the Supreme Court in Central Hudson, which the Ninth Circuit has described as follows:

(1) [I]f "the communication is neither misleading nor related to unlawful activity," then it merits First Amendment scrutiny as a threshold matter; [and] in order for the restriction to withstand such scrutiny, (2) "[t]he State must assert a substantial interest to be achieved by restrictions on commercial speech;" (3) "the restriction must directly advance the state interest involved;" and (4) it must not be "more extensive than is necessary to serve that interest."
Metro Lights, L.L.C. v. City of Los Angeles, 551 F.3d 898, 903 (9th Cir. 2009) (quoting Cent. Hudson, 447 U.S. at 564-66, 100 S.Ct. 2343); see also Junior Sports Mags. Inc. v. Bonta, 80 F.4th 1109, 1115-16 (9th Cir. Sept. 13, 2023).

NetChoice argues that the CAADCA regulates non-commercial speech because the speech at issue goes beyond proposing a commercial transaction, Reply 10, and that the speech is "content-based in many obvious respects" because its "very premise [is] that providers must prioritize content that promotes the 'well-being' of minors," Mot. 19. Accordingly, NetChoice contends that the Act is subject to strict scrutiny. See Mot. 19-21; Reply 9-10. The State counters that any protected expression regulated by the Act is at most commercial speech, so that the Act is subject to the lower level of scrutiny described in Central Hudson. Opp'n 19. The State argues that the Act affects how businesses persuade consumers to engage with their products—such as by posting policies that aid consumers in deciding whether to engage with certain products—and that consumer engagement in turn drives the regulated businesses' revenue. Id. Based on this revenue model, the State concludes that "there can be no doubt that regulated businesses have 'an economic motive for engaging in the [alleged] speech' with regard to the specific products—services likely to be accessed by children—that the Act regulates." Id. (quoting Am. Acad. of Pain Mgmt., 353 F.3d at 1106).

Based on the record before it, the Court finds it difficult to determine whether the Act regulates only commercial speech. NetChoice argues in fairly conclusory fashion that the Act "regulates speech that does far more than 'propose a commercial transaction' " and that the for-profit nature of a website "does not render [its] content commercial speech" because many covered businesses rely on advertisements to support the expressive content and services they provide. Reply 10; see Mot. 2, 19-21. NetChoice provides some support for the latter argument. See, e.g., Roin Decl. ¶ 10 (stating that the Goodreads application earns the vast majority of its revenue from advertising, including personalized advertisements targeted to registered users). However, the Court notes that some sections of the CAADCA, such as those prohibiting the sale of personal information, see generally CAADCA § 31(b), may well be analyzed as regulating only commercial speech. See, e.g., Hunt v. City of Los Angeles, 638 F.3d 703, 715-16 (9th Cir. 2011) (finding speech commercial because it was "directed to their products and why a consumer should buy them" and not "inextricably intertwined" with non-commercial speech). Ultimately, the Court finds that NetChoice has not provided sufficient material to demonstrate that it is likely to succeed in showing that the Act regulates either purely non-commercial speech or non-commercial speech that is inextricably intertwined with commercial speech. It is NetChoice's burden to make that showing in order to trigger application of strict scrutiny. See, e.g., Yim, 63 F.4th at 793 ("The parties on appeal dispute whether the Ordinance regulates commercial speech and calls for the application of intermediate scrutiny, or whether the Ordinance regulates [content-based] non-commercial speech and is subject to strict scrutiny review.").

However, as the Ninth Circuit reasoned in Yim, the Court "need not decide that question, . . . because [it] conclude[s] that the [Act] does not survive the intermediate scrutiny standard of review" for commercial speech. Id.; see also Junior Sports Mags., 80 F.4th at 1115 ("We need not decide this issue because 'the outcome is the same whether a special commercial speech inquiry or a stricter form of judicial scrutiny is applied.' ") (quoting Sorrell, 564 U.S. at 571, 131 S.Ct. 2653). Accordingly, the Court will assume for the purposes of the present motion that only the lesser standard of intermediate scrutiny for commercial speech applies because, as shown below, the outcome of the analysis here is not affected by the Act's evaluation under the lower standard of commercial speech scrutiny.

d. Application of Commercial Speech Scrutiny to the CAADCA

Under the standard for commercial speech scrutiny, if the regulation restricts speech that is neither misleading nor related to unlawful activity, it is the State's burden to show "at least that the statute directly advances a substantial governmental interest and that the measure is drawn to achieve that interest." Sorrell, 564 U.S. at 572, 131 S.Ct. 2653 (citations omitted); Junior Sports Mags., 80 F.4th at 1117 ("Under Central Hudson, a state seeking to justify a restriction on commercial speech bears the burden to prove that its law directly advances that [substantial] interest to a material degree."). That is, "the restriction must directly advance the state interest involved," and it must not be "more extensive than is necessary to serve that interest." Cent. Hudson, 447 U.S. at 566, 100 S.Ct. 2343. These "last two steps of the Central Hudson analysis basically involve a consideration of the fit between the legislature's ends and the means chosen to accomplish those ends." Hunt, 638 F.3d at 717 (quoting Rubin, 514 U.S. at 486, 115 S.Ct. 1585) (internal quotation marks omitted). The government need not employ the least restrictive means to advance its interest, but the means employed may not be "substantially excessive." Id. (quoting Bd. of Trs. of State Univ. of N.Y. v. Fox, 492 U.S. 469, 479, 109 S.Ct. 3028, 106 L.Ed.2d 388 (1989)).

i. Substantial State Interest

There is no dispute that the CAADCA regulates speech that is neither misleading nor related to unlawful activity. The Court thus turns directly to the question of whether the State can show a substantial state interest to which the CAADCA is geared. The State asserts a substantial interest in "protecting the physical, mental, and emotional health and well-being of minors." Def.'s Suppl. Br. 1-2; see also Opp'n 20 (describing substantial state interest in "safeguarding the physical and psychological well-being of a minor"); Tr. 71:6-13 (accord); id. at 74:25-75:3 ("[T]he government has a compelling interest [in] the nature of online space for children."). NetChoice does not dispute that "the well-being of children is a compelling interest in the abstract," but argues that the CAADCA does not identify a sufficiently concrete harm that the law addresses. Mot. 21-22. However, the State has presented evidence that children are currently harmed by lax data and privacy protections online. See Radesky Decl. ¶¶ 45-47 (privacy settings often allow unwanted contact), ¶¶ 64-68 (profiling leads to children being targeted with ads for monetization and extreme dieting). In light of this evidence, and given that the Supreme Court has repeatedly recognized a compelling interest in "protecting the physical and psychological well-being of minors," the Court finds that NetChoice is not likely to show that the State has not satisfied its burden of showing a substantial interest under the commercial speech scrutiny standard. Sable Comm'cns of Cal., Inc. v. FCC, 492 U.S. 115, 126, 109 S.Ct. 2829, 106 L.Ed.2d 93 (1989); see also New York v. Ferber, 458 U.S. 747, 756, 102 S.Ct. 3348, 73 L.Ed.2d 1113 (1982) ("It is evident beyond the need for elaboration that a State's interest in 'safeguarding the physical and psychological well-being of a minor' is 'compelling.' ") (quoting Globe Newspaper Co. v. Super. Ct., 457 U.S. 596, 607, 102 S.Ct. 2613, 73 L.Ed.2d 248 (1982)).

Because the State argues that the CAADCA satisfies both strict scrutiny and commercial speech scrutiny, it occasionally describes its interest as "compelling," rather than "substantial." See, e.g., Opp'n 19-20. The Court treats those arguments as supporting the State's position that it has a substantial state interest as required by the commercial speech scrutiny standard.

ii. Means-Ends Fit

After the State shows a substantial interest, the Court evaluates the commercial speech regulation under the last two prongs of the Central Hudson analysis, i.e., whether the "restriction . . . directly advance[s] the state interest involved" and whether it is not "more extensive than is necessary to serve that interest." Metro Lights, L.L.C., 551 F.3d at 903 (quoting Cent. Hudson, 447 U.S. at 564-66, 100 S.Ct. 2343). As noted above, the "last two steps of the Central Hudson analysis basically involve a consideration of the fit between the legislature's ends and the means chosen to accomplish those ends." Hunt, 638 F.3d at 717 (citation omitted). Once again, it is the State's burden to show that the statute satisfies the standards set forth by Central Hudson. Junior Sports Mags., 80 F.4th at 1115-16 (citations omitted); see also Sorrell, 564 U.S. at 572, 131 S.Ct. 2653.

NetChoice argues that certain provisions of the CAADCA—namely, CAADCA §§ 31(a)(1)-(7), 31(a)(9), 31(b)(1)-(4), and 31(b)(7)—fail commercial speech scrutiny, and that the entire statute must be enjoined because the invalid provisions are not severable from the otherwise valid remainder. See Pl.'s Suppl. Br. in Supp. of Mot. for Prelim. Inj. ("Pl.'s Suppl. Br."), ECF 71, at 2-7. The State argues that all of the mandates and prohibitions of the CAADCA satisfy commercial speech scrutiny because each provision is appropriately tailored to the State's substantial interest in protecting the physical, mental, and emotional health and well-being of minors. See Def.'s Suppl. Br. 2-7. The Court will first address whether the specific provisions of the Act challenged by NetChoice survive commercial speech scrutiny before turning to the issue of severability.

The Court refers to those portions of the Act not challenged by NetChoice as a "valid remainder" for the purposes of its decision on the motion for preliminary injunction, but does not intend to suggest it has conducted an analysis and found those unchallenged provisions to be legally valid.

(1) DPIA Report Requirement (CAADCA § 31(a)(1)-(4))

The State contends that the CAADCA's DPIA report requirement furthers its substantial interest in protecting children's safety because the provisions will cause covered businesses to proactively assess "how their products use children's data and whether their data management practices or product designs pose risks to children," so that "fewer children will be subject to preventable harms." Def.'s Suppl. Br. 2-3. According to the State's expert, "[c]hildren's digital risks and opportunity are shaped by the design of digital products, services, and features," and businesses currently take a reactive approach by removing problematic features only after harm is discovered. See Radesky Decl. ¶ 40 (emphasis added). For example, the mobile application Snapchat ended the use of a speed filter after the feature was linked to dangerous incidents of reckless driving by adolescents. Id. ¶ 41.

Accepting the State's statement of the harm it seeks to cure, the Court concludes that the State has not met its burden to demonstrate that the DPIA provisions in fact address the identified harm. For example, the Act does not require covered businesses to assess the potential harm of product designs—which Dr. Radesky asserts cause the harm at issue—but rather of "the risks of material detriment to children that arise from the data management practices of the business." CAADCA § 31(a)(1)(B) (emphasis added). And more importantly, although the CAADCA requires businesses to "create a timed plan to mitigate or eliminate the risk before the online service, product, or feature is accessed by children," id. § 31(a)(2), there is no actual requirement to adhere to such a plan. See generally id. § 31(a)(1)-(4); see also Tr. 26:9-10 ("As long as you write the plan, there is no way to be in violation."), ECF 66.

"A restriction 'directly and materially advances' the government's interests if the government can show 'the harms it recites are real and that its restriction will in fact alleviate them to a material degree.' " Yim, 63 F.4th at 794 (quoting Fla. Bar v. Went For It, Inc., 515 U.S. 618, 626, 115 S.Ct. 2371, 132 L.Ed.2d 541 (1995)). Because the DPIA report provisions do not require businesses to assess the potential harm of the design of digital products, services, and features, and also do not require actual mitigation of any identified risks, the State has not shown that these provisions will "in fact alleviate [the identified harms] to a material degree." Id. The Court accordingly finds that NetChoice is likely to succeed in showing that the DPIA report provisions provide "only ineffective or remote support for the government's purpose" and do not "directly advance" the government's substantial interest in promoting a proactive approach to the design of digital products, services, and feature. Id. (citations omitted). NetChoice is therefore likely to succeed in showing that the DPIA report requirement does not satisfy commercial speech scrutiny. See Junior Sports Mags., 80 F.4th at 1116 ("Because California fails to satisfy its burden to justify the proposed speech restriction, [Plaintiff] is likely to prevail on the merits of its First Amendment claim.").

(2) Age Estimation (CAADCA § 31(a)(5))

The CAADCA requires that covered businesses "[e]stimate the age of child users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business or apply the privacy and data protections afforded to children to all consumers." CAADCA § 31(a)(5). The State argues that CAADCA § 31(a)(5) promotes the well-being of children by requiring covered businesses to "provide data and privacy protections to users based on estimated age or, if the business does not estimate age, apply child-appropriate data and privacy protections to all users." Def.'s Suppl. Br. 3. This argument relies on the state legislature's finding that greater data privacy "necessarily means greater security and well-being." Id. (quoting AB 2273 § 1(a)(4)). NetChoice counters that the age estimation provision does not directly advance the State's substantial interest in children's well-being because the practical process of such estimation involves further information collection that is itself invasive. See Reply 5-6; Goldman Am. Br. 2-4.

The Court notes that the age estimation provision does not itself require any specific protections; the required data and privacy protections for either minors (if the business estimates age) or all users (if the business does not estimate age) are set forth in the remainder of the statute, and especially at CAADCA §§ 31(b)(1)-(8).

As described above, for the Act to survive commercial speech scrutiny, the State must show that the CAADCA's challenged provisions directly advance a substantial government interest by materially alleviating real harms. See Yim, 63 F.4th at 794; Junior Sports Mags., 80 F.4th at 1116-17. Based on the materials before the Court, the CAADCA's age estimation provision appears not only unlikely to materially alleviate the harm of insufficient data and privacy protections for children, but actually likely to exacerbate the problem by inducing covered businesses to require consumers, including children, to divulge additional personal information. The State argues that age estimation is distinct from the more onerous exercise of age verification, that the statute requires only a level of estimation that is appropriate to the risk presented by a business's data management practices, and that there are "minimally invasive" age estimation tools, some of which are already used by NetChoice's member companies. See Opp'n 15-16. But even the evidence cited by the State about the supposedly minimally invasive tools indicates that consumers might have to permit a face scan, or that businesses might use "locally-analyzed and stored biometric information" to signal whether the user is a child or not. See id. at 16 (citing Radesky Decl. ¶ 96); see also Radesky Decl. ¶ 96(b) & n.92 (noting Google's use of facial age-estimation software), ¶ 96(d) (noting businesses receive signals from hardware devices based on "locally-analyzed and stored biometric information" that indicate whether a user is a child). Further, as noted in Professor Goldman's amicus brief, age estimation is in practice quite similar to age verification, and—unless a company relies on user self-reporting of age, which provides little reliability—generally requires either documentary evidence of age or automated estimation based on facial recognition. See Goldman Am. Br. 3-4. 
Such measures would appear to counter the State's interest in increasing privacy protections for children. For these reasons, the State has not met its burden under Central Hudson and thus NetChoice is likely to succeed in showing that the age estimation clause does not satisfy commercial speech scrutiny. See Yim, 63 F.4th at 794 ("[A] statute cannot meaningfully advance the government's stated interests if it contains exceptions that 'undermine and counteract' those goals.") (quoting Rubin, 514 U.S. at 489, 115 S.Ct. 1585).

Although Dr. Radesky states that Google's current system involves facial recognition only by adults who have been placed in "child mode" through a machine-learning analysis, Radesky Decl. ¶ 96(b), there is nothing to suggest that companies would not request all consumers to undergo such a process.

If a business does not estimate age, it must "apply the privacy and data protections afforded to children to all consumers." CAADCA § 31(a)(5). Doing so would clearly advance the government's interest in increasing data and privacy protections for children. NetChoice argues, however, that the effect of this requirement would be to restrain a great deal of protected speech. See Mot. 13-14, Reply 12. The Court is indeed concerned with the potentially vast chilling effect of the CAADCA generally, and the age estimation provision specifically. The State argues that the CAADCA does not prevent any specific content from being displayed to a consumer, even if the consumer is a minor; it only prohibits a business from profiling a minor and using that information to provide targeted content. See, e.g., Opp'n 16. Yet the State does not deny that the end goal of the CAADCA is to reduce the amount of harmful content displayed to children. See id. ("[T]he Act prevents businesses from attempting to increase their profits by using children's data to deliver them things they do not want and have not asked for, such as ads for weight loss supplements and content promoting violence and self-harm."); Def.'s Suppl. Br. 6 ("Children are unable to avoid harmful unsolicited content—including extreme weight loss content and gambling and sports betting ads—directed at them based on businesses' data collection and use practices.").

Putting aside for the moment the issue of whether the government may shield children from such content—and the Court does not question that the content is in fact harmful—the Court here focuses on the logical conclusion that data and privacy protections intended to shield children from harmful content, if applied to adults, will also shield adults from that same content. That is, if a business chooses not to estimate age but instead to apply broad privacy and data protections to all consumers, it appears that the inevitable effect will be to impermissibly "reduce the adult population . . . to reading only what is fit for children." Butler v. Michigan, 352 U.S. 380, 381, 383, 77 S.Ct. 524, 1 L.Ed.2d 412 (1957). And because such an effect would likely be, at the very least, a "substantially excessive" means of achieving greater data and privacy protections for children, see Hunt, 638 F.3d at 717 (citation omitted), NetChoice is likely to succeed in showing that the provision's clause applying the same process to all users fails commercial speech scrutiny.

For these reasons, even accepting the increasing of children's data and privacy protections as a substantial governmental interest, the Court finds that the State has failed to satisfy its burden to justify the age estimation provision as directly advancing the State's substantial interest in protecting the physical, mental, and emotional health and well-being of minors, so that NetChoice is likely to succeed in arguing that the provision fails commercial speech scrutiny. See Junior Sports Mags., 80 F.4th at 1115-16.

(3) High Default Privacy Settings (CAADCA § 31(a)(6))

CAADCA § 31(a)(6) requires covered businesses to "[c]onfigure all default privacy settings provided to children . . . to settings that offer a high level of privacy, unless the business can demonstrate a compelling reason that a different setting is in the best interests of children." The State argues that high privacy settings "demonstrably keep children safe." Def.'s Suppl. Br. 3-4 (citing Radesky Decl. ¶¶ 57-60). The evidence before the Court indicates that lower default privacy settings may quickly lead to individuals perceived as adolescents "receiv[ing] direct messages from accounts they did not follow, including being added to group chats with strangers and contacts from marketers of detrimental material such as pornography and diet products." Radesky Decl. ¶ 59. Accordingly, the Court finds that the State is likely to establish a real harm, as required under commercial speech scrutiny. See Yim, 63 F.4th at 794.

The instant provision, however, does not make clear whether it applies only to privacy settings on accounts created by children—which is the harm discussed in the State's materials, see, e.g., Radesky Decl. ¶ 59—or if it applies, for example, to any child visitor of an online website run by a covered business. NetChoice has provided evidence that uncertainties as to the nature of the compliance required by the CAADCA are likely to cause at least some covered businesses to prohibit children from accessing their services and products altogether. See, e.g., NYT Am. Br. 5-6 (asserting CAADCA requirements that covered businesses consider various potential harms to children would make it "almost certain that news organizations and others will take steps to prevent those under the age of 18 from accessing online news content, features, or services"). Although the State need not show that the Act "employs . . . the least restrictive means" of advancing the substantial interest, the Court finds it likely, based on the evidence provided by NetChoice and the lack of clarity in the provision, that the provision here would serve to chill a "substantially excessive" amount of protected speech to the extent that content providers wish to reach children but choose not to in order to avoid running afoul of the CAADCA. See Hunt, 638 F.3d at 717 (citation omitted). Accordingly, the State has not met its burden under Central Hudson of showing "a reasonable fit between the means and ends of the regulatory scheme," Junior Sports Mags., 80 F.4th at 1119 (quoting Lorillard Tobacco Co. v. Reilly, 533 U.S. 525, 561, 121 S.Ct. 2404, 150 L.Ed.2d 532 (2001)), so that NetChoice is likely to succeed in showing the restriction fails commercial speech scrutiny.

(4) Age-Appropriate Policy Language (CAADCA § 31(a)(7))

The CAADCA next requires covered businesses to "[p]rovide any privacy information, terms of service, policies, and community standards concisely, prominently, and using clear language suited to the age of children likely to access that online service, product, or feature." CAADCA § 31(a)(7). The State argues this provision "protects the safety and well-being of minors" by "giving children the tools to make informed decisions about the services with which they interact." Def.'s Suppl. Br. 4.

The evidence submitted by the State indicates that the harm it seeks to address is a lack of consumer understanding of websites' privacy policies. See id. (citing Egelman Decl.); see also Egelman Decl. ¶ 52. The State has shown that internet users generally do not read privacy policies, and that the reason may be that such policies are often "written at the college level and therefore may not be understood by a significant proportion of the population (much less children)." Egelman Decl. ¶ 27; see id. ¶ 24. The Court notes that the research-based claims in Dr. Egelman's declaration do not appear to be based on studies involving minors and the impact of policy language on their use of online services. See id. at, e.g., ¶¶ 18-19, 24-27, 52.

Even accepting that the manner in which websites present "privacy information, terms of service, policies, and community standards," CAADCA § 31(a)(7), constitutes a real harm to children's well-being because it deters children from implementing higher privacy settings, the State has not shown that the CAADCA's policy language provision would directly advance a solution to that harm. The State points only to a sentence in Dr. Egelman's declaration stating that he "believe[s] the [Act] addresses this issue [of lack of consumer understanding of privacy policies] by requiring the language to be understandable by target audiences (when their online services are likely to be accessed by children)." Egelman Decl. ¶ 52; see Def.'s Suppl. Br. 4 (citing same). Nothing in the State's materials indicates that the policy language provision would materially alleviate a harm to minors caused by current privacy policy language, let alone by the terms of service and community standards that the provision also encompasses. NetChoice is therefore likely to succeed in showing that the provision fails commercial speech scrutiny. See Yim, 63 F.4th at 794.

(5) Internal Policy Enforcement (CAADCA § 31(a)(9))

CAADCA § 31(a)(9) requires covered businesses to "[e]nforce published terms, policies, and community standards established by the business, including, but not limited to, privacy policies and those concerning children." As an initial matter, although the State argues that "businesses have to be accountable for the commitments they make to [ ] consumers" for "children and parents to make informed decisions about the products children access," Def.'s Suppl. Br. 5, the State fails to establish a concrete harm. The State points to Dr. Radesky's declaration, which asserts that "[s]tudies have shown that businesses are not enforcing their privacy policies," "mak[ing] it challenging for consumers to make informed decisions about whether they want to join different online communities [without] knowing whether stated policies and standards will be followed." Radesky Decl. ¶ 93; see Def.'s Suppl. Br. 5. The State has not provided anything remotely nearing a causal link between whether a business consistently follows its "published terms, policies, and community standards"—or even children's difficulty in making better-informed decisions about whether to use online services—and some harm to children's well-being. On this basis alone, NetChoice is likely to succeed in showing that the policy enforcement provision fails commercial speech scrutiny. See Yim, 63 F.4th at 794 (noting the government must show that "the harms it recites are real") (citation omitted).

Further, even if the State is able to show a concrete harm to children's well-being, the provision on its face goes beyond enforcement of policies related to children, or even privacy policies generally. See CAADCA § 31(a)(9) (requiring enforcement of terms "including, but not limited to, privacy policies and those concerning children"). The lack of any attempt at tailoring the proposed solution to a specific harm suggests that the State here seeks to force covered businesses to exercise their editorial judgment in permitting or prohibiting content that may, for instance, violate a company's published community standards. The State argues that businesses have complete discretion to set whatever policies they wish, and must merely commit to following them. See Opp'n 14; Def.'s Suppl. Br. 5. It is that required commitment, however, that flies in the face of a platform's First Amendment right to choose in any given instance to permit one post but prohibit a substantially similar one. See NetChoice v. Fla., 34 F.4th at 1204-05, 1228 (finding content moderation restrictions impinged on business's protected curation of content).

Lastly, the Court is not persuaded by the State's argument that the provision is necessary because there is currently "no law holding online businesses accountable for enforcing their own policies," Def.'s Suppl. Br. 5, as the State itself cites to a Ninth Circuit case permitting a lawsuit to proceed where the plaintiff brought a breach of contract suit against an online platform for failure to adhere to its terms. See id.; Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1108-09 (9th Cir. 2009).

For the multiplicity of reasons described above, the Court finds that the State has not met its burden of justifying the policy enforcement provision, and that NetChoice is therefore likely to succeed in showing that the provision fails commercial speech scrutiny.

(6) Knowingly Harmful Use of Children's Data (CAADCA § 31(b)(1))

As previously noted, CAADCA § 31(a) contains the Act's mandates, and CAADCA § 31(b) enumerates its prohibitions. The first of these prohibitions forbids a covered business from "[using] the personal information of any child in a way that the business knows, or has reason to know, is materially detrimental to the physical health, mental health, or well-being of a child." CAADCA § 31(b)(1).

The Third Circuit's decision in ACLU v. Mukasey is instructive here. In Mukasey, which went up to the Supreme Court twice and was finally decided by the Court of Appeals, the court held that a law prohibiting the transmission of "material that is harmful to minors" was not narrowly tailored because it required evaluation of a wide range of material that was not in fact harmful, and because the law's definition of a "minor" as anyone under 17 years of age would cause "great uncertainty in deciding what minor could be exposed to" the material. ACLU v. Mukasey, 534 F.3d 181, 191, 193 (3d Cir. 2008) (cert. denied). The Third Circuit also rejected the government's affirmative defense that regulated companies could use age verification techniques to achieve greater certainty as to what material was prohibited to a given user. Id. at 196-97.

The CAADCA does not define what uses of information may be considered "materially detrimental" to a child's well-being, and it defines a "child" as a consumer under 18 years of age. See CAADCA § 30. Although there may be some uses of personal information that are objectively detrimental to children of any age, the CAADCA appears generally to contemplate a sliding scale of potential harms to children as they age. See, e.g., Def.'s Suppl. Br. 3, 4 (describing Act's requirements for "age-appropriate" protections). But as the Third Circuit explained, requiring covered businesses to determine what is materially harmful to an "infant, a five-year old, or a person just shy of age seventeen" is not narrowly tailored. Mukasey, 534 F.3d at 191. Although the law in Mukasey was evaluated under a strict scrutiny standard, the Court finds the same concerns apply here, so that the State has not met its burden of showing the instant provision is reasonably tailored to the State's substantial interest. NetChoice has provided evidence that covered businesses might well bar all children from accessing their online services rather than undergo the burden of determining exactly what can be done with the personal information of each consumer under the age of 18. See, e.g., NYT Am. Br. 5-6 (asserting CAADCA requirements that covered businesses consider various potential harms to children would make it "almost certain that news organizations and others will take steps to prevent those under the age of 18 from accessing online news content, features, or services"). The provision at issue would likely "burden substantially more speech than is necessary to further the government's legitimate interests," and therefore NetChoice is likely to succeed in demonstrating that it fails commercial speech scrutiny. See Yim, 63 F.4th at 795-96 (quoting Fox, 492 U.S. at 478, 109 S.Ct. 3028).

(7) Profiling Children by Default (CAADCA § 31(b)(2))

CAADCA § 31(b)(2) prevents a covered business from "[p]rofil[ing] a child by default unless" (1) the business "can demonstrate it has appropriate safeguards in place to protect children" and (2) either of the following conditions is met: (a) the profiling is "necessary to provide the online service, product, or feature requested and only with respect to the aspects of the online service, product, or feature with which the child is actively engaged" or (b) the business can "demonstrate a compelling reason that profiling is in the best interests of children." The State argues this provision protects children's well-being because businesses commonly profile children by default and place them into target audience categories for products related to harmful content such as smoking, gambling, alcohol, or extreme weight loss. Def.'s Suppl. Br. 5-6; Radesky Decl. ¶ 66. The Court accepts the State's assertion of a concrete harm to children's well-being, i.e., the use of profiling to advertise harmful content to children, and turns to the issue of tailoring.

NetChoice has provided evidence indicating that profiling and subsequent targeted content can be beneficial to minors, particularly those in vulnerable populations. For example, LGBTQ+ youth—especially those in more hostile environments who turn to the internet for community and information—may have a more difficult time finding resources regarding their personal health, gender identity, and sexual orientation. See Amicus Curiae Br. of Chamber of Progress, IP Justice, & LGBT Tech Inst. ("LGBT Tech Am. Br."), ECF 42-1, at 12-13. Pregnant teenagers are another group of children who may benefit greatly from access to reproductive health information. Id. at 14-15. Even aside from these more vulnerable groups, the internet may provide children—like any other consumer—with information that may lead to fulfilling new interests that the consumer may not have otherwise thought to search out. The provision at issue appears likely to discard these beneficial aspects of targeted information along with harmful content such as smoking, gambling, alcohol, or extreme weight loss.

The State argues that the provision is narrowly tailored to "prohibit[ ] profiling by default when done solely for the benefit of businesses, but allows it . . . when in the best interest of children." Def.'s Suppl. Br. 6. But as amici point out, what is "in the best interest of children" is not an objective standard but rather a contentious topic of political debate. See LGBT Tech Am. Br. 11-14. The State further argues that children can still access any content online, such as by "actively telling a business what they want to see in a recommendations profile - e.g., nature, dance videos, LGBTQ+ supportive content, body positivity content, racial justice content, etc." Radesky Decl. ¶ 89(b). By making this assertion, the State acknowledges that there are wanted or beneficial profile interests, but that the Act, rather than prohibiting only certain targeted information deemed harmful (which would also face First Amendment concerns), seeks to prohibit likely beneficial profiling as well. NetChoice's evidence, which indicates that the provision would likely prevent the dissemination of a broad array of content beyond that which is targeted by the statute, defeats the State's showing on tailoring, and the Court accordingly finds that the State has not met its burden of establishing that the profiling provision directly advances the State's interest in protecting children's well-being. NetChoice is therefore likely to succeed in showing that the provision does not satisfy commercial speech scrutiny. See Yim, 63 F.4th at 794 (noting regulation that burdens substantially more speech than is necessary or undermines and counteracts the state's interest fails commercial speech scrutiny).

(8) Restriction on Collecting, Selling, Sharing, and Retaining Children's Data (CAADCA § 31(b)(3))

CAADCA § 31(b)(3) states that a covered business shall not "[c]ollect, sell, share, or retain any personal information that is not necessary to provide an online service, product, or feature with which a child is actively and knowingly engaged . . . unless the business can demonstrate a compelling reason that [such an action] is in the best interests of children likely to access the online service, product, or feature." The State argues that "[e]xcessive data collection and use undoubtedly harms children" because children are "unable to avoid harmful unsolicited content—including extreme weight loss content and gambling and sports betting ads—directed at them" due to the data collection. Def.'s Suppl. Br. 6. As with the previous provision prohibiting profiling, this restriction throws out the baby with the bathwater. In seeking to prevent children from being exposed to "harmful unsolicited content," the Act would restrict neutral or beneficial content, rendering the restriction poorly tailored to the State's goal of protecting children's well-being. And—in light of the State's admission that it seeks to prevent children from consuming particular content—the Court emphasizes that the compelling and laudable goal of protecting children does not permit the government to shield children from harmful content by enacting greatly overinclusive or underinclusive legislation. See, e.g., Brown v. Ent. Merchants Ass'n, 564 U.S. 786, 802-04, 131 S.Ct. 2729, 180 L.Ed.2d 708 (2011) (holding California law prohibiting sale or rental of violent video games to minors failed strict scrutiny). For the same reasons described above, see supra, at Part III(A)(1)(a)(iv)(7), NetChoice is likely to succeed in showing that CAADCA § 31(b)(3) fails commercial speech scrutiny.

(9) Unauthorized Use of Children's Personal Information (CAADCA § 31(b)(4))

CAADCA § 31(b)(4) prohibits a covered business from using a child's "personal information for any reason other than a reason for which that personal information was collected, unless the business can demonstrate a compelling reason that use of the personal information is in the best interests of children." The State clarifies this fairly circular restriction with an example: "a business that uses a child's IP address solely to provide access to its platform cannot also use the IP address to sell ads." Def.'s Suppl. Br. 6. However, the State provides no evidence of a harm to children's well-being from the use of personal information for multiple purposes. See id. To the extent the harm is the same profiling concern discussed in the prior two sections, the State has not met its burden to show that the instant provision is not similarly overbroad. See supra, at Parts III(A)(1)(a)(iv)(7)-(8). Because the State has not established a real harm that the provision materially alleviates, NetChoice will likely succeed in showing that the provision fails commercial speech scrutiny. See Yim, 63 F.4th at 794.

(10) Use of Dark Patterns (CAADCA § 31(b)(7))

The last CAADCA provision challenged by NetChoice prohibits the "[u]se [of] dark patterns to lead or encourage children to provide personal information beyond what is reasonably expected to provide that online service, product, or feature[,] to forego privacy protections, or to take any action that the business knows, or has reason to know, is materially detrimental to the child's physical health, mental health, or well-being." CAADCA § 31(b)(7). Dark patterns are design features that "nudge" individuals into making certain decisions, such as spending more time on an application. Def.'s Suppl. Br. 7; see also Opp'n 9 (describing dark patterns as "interfaces designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice"); Radesky Decl. ¶ 54 ("[D]esign features that manipulate or nudge the user in a way that meets the technology developer's best interests - at the expense of the user's interests (i.e., time, money, sleep) - have been termed 'dark patterns.' "). The State argues that businesses use dark patterns to "nudge children into making decisions that are advantageous to businesses," and that "dark patterns can make it difficult or impossible for children to avoid harmful content." Def.'s Suppl. Br. 7. NetChoice contends that the term "dark patterns" has also been "construed by scholars to reach commonplace features that simplify and improve user experience, such as standard 'autoplay' and 'newsfeed' functions that recommend personalized content." Mot. 6 (citation omitted).

The instant provision can be analytically divided into three parts. It first prohibits the use of dark patterns to encourage children to "provide personal information beyond what is reasonably expected to provide that online service, product, or feature." CAADCA § 31(b)(7). This prohibition is similar to the profiling restrictions discussed above in that (1) the State has not shown a harm resulting from the provision of more personal information "beyond what is reasonably expected" for the covered business to provide its online service, product, or feature, and (2) to the extent the harm is the use of profiling information to present harmful content to a child, the State has not shown that the instant provision is sufficiently tailored to survive commercial speech scrutiny. See supra, at Parts III(A)(1)(a)(iv)(7)-(9).

Second, the provision prohibits the use of dark patterns to encourage a child to "forego privacy protections." CAADCA § 31(b)(7). However, the State has not shown that dark patterns causing children to forego privacy protections constitutes a real harm. See Yim, 63 F.4th at 794. Many of the examples of dark patterns cited by the State's experts—such as making it easier to sign up for a service than to cancel it or creating artificial scarcity by using a countdown timer, Egelman Decl. ¶ 51, or sending users notifications to reengage with a game or auto-advancing users to the next level in a game, Radesky Decl. ¶ 55—are not causally connected to an identified harm. See Brown, 564 U.S. at 799, 131 S.Ct. 2729 (finding lack of "direct causal link between violent video games and harm to minors" showed government had not identified "actual problem in need of solving," so that law failed strict scrutiny); Yim, 63 F.4th at 794 (noting commercial speech scrutiny requires government to show "the harms it recites are real").

The most concrete potential harm the Court can find is in Dr. Radesky's assertion that "[m]anipulative dark patterns are known to cause monetary harm to children," based on a March 2023 FTC complaint requiring a game developer to pay $245 million "as a penalty for the use of dark patterns to manipulate users into making purchases." Radesky Decl. ¶ 56. The State does not, however, suggest that the CAADCA is an attempt to address monetary harms to children. See generally Opp'n; Def.'s Suppl. Br. Similarly, although the State points to an existing federal law limiting the practice of making it inconvenient for users to prevent their data from being sold or shared, see Def.'s Suppl. Br. 7 (citing 16 CFR § 312.7), the State does not show how this law indicates a harm to minors caused by the sale of personal information. See generally id.; Radesky Decl.; Egelman Decl. To the extent the harm is the use of data to profile users, including children, the State has not shown that the provision is appropriately tailored to survive commercial speech scrutiny for the same reasons described above. See supra, at Parts III(A)(1)(a)(iv)(7)-(9). The Court accordingly finds that the State is not likely to show a harm in dark patterns causing children to forego privacy protections, so that NetChoice is likely to succeed in showing that this restriction fails commercial speech scrutiny. See Junior Sports Mags., 80 F.4th at 1119-20 (reversing denial of preliminary injunction and reasoning that "[i]n the end, California spins a web of speculation—not facts or evidence—to claim that its restriction on speech will significantly curb" an alleged harm).

The last of the three prohibitions of CAADCA § 31(b)(7) concerns the use of dark patterns to "take any action that the business knows, or has reason to know, is materially detrimental" to a child's well-being. The State here argues that dark patterns cause harm to children's well-being, such as when a child recovering from an eating disorder "must both contend with dark patterns that make it difficult to unsubscribe from such content and attempt to reconfigure their data settings in the hope of preventing unsolicited content of the same nature." Def.'s Suppl. Br. 7; see also Amicus Curiae Br. of Fairplay & Public Health Advocacy Inst. ("Fairplay Am. Br.") 4 (noting that CAADCA "seeks to shift the paradigm for protecting children online," including by "ensuring that children are protected from manipulative design (dark patterns), adult content, or other potentially harmful design features.") (citation omitted), ECF 53-1. The Court is troubled by the "has reason to know" language in the Act, given the lack of objective standard regarding what content is materially detrimental to a child's well-being. See supra, at Part III(A)(1)(a)(iv)(7). And some content that might be considered harmful to one child may be neutral at worst to another. NetChoice has provided evidence that in the face of such uncertainties about the statute's requirements, the statute may cause covered businesses to deny children access to their platforms or content. See NYT Am. Br. 5-6. Given the other infirmities of the provision, the Court declines to wordsmith it and excise various clauses, and accordingly finds that NetChoice is likely to succeed in showing that the provision as a whole fails commercial speech scrutiny.

iii. Conclusion re Commercial Speech Scrutiny

For the foregoing reasons, the Court finds that NetChoice is likely to succeed in showing that the CAADCA's challenged mandates and prohibitions fail commercial speech scrutiny and therefore are invalid.

e. Severability

NetChoice argues that the CAADCA must be enjoined in its entirety because the challenged provisions of the CAADCA—which are likely invalid—cannot be severed from the Act's remaining prohibitions and mandates, or from other provisions related to the CAADCA's application, penalties, and compliance. Pl.'s Suppl. Br. 6-7 (discussing CAADCA § 31(a)(8), 31(a)(10), 31(b)(5)-(6), 31(b)(8), 32, 33, and 35). The State argues that almost every provision is severable, and urges the Court to sustain any provisions not found invalid. Def.'s Suppl. Br. 2.

"Severability is a matter of state law." Sam Francis Found. v. Christies, Inc., 784 F.3d 1320, 1325 (9th Cir. 2015) (quoting Leavitt v. Jane L., 518 U.S. 137, 139, 116 S.Ct. 2068, 135 L.Ed.2d 443 (1996)) (alterations omitted). Under California law, the severability of the invalid parts of a statute depends on whether such provisions are grammatically, functionally, and volitionally severable from the valid remainder. See Calfarm Ins. Co. v. Deukmejian, 48 Cal. 3d 805, 821-22, 258 Cal.Rptr. 161, 771 P.2d 1247 (1989) (en banc). Putting aside the CAADCA provisions setting forth the statute's title, findings, and definitions, CAADCA §§ 28-30, the valid remainder of the statute involve: restrictions on monitoring children's online behavior and tracking location, CAADCA § 31(a)(8); the provision of responsive tools for children to exercise their privacy rights and report concerns, id. § 31(a)(10); the collection of precise geolocation data, id. §§ 31(b)(5)-(6); the use of age-estimation information, id. § 31(b)(8); the creation of a working group to deliver a report on best practices under the CAADCA, id. § 32; the July 1, 2024 deadline for covered businesses to complete DPIA reports, id. § 33; and the penalties for violations of the CAADCA, id. § 35. See Pl.'s Suppl. Br. 6-7.

The Court first notes that there is no severability clause in the CAADCA that would create a presumption in favor of "sustaining the valid part" of the statute. See Garcia, 11 F.4th at 1120 (citing Cal. Redevelopment Ass'n v. Matosantos, 53 Cal. 4th 231, 270, 135 Cal.Rptr.3d 683, 267 P.3d 580 (2011)). Turning to the question of functional severability, the Court finds dispositive the status of the DPIA provisions. As noted by NetChoice, the CAADCA provides that the State shall not initiate an action for any violation of the statute without providing written notice to a covered business identifying specific provisions of the Act that are alleged to have been violated. CAADCA § 35(c); see Pl.'s Suppl. Br. 7. The Court's determination that NetChoice is likely to succeed in showing that the DPIA report requirement is invalid, see supra, at Part III(A)(1)(d)(ii)(1), similarly renders likely invalid a condition precedent for enforcement of the remainder of the statute. Because the CAADCA is not capable of "separate enforcement" without the DPIA requirement, the DPIA provisions are not functionally severable from the otherwise valid portions of the statute. People's Advocate, Inc. v. Super. Ct., 181 Cal. App. 3d 316, 332, 226 Cal.Rptr. 640 (1986) ("The remaining provisions must stand on their own, unaided by the invalid provisions nor rendered vague by their absence nor inextricably connected to them by policy considerations. They must be capable of separate enforcement.").

Although the Court need not review the severability of any other provision in light of the DPIA report requirement's impact on the entire CAADCA, it notes that the age estimation provision, CAADCA § 31(a)(5), is the linchpin of most of the CAADCA's provisions, which specify various data and privacy protections for children. See id. §§ 31(a)(6), (b)(1)-(8). The State concedes only that CAADCA § 31(b)(8)—which prevents the use of personal information collected to estimate age for any other purpose—is rendered obsolete if the age estimation provision is deemed unconstitutional. Def.'s Suppl. Br. 3. However, compliance with the CAADCA's requirements would appear to generally require age estimation to determine whether each user is in fact under 18 years old. The age estimation provision is thus also not functionally severable from the remainder of the statute. See People's Advocate, 181 Cal. App. 3d at 332, 226 Cal.Rptr. 640.

The futility of severance is apparent when one considers the outcome if the Court were to preliminarily enjoin only the challenged provisions that NetChoice has shown are likely violative of the First Amendment. The Act would consist of the provisions setting forth the statute's title, findings, and definitions; two mandates; three prohibitions; and provisions establishing a working group, DPIA report deadlines, and penalties for violating the Act. See CAADCA §§ 28-30, 31(a)(8), 31(a)(10), 31(b)(5)-(6), 31(b)(8), 32-33, 35. The DPIA report deadline, id. § 33, is meaningless without a DPIA report requirement. Five of the six required recommendations of the working group track provisions of the Act that are likely invalid. See id. § 32(d)(1)-(5). Further, even the State agrees that one of the three remaining prohibitions—that on collecting age estimation data, id. § 31(b)(8)—"would be made obsolete" in the absence of § 31(a)(5), which NetChoice has shown is likely invalid. Def.'s Suppl. Br. 3. Accordingly, the only meat left of the Act would be four unchallenged mandates and prohibitions that together would require covered businesses to provide children with obvious tracking signals and prominent and responsive tools to exercise their privacy rights, and to refrain from collecting children's precise geolocation data. See CAADCA §§ 31(a)(8), 31(a)(10), 31(b)(5)-(6). All of these provisions require businesses to know their users' ages, but the Court has found NetChoice will likely succeed in showing the age estimation provision does not pass commercial speech scrutiny. And none of the provisions can be enforced without the penalty provision, id. § 35, which, as described above, is hamstrung if the State cannot determine whether a covered business is in substantial compliance with the likely-invalid DPIA report requirement. These interdependencies indicate how intertwined the challenged provisions are with the valid remainder, and thus how inseverable they are from it.

Given that multiple provisions of the CAADCA will be preliminarily enjoined by this order, and the Court's determination that these provisions are not functionally severable from the presumably valid remainder of the statute, the Court concludes that it cannot sever the likely invalid portions from the statute and sustain the remainder. See Acosta v. City of Costa Mesa, 718 F.3d 800, 820 (9th Cir. 2013) (refusing to "rewrit[e] the ordinance in order to save it") (internal alterations and citation omitted).

f. Conclusion re First Amendment Arguments (Claims 1 and 3)

Based on the foregoing, the Court concludes that NetChoice has demonstrated a likelihood of success on Claim 1, which asserts that the CAADCA violates the First Amendment because the Act's "speech restrictions . . . fail strict scrutiny and also would fail a lesser standard of scrutiny." Compl. ¶ 82. As noted above, see supra, at Part III(A)(1), the Court need not and does not here address NetChoice's likelihood of success on its allegations of additional First Amendment violations in Claims 1 and 3.

2. Other Claims

NetChoice has demonstrated a likelihood of success on the merits of Claim 1 brought under the First Amendment and, as discussed below, has satisfied the remaining Winter factors with respect to Claim 1. NetChoice is entitled to preliminary injunctive relief on that basis. Under these circumstances, the Court must determine whether it is necessary or advisable to address the likelihood of success of NetChoice's other claims for relief at this time: Claim 4, asserting that the CAADCA violates the dormant Commerce Clause; Claim 5, asserting that the CAADCA is preempted by COPPA; and Claim 6, asserting that the CAADCA is preempted by Section 230.

Once a plaintiff demonstrates that a preliminary injunction is warranted based on the likelihood of success on one claim, district courts in this circuit generally do not consider whether the same injunctive relief could be granted based on other claims. See, e.g., Shawarma Stackz LLC v. Jwad, No. 21-CV-01263-BAS-BGS, 2021 WL 5827066, at *19 (S.D. Cal. Dec. 8, 2021) ("The Court need not reach the merits of the remaining state torts claims that SSL raises because the Lanham Act claim and the UCL claim are sufficient to sustain a preliminary injunction."); Seiko Epson Corp. v. Nelson, No. 5:21-cv-00320-JWH-SPx, 2021 WL 5033486, at *3 (C.D. Cal. Mar. 31, 2021) ("The Court therefore finds that Plaintiffs have demonstrated a likelihood of success on the merits with respect to their first claim for relief. Plaintiffs have thus satisfied the preliminary injunction standard; the Court need not analyze Plaintiffs' other two claims for relief."); Faison v. Jones, 440 F. Supp. 3d 1123, 1136 n.3 (E.D. Cal. 2020) ("Because the Court finds Plaintiffs are likely to succeed on the merits of their viewpoint discrimination theory, the Court need not and does not address Plaintiffs' remaining theories."); Medina v. Becerra, No. 3:17-CV-03293 CRB, 2017 WL 5495820, at *12 (N.D. Cal. Nov. 16, 2017) ("As Medina has shown a likelihood of success on the merits for his First Amendment claim, this Court need not address Medina's other claims for relief."). This Court sees no reason to depart from the approach adopted by other district courts in the Ninth Circuit.

Deferring consideration of NetChoice's Commerce Clause claim is particularly appropriate here, because the claim presents thorny constitutional issues that the parties briefed prior to receiving the Supreme Court's latest guidance in Nat'l Pork Producers Council v. Ross, 598 U.S. 356, 143 S.Ct. 1142, 215 L.Ed.2d 336 (2023). Ross provides a comprehensive review of case law on the dormant Commerce Clause, emphasizing that "the Commerce Clause prohibits the enforcement of state laws driven by economic protectionism—that is, regulatory measures designed to benefit in-state economic interests by burdening out-of-state competitors," and clarifying that this "antidiscrimination principle lies at the 'very core' of the Court's dormant Commerce Clause jurisprudence." Id. (quotation marks, alterations, and citation omitted). The decision may call into question the dormant Commerce Clause's application where, as here, the state law at issue does not discriminate against out-of-state competitors but does have an extraterritorial effect. Ross observes that "[i]n our interconnected national marketplace, many (maybe most) state laws have the 'practical effect of controlling' extraterritorial behavior," and concludes that extraterritorial effects alone are insufficient to implicate the dormant Commerce Clause. See id. at 1156-57. In the Court's view, it would be imprudent to engage in an analysis of NetChoice's dormant Commerce Clause claim where such analysis is unnecessary to a ruling on the present motion and the Court does not have the benefit of the parties' views on the impact of Ross.

The motion and opposition were filed before Ross issued. The reply was filed approximately one week after Ross was decided, and Ross is cited once therein as secondary authority for an assertion made in the brief. See Reply 13.

With respect to NetChoice's preemption claims, the Court's initial view is that neither would support the requested preliminary injunction. Claim 5 asserts that the CAADCA is preempted by COPPA, which contains a preemption clause providing, "No State or local government may impose any liability for commercial activities or actions by operators in interstate or foreign commerce in connection with an activity or action described in this chapter that is inconsistent with the treatment of those activities or actions under this section." 15 U.S.C.A. § 6502(d) (emphasis added). NetChoice claims that the CAADCA is "inconsistent" with COPPA in the following respects: the CAADCA applies broadly to services "likely to be accessed" by children, whereas COPPA applies only to online services "directed" to children; the CAADCA imposes privacy obligations that are not required by COPPA; and the CAADCA imposes substantive obligations that far exceed those imposed by COPPA. See id. ¶¶ 114-16. NetChoice additionally claims that the statutes are inconsistent because the CAADCA prohibits conduct that is permitted under COPPA, including profiling a child by default and using dark patterns to encourage children to provide personal information. See id. ¶ 117.

The Ninth Circuit recently held in Jones v. Google LLC, 73 F.4th 636, 642 (9th Cir. 2023), that a state law is not "inconsistent" with COPPA for preemption purposes unless the state law contains requirements that contradict those of COPPA or "stand as obstacles to federal objectives" embodied in COPPA. A state law that supplements or requires the same thing as COPPA is not inconsistent with COPPA. See id. In the Court's view, it is not clear that the cited provisions of the CAADCA contradict, rather than supplement, those of COPPA. Nor is it clear that the cited provisions of the CAADCA would stand as an obstacle to enforcement of COPPA. An online provider might well be able to comply with the provisions of both the CAADCA and COPPA, with the possible exception of the CAADCA provisions identified in paragraph 117 of the complaint. However, a determination whether those provisions are inconsistent with COPPA for preemption purposes would require a careful and nuanced analysis. It would make little sense to engage in such analysis at this stage of the proceedings in light of the fact that NetChoice is entitled to the requested injunctive relief based on its First Amendment claims.

Claim 6 asserts that the CAADCA is preempted by Section 230. Section 230 "protects certain internet-based actors from certain kinds of lawsuits." Barnes, 570 F.3d at 1099. As relevant here, Section 230(c)(1) provides that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." 47 U.S.C. § 230(c)(1). Section 230(c)(2) provides that "[n]o provider or user of an interactive computer service shall be held liable on account of . . . any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be . . . objectionable[.]" 47 U.S.C. § 230(c)(2)(A). NetChoice contends that the CAADCA's requirement that online providers enforce their "published terms, policies, and community standards," CAADCA § 31(a)(9), and restrictions on the use of minors' personal information, CAADCA § 31(b)(1), (3), (4), (7), are inconsistent with Section 230. NetChoice claims that those inconsistencies result in preemption of the CAADCA under § 230(e), which provides that "[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section." 47 U.S.C. § 230(e)(3). Section 230 may be implicated by an online provider's enforcement of its policies and other acts in compliance with the CAADCA, but it is difficult (if not impossible) to make that determination without knowing what policies or acts are at issue. For that reason, it is the Court's view that a facial challenge to the CAADCA is not the appropriate context in which to consider the applicability of § 230.

Accordingly, the Court need not and does not determine whether NetChoice is likely to succeed on the merits of its claims grounded in the dormant Commerce Clause, COPPA, and Section 230. The Court limits its consideration of the remaining Winter factors to Claim 1 under the First Amendment, namely, irreparable harm, the balance of equities, and the public interest.

B. Irreparable Harm

"The loss of First Amendment freedoms, for even minimal periods of time, unquestionably constitutes irreparable injury." Elrod v. Burns, 427 U.S. 347, 373, 96 S.Ct. 2673, 49 L.Ed.2d 547 (1976); see also Baird, 81 F.4th at 1040-41. Loss of free speech rights resulting from a threat of enforcement rather than actual enforcement constitutes irreparable harm. See Cuviello v. City of Vallejo, 944 F.3d 816, 833 (9th Cir. 2019). Consequently, "[i]rreparable harm is relatively easy to establish in a First Amendment case." CTIA - The Wireless Ass'n v. City of Berkeley, 928 F.3d 832, 851 (9th Cir. 2019). "[A] party seeking preliminary injunctive relief in a First Amendment context can establish irreparable injury . . . by demonstrating the existence of a colorable First Amendment claim." Id. (quotation marks and citation omitted). As discussed above, NetChoice has done more than merely assert a colorable First Amendment claim; it has established a likelihood of success on the merits of its claim that the CAADCA violates the First Amendment.

The Court finds unpersuasive the State's argument that the threat of enforcement is insufficient to establish irreparable injury because the Act's challenged provisions do not take effect until July 1, 2024. That date is less than a year away. "One does not have to await the consummation of threatened injury to obtain preventive relief. If the injury is certainly impending, that is enough." Pac. Gas & Elec. Co. v. State Energy Res. Conservation & Dev. Comm'n, 461 U.S. 190, 201, 103 S.Ct. 1713, 75 L.Ed.2d 752 (1983) (citation omitted). Moreover, NetChoice presents evidence that businesses already are expending time and funds preparing for enforcement of the CAADCA. See Roin Decl. ¶¶ 20, 24-25; Cairella Decl. ¶¶ 14, 19-22; Masnick Decl. ¶¶ 12, 14-19; Paolucci Decl. ¶¶ 16-18; Szabo Decl. ¶¶ 5-7, 12-17. Requiring businesses to proceed with such preparations without knowing whether the CAADCA is valid "would impose a palpable and considerable hardship" on them. See Pac. Gas & Elec., 461 U.S. at 201-02, 103 S.Ct. 1713 ("To require the industry to proceed without knowing whether the moratorium is valid would impose a palpable and considerable hardship on the utilities[.]").

The Court has no difficulty finding that NetChoice has established a likelihood of irreparable harm absent issuance of the requested preliminary injunction.

C. Balance of Equities / Public Interest

"Where the government is a party to a case in which a preliminary injunction is sought, the balance of the equities and public interest factors merge." Roman v. Wolf, 977 F.3d 935, 940-41 (9th Cir. 2020); see also Baird, 81 F.4th at 1039-40. As discussed above, NetChoice has demonstrated a likelihood of success in proving that the CAADCA violates the First Amendment. "[I]t is always in the public interest to prevent the violation of a party's constitutional rights." Melendres v. Arpaio, 695 F.3d 990, 1002 (9th Cir. 2012) (quotation marks and citation omitted). Moreover, the State "cannot reasonably assert that it is harmed in any legally cognizable sense by being enjoined from constitutional violations." Zepeda v. U.S. I.N.S., 753 F.2d 719, 727 (9th Cir. 1983).

The State cites Maryland v. King, 567 U.S. 1301, 1303, 133 S.Ct. 1, 183 L.Ed.2d 667 (2012), for the proposition that "[a]ny time a State is enjoined by a court from effectuating statutes enacted by representatives of its people, it suffers a form of irreparable injury." King did not involve a motion for preliminary injunction, but rather Maryland's application for a stay of a state appellate court's decision overturning King's rape conviction pending disposition of Maryland's petition for writ of certiorari. See id. at 1301, 133 S.Ct. 1. The state appellate court had determined that Maryland's DNA collection statute, which had authorized law enforcement officers to collect King's DNA sample, violated the Fourth Amendment. See id. The Supreme Court found that a stay was warranted based on its determination that there was a reasonable probability it would grant certiorari. See id. at 1302, 133 S.Ct. 1. It was in that context that the Supreme Court discussed the harm to the State of Maryland flowing from its inability to effectuate its DNA collection statute. See id. at 1303, 133 S.Ct. 1. The quoted language has no application here, where (unlike the State of Maryland) the State of California has not made a showing that the challenged statute passes constitutional muster.

The Court finds that NetChoice has established that the last two factors, the balance of equities and the public interest, favor issuance of the requested injunction.

D. Conclusion

In conclusion, the Court finds that all of the Winter factors favor granting the requested preliminary injunction. With respect to the first and most important factor, likelihood of success on the merits, NetChoice has demonstrated that it is likely to succeed on at least one of its First Amendment theories set forth in Claim 1 of the complaint. NetChoice also has satisfied the second factor by demonstrating a likelihood that it will suffer irreparable injury if the requested preliminary injunction does not issue. Finally, NetChoice has satisfied the third and fourth factors by showing that the balance of the equities and the public interest favor issuance of the requested preliminary injunction.

"If a movant makes a sufficient demonstration on all four Winter factors (three when as here the third and fourth factors are merged), a court must not shrink from its obligation to enforce his constitutional rights, regardless of the constitutional right at issue." Baird, 81 F.4th at 1041 (quotation marks, citation, and brackets omitted). "It may not deny a preliminary injunction motion and thereby allow constitutional violations to continue simply because a remedy would involve intrusion into an agency's administration of state law." Id. (quotation marks and citation omitted).

NetChoice's motion for preliminary injunction is GRANTED.

E. Security

Federal Rule of Civil Procedure 65(c) provides that "[t]he court may issue a preliminary injunction or a temporary restraining order only if the movant gives security in an amount that the court considers proper to pay the costs and damages sustained by any party found to have been wrongfully enjoined or restrained." Fed. R. Civ. P. 65(c). The Ninth Circuit has "recognized that Rule 65(c) invests the district court with discretion as to the amount of security required, if any." Jorgensen v. Cassiday, 320 F.3d 906, 919 (9th Cir. 2003) (internal quotation marks and citation omitted) (italics in original). Thus, the district court has discretion to dispense with the filing of a bond altogether, or to require only a nominal bond. See id. ("The district court may dispense with the filing of a bond when it concludes there is no realistic likelihood of harm to the defendant from enjoining his or her conduct."); see also Save Our Sonoran, Inc. v. Flowers, 408 F.3d 1113, 1126 (9th Cir. 2005) ("The district court has discretion to dispense with the security requirement, or to request mere nominal security, where requiring security would effectively deny access to judicial review.") (citation omitted).

Neither party addresses the issue of security in its briefing. NetChoice's proposed order, filed with its motion for preliminary injunction, provides that the requested injunctive relief will issue without the requirement of any security bond because NetChoice has shown a likelihood of success and the State will not suffer any harm from maintaining the status quo. See Proposed Order, ECF 29-31. The State argues, as a reason to deny injunctive relief altogether, that issuance of the injunction "would inflict irreparable harm upon California by preventing enforcement of a statute enacted by representatives of the people." Opp'n at 30. The State's argument gives no indication, however, whether the State believes a bond should be required in the event a preliminary injunction issues, or the appropriate amount of such bond. See id.

The Court finds it appropriate to issue the preliminary injunction without requiring security based on NetChoice's showing that it is likely to prevail on its claim that enforcement of the CAADCA violates the First Amendment—and thus could not be lawfully enforced by the State—and the absence of any argument that a security bond should be required.

IV. ORDER

(1) Plaintiff NetChoice's motion for preliminary injunction is GRANTED as follows:

(a) Rob Bonta, Attorney General of the State of California, and anyone acting in concert with his office are ENJOINED from enforcing the California Age-Appropriate Design Code Act;

(b) This preliminary injunction shall issue without the requirement of a security bond; and

(c) This preliminary injunction shall take effect immediately and shall remain in effect until otherwise ordered by the Court.

(2) This order terminates ECF 29.

