Opinion
2:23-cv-00911-RJS-CMR
07-22-2024
Cecilia M. Romero Magistrate Judge
MEMORANDUM DECISION AND ORDER GRANTING DEFENDANTS' MOTION TO DISMISS
ROBERT J. SHELBY UNITED STATES CHIEF DISTRICT JUDGE
This case arises out of Plaintiff NetChoice, LLC's challenge to the Utah Minor Protection in Social Media Act (the Act). Among other causes of action, NetChoice claims Section 230 of the Communications Decency Act (CDA) preempts certain provisions of the Act. Now before the court is Defendants Sean D. Reyes and Katherine Hass' Motion to Dismiss that claim. For the reasons explained below, Defendants' Motion is GRANTED.
Utah Code §§ 13-71-101 to 401.
Dkt. 59, Defendants' Motion to Dismiss for Failure to State a Claim and Memorandum in Support (Motion to Dismiss).
Because this case is before the court on a motion to dismiss, it accepts as true all well-pleaded factual allegations contained in NetChoice's First Amended Complaint. See Bell Atl. Corp. v. Twombly, 550 U.S. 544, 555 (2007).
A. The Parties
Plaintiff NetChoice is a District of Columbia nonprofit trade association for internet companies. The Act regulates several NetChoice members, including: (1) Dreamwidth; (2) Google, which owns and operates YouTube; (3) Meta, which owns and operates Facebook and Instagram; (4) Nextdoor; (5) Pinterest; (6) Snap Inc., which owns and operates Snapchat; and (7) X.
Dkt. 51, Plaintiff's First Amended Complaint (FAC) ¶ 8. A complete list of NetChoice members can be found at https://perma.cc/GD5W-JYV6.
Id. ¶ 11; see also Utah Code § 13-71-101. The Act does not regulate all NetChoice members but, for purposes of this Order, the court follows NetChoice's labeling convention and refers to entities subject to the Act as “members.”
Defendants are Katherine Hass and Sean D. Reyes, both sued in their official capacity. Hass is Director of the Division of Consumer Protection of the Utah Department of Commerce (the Division). The Act grants enforcement authority to the Division and its Director. Reyes is the Attorney General of Utah. He has authority to “give legal advice to, and act as counsel for, the [D]ivision in the exercise of the [D]ivision's responsibilities.”
Id. ¶¶ 14-15.
Id. ¶ 15.
Id. (citing Utah Code § 13-71-301).
Id. ¶ 14.
Id. (quoting Utah Code § 13-71-301(4)(b)).
The court summarizes only those facts relevant to resolving Defendants' Motion to Dismiss.
In March 2024, Utah enacted Senate Bill 194, the Utah Minor Protection in Social Media Act. The Act partially replaced Utah's Social Media Regulation Act of 2023, which the State repealed after NetChoice filed a lawsuit challenging its constitutionality, and largely takes effect on October 1, 2024.
FAC ¶ 38.
Id. ¶¶ 38-39; see also Dkt. 1, Complaint for Declaratory and Injunctive Relief.
FAC ¶ 38.
The Act regulates Utah minors' access to and use of social media by imposing various requirements on covered “social media companies.” For example, the Act requires covered websites to “implement an age assurance system,” “limit the Utah minor account holder's ability to share content to only connected accounts,” and imposes data collection and use restrictions on covered entities. Relevant here, the Act also prohibits covered websites from disseminating content on minors' accounts in particular ways by requiring social media companies to “disable” three “features”:
The Act defines a “social media company” as any “entity that owns or operates a social media service.” Utah Code § 13-71-101(13). A “social media service” is any “public website or application that:”
(i) displays content that is primarily generated by account holders and not by the social media company;
(ii) permits an individual to register as an account holder and create a profile that is made visible to the general public or a set of other users defined by the account holder;
(iii) connects account holders to allow users to interact socially with each other within the website or application;
(iv) makes available to each account holder a list or lists of other account holders with whom the account holder shares a connection within the system; and
(v) allows account holders to post content viewable by other users. Id. § 13-71-101(14)(a). The term “[s]ocial media service” excludes “(i) email; (ii) cloud storage; or (iii) document viewing, sharing, or collaboration services.” Id. § 13-71-101(14)(b). See also FAC ¶¶ 1-6, 41.
FAC ¶ 42 (quoting Utah Code § 13-71-201).
Id. ¶ 52 (quoting Utah Code § 13-71-202(1)(b)).
Id. ¶ 58 (citing Utah Code §§ 13-71-202(1)(c), 13-71-204(2)-(4)).
• autoplay functions that continuously play content without user interaction;
• scroll or pagination that loads additional content as long as the user continues scrolling; and
• push notifications prompting repeated user engagement.
NetChoice refers to this feature as “seamless pagination.” Id. ¶ 47.
Id. (quoting Utah Code § 13-71-202(5)). The Act defines a “push notification” as “an automatic electronic message displayed on an account holder's device, when the user interface for the social media service is not actively open or visible on the device, that prompts the account holder to repeatedly check and engage with the social media service.” Utah Code § 13-71-101(11).
NetChoice members use these features to disseminate and display their users' speech and expression. For example, websites utilize autoplay because “some expression lends itself to being viewed in sequence.” Likewise, seamless pagination, particularly on mobile devices, “is an effective way to display and view the enormous amount of content on many covered websites.” And notifications allow “websites to inform users about things like recommended content, relevant announcements, and suspicious logins to their accounts.”
Id. ¶ 49.
Id.
Id. ¶¶ 49-50.
Id. ¶ 49.
The Act contains two mechanisms to enforce these provisions. First, the Division Director “may impose an administrative fine of up to $2,500 for each violation” of the Act. Second, the Division may bring a “court action,” and a “court may,” among other things:
Id. ¶¶ 65-66.
Id. ¶ 66 (quoting Utah Code § 13-71-301(3)(a)(i)).
(1) “order disgorgement of any money received in violation of” the Act; (2) “order payment of disgorged money to an injured purchaser or consumer;” (3) “impose a civil penalty of up to $2,500 for each violation;” (4) “award actual damages to an injured purchaser or consumer;” (5) award “reasonable attorney fees,” “court costs,” and “investigative fees;” and (6) order “any other relief that the court deems reasonable and necessary.” Additionally, “[a] person who violates an administrative or court order issued for a violation of” the Act “is subject to a civil penalty of no more than $5,000 for each violation.”
Id. ¶ 67 (quoting Utah Code § 13-71-301(b)-(c)).
Id. (quoting Utah Code § 13-71-301(4)(a)).
C. Procedural History
NetChoice filed its initial Complaint challenging Utah's Social Media Regulation Act of 2023 on December 18, 2023. Two days later, it filed a Motion for Preliminary Injunction. After Utah repealed the Social Media Act of 2023 and replaced it with the Utah Minor Protection in Social Media Act, the parties agreed NetChoice should amend its Complaint and file a new motion for preliminary injunction. NetChoice filed its now-operative First Amended Complaint (FAC) on May 3, 2024.
Complaint for Declaratory and Injunctive Relief.
Dkt. 25, Motion for Preliminary Injunction.
Dkt. 44, Joint Notice and Proposed Amended Scheduling Order.
FAC.
NetChoice asserts in its FAC several claims broadly contending the “Act is unconstitutional and otherwise unlawful.” Relevant here, Count VI asserts 47 U.S.C. § 230 preempts the provisions of the Act prohibiting autoplay, seamless pagination, and notifications on minors' accounts. NetChoice seeks a declaratory judgment that the challenged provisions-Utah Code §§ 13-71-202(1)(a)-(b), 13-71-202(5), and 13-71-204(1)-are “unlawful and unenforceable because they are preempted by federal law.”
Id. ¶¶ 71-199.
Id. ¶¶ 152-159.
Id. ¶ 195.
On May 31, 2024, Defendants filed a Motion to Dismiss Count VI of the FAC under Federal Rule of Civil Procedure 12(b)(6), arguing Section 230 does not preempt the Act because the Act is not inconsistent with Section 230. Defendants contend NetChoice's preemption allegations are not entitled to the presumptions a plaintiff's factual allegations receive at the motion to dismiss stage because they are legal conclusions-not factual allegations. And “because no fact development will affect the preemption analysis, this [c]ourt can and should resolve the matter now ....”
Motion to Dismiss at 1-2.
Id. at 1.
Id.
The Motion is fully briefed and ripe for review.
Dkt. 66, Plaintiff's Brief in Opposition to Defendant's Motion to Dismiss Count VI of First Amended Complaint (Opposition); Dkt. 72, Defendants' Reply in Support of Motion to Dismiss for Failure to State a Claim (Reply).
Having carefully reviewed the parties' briefs, the court determines oral argument would not be materially helpful and is now prepared to rule on Defendants' Motion. See DUCivR 7-1(g).
LEGAL STANDARD
“To survive a [Rule 12(b)(6)] motion to dismiss, a complaint must contain sufficient factual matter, accepted as true, to ‘state a claim to relief that is plausible on its face.'” A claim is facially plausible “when the plaintiff pleads factual content that allows the court to draw the reasonable inference that the defendant is liable for the misconduct alleged.” When evaluating a complaint under this standard, the court “accept[s] all well-pleaded factual allegations in the complaint . . . as true, and [] view[s] them in the light most favorable to the nonmoving party.” However, where a Rule 12(b)(6) motion seeks to dismiss a claim “present[ing] purely legal questions,” it is “appropriate to address” the claim “on the merits” and the court need not accept allegations concerning legal conclusions as true.
Ashcroft v. Iqbal, 556 U.S. 662, 678 (2009) (quoting Bell Atlantic Corp. v. Twombly, 550 U.S. 544, 570 (2007)).
Id. (citing Twombly, 550 U.S. at 556).
Sinclair Wyo. Refining Co. v. A&B Builders, Ltd., 989 F.3d 747, 765 (10th Cir. 2021) (quotations and citation omitted).
Free Speech v. Fed. Election Comm'n, 720 F.3d 788, 792 (10th Cir. 2013); Marshall Cnty. Health Care Auth. v. Shalala, 988 F.2d 1221, 1226 (D.C. Cir. 1993) (citing Neitzke v. Williams, 490 U.S. 319, 327 (1989)) (“[B]ecause a court can fully resolve any purely legal question on a motion to dismiss, there is no inherent barrier to reaching the merits at the 12(b)(6) stage.”); see also Khalik v. United Air Lines, 671 F.3d 1188, 1190 (10th Cir. 2012) (quoting Iqbal, 556 U.S. at 678) (“[W]hen legal conclusions are involved in the complaint ‘the tenet that a court must accept as true all of the allegations contained in a complaint is inapplicable to [those] conclusions.’”).
ANALYSIS
NetChoice argues Section 230 preempts the provisions of the Act prohibiting the use of autoplay, seamless pagination, and notifications on minors' accounts. Defendants move to dismiss this claim, arguing the provisions are not inconsistent with Section 230. Because NetChoice's claim presents “purely legal questions” that do not require further factual development, it is appropriate to resolve the claim at this stage.
FAC ¶¶ 152-159.
Motion to Dismiss at 1-2.
Free Speech, 720 F.3d at 792.
The court concludes the challenged provisions impose liability for conduct that falls beyond the protections Section 230 affords NetChoice members. The Act's prohibitions on the use of autoplay, seamless pagination, and push notifications are not inconsistent with Section 230. Accordingly, Section 230 does not preempt the challenged provisions.
A. Section 230
Section 230 “creates a federal immunity to any state law cause of action that would hold computer service providers liable for information originating with a third party.” Specifically, Section 230(c)(1) establishes “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” An “interactive computer service” is “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer service ....” And an “information content provider” is “any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.” Further, the law provides that “[n]othing in this section shall be construed to prevent any State from enforcing any State law that is consistent with this section.” However, “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”
Ben Ezra, Weinstein, and Co., Inc. v. America Online Inc., 206 F.3d 980, 984-85 (10th Cir. 2000) (citing 47 U.S.C. § 230(e)(3)).
Id. § 230(f)(2).
Id. § 230(f)(3).
Id. § 230(e)(3).
Id.
As interpreted by the Tenth Circuit, Section 230 immunizes a service provider from certain civil liability if three requirements are met. First, “immunity is available only to a ‘provider or user of an interactive computer service.'” Second, “the liability must be based on the defendant's having acted as a ‘publisher or speaker.'” And third, “immunity can be claimed only with respect to ‘information provided by another information content provider.'” If the party asserting Section 230 immunity “fails to satisfy any one of the three, it is not entitled to immunity.”
FTC v. Accusearch Inc., 570 F.3d 1187, 1196 (10th Cir. 2009) (quoting 47 U.S.C. § 230(c)(1)).
Id. (quoting 47 U.S.C. § 230(c)(1)).
Id. (quoting 47 U.S.C. § 230(c)(1)). See also Dyroff v. Ultimate Software Grp., Inc., 934 F.3d 1093, 1097 (9th Cir. 2019) (internal quotations and citation omitted) (applying same three-part test); Force v. Facebook, Inc., 934 F.3d 53, 65 (2d Cir. 2019) (same); Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 19 (1st Cir. 2016) (internal quotations and citations omitted) (holding § 230 “shields conduct if the defendant (1) is a provider or user of an interactive computer service; (2) the claim is based on information provided by another information content provider; and (3) the claim would treat [the defendant] as the publisher or speaker of that information”).
Accusearch, 570 F.3d at 1196.
Paired with its text, the law's history sheds light on the appropriate scope of Section 230 immunity. Congress passed the law in response to a state court decision holding that when the administrator of an online message board “deleted some distasteful third-party postings,” it “became a ‘publisher'” and “was subject to publisher's liability for the defamatory postings it failed to remove.” As a result of this holding, a website operator that did nothing to moderate objectionable content posted by third-party users would face no liability for the content, whereas a website that attempted to regulate objectionable content would. Section 230 was designed to rectify that undesirable outcome, providing a limited immunity “‘to encourage service providers to self-regulate the dissemination of offensive material over their services' and to remove disincentives to self-regulation.” Fundamentally, as the Tenth Circuit has explained, Congress enacted Section 230 “to forbid the imposition of publisher liability on a service provider for the exercise of its editorial and self-regulatory functions.”
Id. at 1195 (citing Stratton Oakmont, Inc. v. Prodigy Servs. Co., 1995 WL 323710, at *5 (N.Y. Sup. Ct. May 24, 1995)).
Id. (citing Fair Hous. Council v. Roommates.com, LLC, 521 F.3d 1157, 1163 (9th Cir. 2008) (en banc)) (“The decision [Stratton Oakmont] was criticized for discouraging the voluntary filtration of [i]nternet content, because a forum provider's efforts to sanitize content would trigger liability that could be avoided by doing nothing.”).
Ben Ezra, 206 F.3d at 986 (quoting Zeran v. America Online, Inc., 129 F.3d 327, 331 (4th Cir. 1997)). See also American Freedom Def. Initiative v. Lynch, 217 F.Supp.3d 100, 105 (D.D.C. 2016) (explaining Section 230 allows websites to “perform some editing on user-generated content without thereby becoming liable for all defamatory or otherwise unlawful messages that they didn't edit or delete” by barring “lawsuits seeking to hold them liable for their exercise of a publisher's traditional editorial functions-such as deciding whether to publish, withdraw, postpone or alter content”) (cleaned up).
Id.
B. The Challenged Provisions
The parties do not dispute the NetChoice members subject to the Act are interactive computer services as defined by Section 230, nor that NetChoice members disseminate third-party content. The dispositive question is whether the Act's prohibitions on autoplay, seamless pagination, and notifications treat NetChoice members as the publisher or speaker of the third-party content they disseminate. The court concludes they do not. The Act's prohibitions focus solely on the conduct of the covered website-the website's use of certain design features on minors' accounts-and impose liability irrespective of the content those design features may be used to disseminate. In other words, the prohibitions do not impose liability on NetChoice members based on their role as a publisher of third-party content because the potential liability has no connection to that content. Accordingly, the challenged provisions fall outside the scope of Section 230's protections and are not inconsistent with it.
Opposition at 4.
See Roommates.com, 521 F.3d at 1165 (holding no Section 230 immunity in suit imposing liability on website for requiring users to submit surveys which potentially violated housing discrimination laws because “[defendant's] own acts-posting the questionnaire and requiring answers to it-are entirely its doing and thus Section 230 of the CDA does not apply to them”).
NetChoice asserts Section 230 prohibits liability for a broad range of decisions “about how to publish, disseminate, and present third-party speech-including the means of disseminating third-party content at issue here.” Because the Act's design feature prohibitions impose liability for these types of editorial decisions, NetChoice argues the Act impermissibly treats its members as the “publisher” or “speaker” of third-party content. The challenged provisions impose liability for “‘features that are part and parcel of the overall design and operation of the website,'” and for “‘tools meant to facilitate the communication and content of others.'” As such, NetChoice explains, the provisions are inconsistent with Section 230 and preempted by it. The court disagrees.
Opposition at 2 (citing Dyroff, 934 F.3d at 1098).
Id. at 4.
Id. at 8 (quoting Fields v. Twitter, Inc., 217 F.Supp.3d 1116, 1124 (N.D. Cal. 2016) (citation omitted)).
Id. (quoting Dyroff, 934 F.3d at 1098).
Id. at 2.
NetChoice's interpretation of Section 230 as broadly immunizing websites from any liability for design decisions related to how a site “disseminate[s] and display[s] third-party speech” is unmoored from the plain text of Section 230 and unsupported by the caselaw NetChoice cites. Its assertion that its members' use of autoplay, seamless pagination, and notifications are a protected “exercise of [] editorial . . . functions” reads essential provisions of Section 230 out of the law.
Opposition at 1 (“At bottom, a website acts as a ‘publisher' and is protected by Section 230 when it decides how to disseminate and display third-party speech .... Accordingly, decisions to publish user-generated and other third-party content using autoplay, seamless pagination, or notifications are precisely the kinds of editorial publishing decisions that Section 230 protects.”).
Id. (quoting Ben Ezra, 206 F.3d at 986).
Read as a whole, the plain text of Section 230 does not support the broad interpretation NetChoice urges the court to accept. The law's provisions demonstrate immunity for a website's “exercise of its editorial and self-regulatory functions” “cannot be understood as a general prohibition of civil liability” for all of a website's conduct. Tellingly, the immunity provisions are located in a section captioned “Protection for ‘Good Samaritan' blocking and screening of offensive material.” And, as other Courts of Appeals have persuasively held, the scope and substance of the immunity provided by Section 230(c) “can and should be interpreted consistent with its caption.”
Ben Ezra, 206 F.3d at 986.
Chicago Law. Comm. for Civ. Rts. Under Law, Inc. v. Craigslist, Inc., 519 F.3d 666, 669 (7th Cir. 2008).
Roommates.com, 521 F.3d at 1164 (citing Chicago Law. Comm., 519 F.3d at 670).
Read through that lens, recall that Section 230(c)(1) states “[n]o provider . . . of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” And the following subsection provides “no provider . . . of an interactive computer service shall be held liable” for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider . . . considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” The plain language of Section 230 bars only causes of action or liability that treat a website as the publisher of third-party generated content. And, though it does so when a website takes actions to “block[] and screen[]” third-party content the website considers objectionable-actions which would have historically given rise to publisher liability-the language does not immunize from liability conduct unrelated to that third-party content.
Id. § 230(c)(2)(A).
Section 230 “was not meant to create a lawless no-man's-land on the [i]nternet.” Rather, as the Tenth Circuit explains, its protection from liability in certain circumstances was designed “‘to encourage service providers to self-regulate the dissemination of offensive material over their services' and to remove disincentives to self-regulation.” Where, as here, the liability a website may face has no relation to the content of its users, Section 230 simply does not apply.
Roommates.com, 521 F.3d at 1164. The en banc Ninth Circuit further explained that because of the internet's “vast reach into the lives of millions,” courts “must be careful not to exceed the scope of the immunity provided by Congress ....” Id. at 1164 n.15.
Ben Ezra, 206 F.3d at 986 (quoting Zeran, 129 F.3d at 331).
NetChoice cites several cases in support of its argument that its members' use of features such as autoplay, seamless pagination, and notifications are editorial decisions protected by Section 230-each of which is unavailing. NetChoice is correct that the cases discuss similar website design features in the context of Section 230 immunity. However, each case involves a critical component that NetChoice's claim lacks: an effort to hold a website liable for content produced by third-party users.
For example, NetChoice cites Dyroff v. Ultimate Software Group, Inc. for the proposition that a website's use of “‘tools' that ‘facilitate the communication and content of others,'” such as email notifications, are traditional publisher functions protected by Section 230. In Dyroff, the Ninth Circuit held Section 230 immunized a website from liability where a plaintiff brought claims related to her child's use of the website to source and purchase narcotics. The plaintiff asserted Section 230 immunity did not apply because the defendant website did not merely publish third-party content-its design features enabled its users to traffic in illegal narcotics and “steered” users to groups “dedicated to the sale and use of narcotics.” In other words, because the website's “recommendation and notification functions were specifically designed to make subjective, editorial decisions about users based on their posts,” the site made contributions to the content that rendered it a content provider itself, not simply a publisher.
Id. at 7 (quoting Dyroff, 934 F.3d at 1098).
Dyroff, 934 F.3d at 1097.
Id.
Id. at 1096 (internal quotations omitted).
The court rejected that argument, holding the website did not become an information content provider by “facilitating communication” between users “through content-neutral website functions like group recommendations and post notifications.” The court further reasoned the plaintiff's “content manipulation” theory was a backdoor attempt to impose liability on the website as the publisher of the narcotics-related content at the core of the plaintiff's complaint. And, the court held, a plaintiff “cannot plead around Section 230 immunity by framing these website features as content.”
Id.
Id. at 1098.
Id.
In arguing that design features alone are immune from liability under Section 230, NetChoice distorts Dyroff's facts and holding. The plaintiff there alleged a website's use of design features, such as recommendations and notifications, “materially contributed” to the site's user-generated content such that the website became an information content provider. The Ninth Circuit rejected that argument as a veiled attempt to hold the website liable for the content of its users' posts. That issue is distinct from this case because the potential liability NetChoice members face in connection with the Act's design feature prohibitions has nothing to do with third-party generated content. Whether NetChoice members incur liability for the use of autoplay, seamless pagination, or notifications on minors' accounts is entirely divorced from the content they may disseminate with those features. Just as the plaintiff in Dyroff could not “plead around Section 230 immunity by framing these website features as content,” NetChoice cannot shield itself with Section 230 immunity by attempting to frame website features as content.
Id. at 1099.
Id. at 1098.
NetChoice similarly omits relevant context from the Second Circuit's decision in Force v. Facebook, Inc. Quoting Force, NetChoice asserts “Section 230 protects websites' decisions about ‘arranging and distributing third-party information.'” In that case, plaintiffs alleged Facebook was civilly liable for aiding, abetting, and otherwise supporting Hamas because Facebook's design features, such as its newsfeed and recommendation algorithm, enabled the group to broadly disseminate its message and communicate about its attacks. Plaintiffs attempted to evade Section 230 immunity by arguing that, through its use of various design features and algorithms, Facebook was engaged in “matchmaking,” not “publishing,” as understood by Section 230.
Opposition at 7 (quoting Force, 934 F.3d at 66).
Force, 934 F.3d at 59-61.
Id. at 65.
The Second Circuit rejected plaintiffs' argument, holding that “arranging and distributing third-party information” in order to form connections among users is “an essential result of publishing.” Accepting plaintiffs' “matchmaking argument,” the court reasoned, would “deny immunity for the editorial decisions regarding third-party content that interactive computer services” commonly make. Furthermore, the court held, the design features did not constitute “material contribution[s]” to the third-party content that would render Facebook an information content provider itself. Ultimately, plaintiffs' various design feature challenges were still impermissible attempts to hold Facebook “responsible for the [third-party generated] Hamas-related content” at the heart of plaintiffs' complaint.
Id. at 66.
Id.
Id. at 68-71.
Id. at 69-70.
Placed in context, Force, like Dyroff, does not support a per se application of Section 230 immunity to a website's design features. The design features at issue in Force were only relevant because plaintiffs in the case used them in an attempt to plead around Section 230. Plaintiffs argued the design features took Facebook out of the realm of publishing and thus, placed it beyond the scope of Section 230. The portions of the case NetChoice selectively cites are from the court's explanation of why, notwithstanding that argument, Section 230 still applied. At bottom, plaintiffs were trying to hold Facebook liable for the third-party generated content it published. The court did not hold that Section 230 immunity applies when, as here, liability arises from the website's conduct-the use of certain design features-independent of its publication of third-party content.
Id. at 65 (“Plaintiffs seek to hold Facebook liable for giving Hamas a forum with which to communicate and for actively bringing Hamas' message to interested parties. But that alleged conduct by Facebook falls within the heartland of what it means to be the publisher of information under Section 230(c)(1). So, too, does Facebook's alleged failure to delete content from Hamas members' Facebook pages.”).
Nor do any of the other cases NetChoice cites support its position. Each of NetChoice's citations and quotations omit material facts and context. As a final example, NetChoice quotes Fields v. Twitter, Inc., for the proposition that “features that are part and parcel of the overall design and operation of the website” are immune from liability under Section 230. Opposition at 2 (quoting Fields, 217 F.Supp.3d at 1124). Plaintiffs in Fields sought to hold Twitter liable for providing accounts to ISIS, arguing the claim was not barred by Section 230 because a “content-neutral decision about whether to provide someone with a tool is not publishing activity.” Fields, 217 F.Supp.3d at 1123. The quotation NetChoice offers comes from a portion of the decision discussing a First Circuit case. In full, the Fields court explained the First Circuit held that “decisions regarding the ‘structure and operation of [a] website'-such as ‘permitt[ing] users to register under multiple screen names' and other decisions regarding ‘features that are part and parcel of the overall design and operation of the website'-‘reflect choices about what content can appear on the website and in what form' and thus ‘fall within the purview of traditional publisher functions.'” Id. at 1124 (quoting Backpage.com, 817 F.3d at 20-21). From this, the Fields court concluded Twitter's decision to allow ISIS to obtain accounts on the website “reflect[s] choices about what [third-party] content can appear on [Twitter] and in what form,” and when those choices “form the basis of a plaintiff's claim, Section 230(c)(1) applies.” Id. (internal quotations and citation omitted). The court concluded the liability plaintiffs sought to impose on Twitter fundamentally turned on the website's failure to “prevent ISIS from disseminating content through the Twitter platform”-a claim squarely within the ambit of Section 230 immunity. Id.
In sum, the court determines Section 230 does not preempt the provisions of the Act prohibiting the use of autoplay, seamless pagination, and notifications on minors' accounts because they do not impose liability on NetChoice members as the publisher or speaker of third-party content. The liability NetChoice members may face for violating the provisions arises solely from the members' use of the design features-not their use of the features to disseminate third-party content, NetChoice members' own content, or any other type of content. The Act does not, for example, condition liability on the use of the features to disseminate offensive, inappropriate, or otherwise objectionable third-party content to minors. It simply prohibits the use of those features on minors' accounts. The liability imposed by the Act's design feature prohibitions results only from the conduct of NetChoice's members, not their users.
See Roommates.com, 521 F.3d at 1165 (holding Section 230 did not apply because the liability a website faced was a result of its “own acts” and are “entirely its doing”).
NetChoice's preemption argument stretches Section 230 immunity beyond what the plain text of the law supports. The cases NetChoice cites in support, when read in context, confirm this interpretation. Fundamentally, Section 230 provides immunity from “any state law cause of action that would hold computer service providers liable for information originating with a third party.” Where, as here, liability arises solely from the service providers' conduct, detached from “information originating with a third party,” it falls beyond the scope of Section 230's protections. Accordingly, the challenged provisions of the Act are not inconsistent with Section 230 and are not preempted by it.
Ben Ezra, 206 F.3d at 984-85.
Id.
See 47 U.S.C. § 230(e)(3). Although the parties do not address this issue, the court further observes it is unclear whether NetChoice's Section 230 claim, insofar as it seeks declaratory relief, is properly raised. As demonstrated by the caselaw NetChoice cites, Section 230 immunity is typically invoked as an affirmative defense in an already-initiated action. Here, NetChoice seeks a declaratory judgment that the affirmative defense of Section 230 immunity preempts the challenged provisions prior to any enforcement. FAC ¶¶ 159, 195. But “using the Declaratory Judgment Act to anticipate an affirmative defense is not ordinarily proper, and numerous courts have refused to grant declaratory relief to a party who has come to court only to assert an anticipatory defense.” Divino Group LLC v. Google LLC, 2021 WL 51715, at *11 (N.D. Cal. Jan. 6, 2021) (citing Veoh Networks, Inc. v. UMG Recordings, Inc., 522 F.Supp.2d 1265, 1271 (S.D. Cal. 2007)). Indeed, other courts addressing declaratory relief claims preemptively invoking a statutorily provided affirmative defense “as a sword, rather than a shield,” have declined to exercise jurisdiction and dismissed the claim. Veoh Networks, 522 F.Supp.2d at 1271-72. In the only similarly postured case the court could identify, Publius v. Boyer-Vine, 237 F.Supp.3d 997 (E.D. Cal. Feb. 27, 2017), the court, addressing a Section 230 claim akin to NetChoice's, determined it did not have jurisdiction because the claim was “not ripe for review.” Id. at 1027-28. In reaching this conclusion, the court explained it could find no authority “holding that the mere threat of a lawsuit that ostensibly would violate [plaintiff's] [Section] 230(c) immunity constitutes a violation of [Section] 230 itself,” nor that “the [c]ourt has jurisdiction over a declaratory judgment claim that a threatened lawsuit would violate [Section] 230.” Id. at 1028. Here, the court concludes the challenged provisions of the Act are not inconsistent with Section 230. 
However, even if that were not the case, NetChoice's attempt to wield Section 230 as a sword, rather than a shield, would likely be procedurally improper.
CONCLUSION
For the reasons explained above, Section 230 does not preempt the Act's prohibitions on the use of autoplay, seamless pagination, and notifications on minors' accounts. Count VI of NetChoice's FAC fails to adequately state a claim to relief under Rule 12(b)(6). Accordingly, Defendants' Motion to Dismiss is GRANTED and Count VI of NetChoice's FAC is DISMISSED.
Dkt. 59.
So ordered.