Anderson v. Tiktok, Inc.

United States District Court, E.D. Pennsylvania
Oct 25, 2022
637 F. Supp. 3d 276 (E.D. Pa. 2022)

Opinion

Civ. No. 22-1849


Tawainna ANDERSON, Plaintiff, v. TIKTOK, INC., et al., Defendants.

Jeffrey P. Goodman, Samuel B. Dordick, Robert J. Mongeluzzi, Saltz Mongeluzzi & Bendesky PC, Philadelphia, PA, Parthena McCarthy, Shavertown, PA, for Plaintiff. Joseph E. O'Neil, Campbell Conroy & O'Neil, Berwyn, PA, Katherine A. Wang, Campbell Campbell Edwards & Conroy, Berwyn, PA, Albert Giang, King & Spalding, Los Angeles, CA, Geoffrey M. Drake, Tacara D. Harris, King & Spalding LLP, Atlanta, GA, for Defendants.


MEMORANDUM

Diamond, District Judge

Plaintiff Tawainna Anderson accuses Defendants TikTok, Inc. and ByteDance, Inc. (operators of the social media application "TikTok") of causing the death of her daughter. (Compl. (Doc. No. 1).) Although the circumstances here are tragic, I am compelled to rule that because Plaintiff seeks to hold Defendants liable as "publishers" of third-party content, they are immune under the Communications Decency Act. Accordingly, I will grant Defendants' Motion to Dismiss. (Doc. No. 12); see 47 U.S.C. § 230(c)(1) and (e)(3).

I. BACKGROUND

a. Factual Allegations

TikTok is a social media platform enabling users to create short videos and view videos shared by third parties. (Compl. ¶ 50.) As "one of the world's fastest-growing social media platforms," TikTok boasts more than one billion active users worldwide. (Id. ¶¶ 43, 47.) Some twenty-eight percent of these users are younger than eighteen. (Id. ¶ 48.) Essential to TikTok's widespread appeal is its "For You Page" ("FYP"). (Id. ¶ 52.) When a user opens TikTok, her FYP offers a stream of third-party videos curated through an algorithm developed to find that user's particular interests. (Id. ¶ 51.) The algorithm learns her age, location, and her previous application use. (Id. ¶ 53.) Defendants thus seek to provide FYP content that is "unique and tailored to that specific individual." (Id. ¶ 54.)

In December 2021, ten-year-old Nylah Anderson's FYP included the "Blackout Challenge": videos in which users strangle themselves with household items and then encourage others to record themselves doing the same. (Id. ¶¶ 82-83.) As alleged, the Blackout Challenge is of a piece with many other "challenges" published on TikTok "which promote dangerous behavior." (Id. ¶ 46.) Hiding in a bedroom closet, Nylah attempted the "Challenge." Her mother, Plaintiff Tawainna Anderson, found Nylah unconscious, hanging from a purse strap. (Id. ¶ 87.) Ms. Anderson unsuccessfully attempted CPR. (Id. ¶ 88.) Three deep ligature marks on Nylah's neck confirmed that she had suffered while struggling to free herself. (Id. ¶¶ 86, 89.) After several days in intensive care, Nylah died. (Id. ¶ 91.)

As alleged, during 2021, other children died attempting the Blackout Challenge. (Id. ¶¶ 67-70.) As further alleged, Defendants knew that TikTok's algorithm was promoting the Blackout Challenge to children. (Id. ¶ 71.)

b. Procedural History

Anderson charges that TikTok caused Nylah's death. (Compl.) She brings design defect and failure to warn claims under strict products liability and negligence theories, as well as wrongful death and survival actions. (Id. ¶¶ 101-34, 156-86.) She also brings claims under the Pennsylvania Unfair Trade Practices and Consumer Protection Law and the California Consumer Legal Remedies Act. (Id. ¶¶ 135-55); 73 P.S. §§ 201-1 et seq.; Cal. Civ. Code §§ 1750 et seq.

Defendants move to dismiss all Counts, urging: a lack of personal jurisdiction, that Section 230 of the Communications Decency Act bars Anderson's products liability and negligence claims, and that Anderson has failed to state a claim for relief. (Doc. No. 12.)

In response, Anderson defends only her products liability, negligence, wrongful death, and survival claims, abandoning her claims under the Pennsylvania Unfair Trade Practices and Consumer Protection Law and the California Consumer Legal Remedies Act. I will thus dismiss those latter claims. See Levy-Tatum v. Navient Sols., Inc., 183 F. Supp. 3d 701, 712 (E.D. Pa. 2016) (dismissing claims the plaintiff failed to defend in opposing the defendant's motion to dismiss). The matter has otherwise been fully briefed. (Doc. Nos. 12, 17, 21, 22.)

Because I conclude that Section 230 precludes Anderson's products liability and negligence claims—on which her wrongful death and survival claims depend—I will grant Defendants' Motion.

II. LEGAL STANDARDS

I must accept as true Anderson's well-pled factual allegations and make all reasonable inferences in her favor. See Fed. R. Civ. P. 12(b)(6); In re Rockefeller Ctr. Props., Inc., 311 F.3d 198, 215 (3d Cir. 2002). I may consider Defendants' affirmative defense—that Section 230 bars Anderson's suit—at the motion-to-dismiss stage. Putt v. TripAdvisor, Inc., No. 20-3836, 2021 WL 242470, at *3 (E.D. Pa. Jan. 25, 2021). Anderson is "not required to anticipate and plead around an affirmative defense," however. Id.; see also Schmidt v. Skolas, 770 F.3d 241, 248 (3d Cir. 2014). Dismissal is thus permissible only if Section 230 immunity is "evident from the face of the complaint." Brody v. Hankin, 145 F. App'x 768, 771 (3d Cir. 2005) (quoting Bethel v. Jendoco Constr. Corp., 570 F.2d 1168, 1174 n. 10 (3d Cir. 1978)) (emphasis omitted).

III. DISCUSSION

In pertinent part, CDA Section 230 provides as follows:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

* * *

No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.

47 U.S.C. § 230(c)(1), (e)(3).

In thus precluding interactive service providers from being "treated as the publisher[s]" of third-party content, Congress immunized the providers' "decisions relating to the monitoring, screening, and deletion of content from [their] network[s]—actions quintessentially related to a publisher's role." Green v. Am. Online (AOL), 318 F.3d 465, 471 (3d Cir. 2003).

Congress conferred this immunity "to maintain the robust nature of Internet communication and, accordingly, to keep government interference in the medium to a minimum." Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997); 47 U.S.C. § 230(b)(1)-(2). It recognized that because of the "staggering" amount of information communicated through interactive computer services, providers cannot prescreen each message they republish. Zeran, 129 F.3d at 331. Accordingly, Congress conferred immunity on providers to encourage them not to restrict unduly the number and nature of their postings. Id.

Section 230 provides immunity when: (1) the defendant is an interactive computer service provider; (2) the plaintiff seeks to treat the defendant as a publisher or speaker of information; and (3) that information is provided by another content provider. 47 U.S.C. § 230(c)(1). Here, the Parties agree that Defendants are interactive computer service providers, and that the Blackout Challenge videos came from "another information content provider" (third-party users). (See Doc. Nos. 12, 17.) They dispute only whether Anderson, by her design defect and failure to warn claims, impermissibly seeks to treat Defendants as the "publishers" of those videos. It is evident from the face of Anderson's Complaint that she does.

Anderson urges that she seeks to hold Defendants directly liable for their own acts and omissions as designers, manufacturers, and sellers of a defective product, not for their conduct as publishers. See Erie Ins. Co. v. Amazon.com, Inc., 925 F.3d 135, 139-40 (4th Cir. 2019); (Doc. No. 17 at 10.) She cannot defeat Section 230 immunity, however, by creatively labeling her claims:

[W]hat matters is not the name of the cause of action—defamation versus negligence versus intentional infliction of emotional distress—what matters is whether the cause of action inherently requires the court to treat the defendant as the 'publisher or speaker' of content provided by another.
Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1101 (9th Cir. 2009). I must look past how Anderson characterizes a claim and "ask whether the duty that [Anderson] alleges that [Defendants] violated derives from [Defendants'] status or conduct as a 'publisher or speaker.' " Id.

Anderson bases her allegations entirely on Defendants' presentation of "dangerous and deadly videos" created by third parties and uploaded by TikTok users. She thus alleges that TikTok and its algorithm "recommend inappropriate, dangerous, and deadly videos to users"; are designed "to addict users and manipulate them into participating in dangerous and deadly challenges"; are "not equipped, programmed with, or developed with the necessary safeguards required to prevent circulation of dangerous and deadly videos"; and "[f]ail[ ] to warn users of the risks associated with dangerous and deadly videos and challenges." (Compl. ¶¶ 107, 127 (emphasis added).) Anderson thus premises her claims on the "defective" manner in which Defendants published a third party's dangerous content.

Although Anderson recasts her content claims by attacking Defendants' "deliberate action" taken through their algorithm, those "actions," however "deliberate," are the actions of a publisher. Courts have repeatedly held that such algorithms are "not content in and of themselves." Dyroff v. Ultimate Software Grp., Inc., 934 F.3d 1093, 1098 (9th Cir. 2019). The Second Circuit has explained that the use of "tools such as algorithms that are designed to match [ ] information with a consumer's interests" is well within the range of publisher functions covered by Section 230. Force v. Facebook, Inc., 934 F.3d 53, 66 (2d Cir. 2019), cert. denied, — U.S. —, 140 S. Ct. 2761, 206 L.Ed.2d 936 (2020) (No. 19-859); cf. Obado v. Magedson, 612 F. App'x 90, 94 (3d Cir. 2015) ("[A]llegation that the defendants manipulated search engines to maximize search results relating to the alleged defamatory content does not affect their immunity from suit.").

Anderson relies heavily on two inapposite Ninth Circuit decisions. Doe v. Internet Brands, Inc., 824 F.3d 846 (9th Cir. 2016); Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021).

In Internet Brands, the plaintiff was a member of "Model Mayhem," a networking website for "aspiring models." 824 F.3d at 848. The plaintiff "posted information about herself on the website." Id. Posing as talent scouts, two men contacted the plaintiff and lured her to a fake audition, where they assaulted her. Id. The plaintiff alleged that the failure of the website's operator to post a warning of the risks associated with using the website caused her to fall victim to the scheme. Id. at 849. Significantly, the plaintiff's claims had nothing to do with the site's content: she did not allege that the two men had posted anything to the site or that she was lured by any website posting. Id. at 851. The Court thus deemed Section 230 immunity inapplicable because the defendant's purported duty to warn "[did] not arise from an alleged failure to adequately regulate access to user content," and would not "affect how [the defendant] publishes or monitors such content." Id. at 851, 853.

As I have discussed, however, the duty Anderson invokes directly implicates the manner in which Defendants have chosen to publish third-party content. (Compl. ¶ 107.) Anderson's claims thus are plainly barred by Section 230 immunity. Cf. Doe v. MySpace, Inc., 528 F.3d 413, 420 (5th Cir. 2008) (considering claims "predicated solely on [service provider's] failure to implement basic safety measures to protect minors" as "merely another way of claiming that [service provider] was liable for publishing the communications").

The plaintiffs in Lemmon alleged that a speed filter on Snap's smartphone application, "Snapchat," helped cause the high-speed car accident in which the plaintiffs' two sons were killed. 995 F.3d at 1087-90. As alleged, the speed filter enabled users to record their real-life speed, and users believed that Snapchat would reward them for recording a speed over 100 miles per hour. Id. at 1093. The Court ruled that Section 230 did not apply because the plaintiffs' claims were "independent[ ] of the content" created by Snapchat's users. Id. Rather, the defect alleged was the way in which the site was designed, highlighting "the interplay between Snapchat's reward system and the Speed Filter." Id. at 1092.

Selectively quoting from Internet Brands and Lemmon, Anderson insists that she is not attacking Defendants' actions as publishers because her claims do not require Defendants to remove or alter the content generated by third parties. (Doc. No. 17 at 12, 14.) Publishing involves more than just these two actions, however. As I have discussed, it also involves decisions related to the monitoring, screening, arrangement, promotion, and distribution of that content—actions that Anderson's claims all implicate. See Force, 934 F.3d at 66. Indeed, "[c]ourts have interpreted 'publication' capaciously to reach claims that, although pleaded to avoid the CDA, 'implicitly require recourse to that content [posted by a third party] to establish liability.'" Herrick v. Grindr, LLC, 306 F. Supp. 3d 579 (S.D.N.Y. 2018), aff'd, 765 F. App'x 586 (2d Cir. 2019); see also Force, 934 F.3d at 64 ("The Circuits are in general agreement that the text of Section 230(c)(1) should be construed broadly in favor of immunity.").

In sum, because Anderson's design defect and failure to warn claims are "inextricably linked" to the manner in which Defendants choose to publish third-party user content, Section 230 immunity applies. Herrick, 765 F. App'x at 591. Anderson's wrongful death and survival claims cannot proceed in light of that tort immunity. See Valentino v. Phila. Triathlon, LLC, 150 A.3d 483, 493 (Pa. Super. Ct. 2016) ("Pennsylvania case law has long held that a wrongful death claimant's substantive right to recover is derivative of and dependent upon a tortious act that resulted in the decedent's death."); Tulewicz v. Se. Pa. Transp. Auth., 529 Pa. 588, 606 A.2d 427, 431 (1992) ("A survival action . . . 'merely continues in [the decedent's] personal representatives the right of action which accrued to the deceased at common law because of the tort.' ") (cleaned up).

IV. CONCLUSION

Nylah Anderson's death was caused by her attempt to take up the "Blackout Challenge." Defendants did not create the Challenge; rather, they made it readily available on their site. Defendants' algorithm was a way to bring the Challenge to the attention of those likely to be most interested in it. In thus promoting the work of others, Defendants published that work—exactly the activity Section 230 shields from liability. The wisdom of conferring such immunity is something properly taken up with Congress, not the courts.

I will thus grant Defendants' Motion on immunity grounds. In light of my decision, I need not address Defendants' contentions respecting jurisdiction and failure to state a claim.

An appropriate Order follows.
