Opinion
22-cv-00737-CRB
05-05-2022
ORDER GRANTING MOTIONS TO DISMISS
CHARLES R. BREYER United States District Judge
Plaintiff Justin Hart, a California resident, is suing Defendants Facebook Inc., Twitter Inc., President Joseph Biden, Surgeon General Vivek Murthy, the Department of Health and Human Services (HHS), and the Office of Management and Budget (OMB). See Compl. (dkt. 1). Hart alleges that, between late 2020 and mid-2021, Facebook and Twitter flagged his posts as misinformation about COVID-19 and suspended or locked his accounts. Hart contends that these acts violated the First Amendment of the United States Constitution because President Biden and Surgeon General Murthy (collectively, Federal Defendants) allegedly acted jointly with Facebook and Twitter. Hart also argues that Facebook and Twitter violated the Free Speech Clause of the California Constitution as well as California contract and tort law. Facebook, Twitter, and the Federal Defendants move to dismiss. Facebook and Twitter also move to strike under California's anti-SLAPP statute. Finding oral argument unnecessary, the Court GRANTS the motions to dismiss without leave to amend. The Court declines to reach the motions to strike.
Hart also raises a claim against HHS and OMB, but this order does not discuss it, as they did not move to dismiss. Hart alleges that they violated the Freedom of Information Act (FOIA) by failing to timely respond to his document request as to communication between the Federal Defendants, Facebook, and Twitter. See Compl. ¶¶ 66-74.
I. BACKGROUND
A. Parties
Hart is a resident of San Diego County, California. Compl. ¶ Intro 12. He is “the Chief Data Analyst and founder of RationalGround.com, which helps companies, public policy officials, and parents gauge the impact of COVID-19 across the country.” Id. Hart has used Facebook since 2007 as a networking tool for his consulting business and for his website. Id. ¶¶ 30-34. That same year, Hart joined Twitter, which he uses for the same reasons and “as a feeder for his other social media accounts.” Id. ¶¶ 47, 48.
The complaint numbers paragraphs from 1 to 26 (for sections on Introduction, Parties, and Jurisdiction) and then begins again at 1 and goes to 110. Unless noted with “Intro,” paragraph numbers from the Complaint refer to the paragraph numbers in the body of the complaint (1-110).
Facebook Inc. is a corporation with its principal place of business in California that hosts “one of the most popular social media sites,” boasting “more than 2.8 billion monthly users worldwide.” Id. ¶ 21.
Twitter Inc. is a corporation with its principal place of business in California that runs a popular social media site used by “more than one in five adult Americans.” Id. ¶ 41.
Vivek Murthy is Surgeon General of the United States and “directs the office of the Surgeon General.” Id. ¶ Intro 15.
Joseph R. Biden, Jr. is President of the United States and directs the federal executive branch, including White House staff. Id. ¶ Intro 16.
B. Facts
1. Terms of Use
Because Hart “refers extensively” to Facebook's Terms of Service and Community Standards and Twitter's Terms of Service, they are incorporated by reference into the complaint. Khoja v. Orexigen Therapeutics, Inc., 899 F.3d 988, 1002 (9th Cir. 2018); see Compl. ¶¶ 25-26 & nn. 17-19 (Facebook's Terms of Service and Community Standards); id. ¶¶ 44-46 & nn. 30-31 (Twitter's Terms of Service); see also Twitter RJN (dkt. 71); Facebook Mot. (dkt. 73) at 3 n.2.
At the relevant times, Facebook's Terms of Service forbade users from sharing “anything . . . [t]hat is unlawful, misleading, discriminatory, or fraudulent.” See https://web.archive.org/web/20210718231018/https://www.facebook.com/legal/terms/plaintextterms (Facebook Terms of Service). The Terms of Service also forbade users from sharing anything that violated its “Community Standards.” See id. One category of speech that could violate Facebook's Community Standards was “Integrity and Authenticity,” Compl. ¶ 27, of which a subcategory was “False News.” See https://web.archive.org/web/20210713153441/https://www.facebook.com/communitystandards/integrityauthenticity (Facebook Community Standards). The Terms of Service also stated that Facebook “can remove or restrict access to content that is in violation of these provisions.” See Facebook Terms of Service.
Twitter similarly conditions the use of its platform on compliance with its Terms of Service and various rules and policies, which are posted on Twitter's website. See Compl. ¶¶ 44-46. By accepting Twitter's User Agreement, a Twitter user agrees to be bound by the current version of the Terms of Service. See Patchen Decl. Ex. 1 (dkt. 70-2) § 6 (Twitter Terms of Service). In its Terms of Service, Twitter “reserve[s] the right to remove Content that violates the User Agreement” and directs people to its website for information “regarding specific policies and the process for reporting or appealing violations.” Id. § 3. One of Twitter's policies prohibits using “Twitter's services to share false or misleading information about COVID-19 which may lead to harm.” Patchen Decl. Ex. 3 (dkt. 70-4) (Twitter Covid-19 Misleading Information Policy). The policy further states that Twitter “will label or remove false or misleading information” about personal protective equipment “such as claims about the efficacy and safety of face masks to reduce viral spread” and that penalties may include account locks. Id.
2. Allegations as to Facebook
Beginning in September 2020, Hart's Facebook posts triggered warnings from Facebook that they “violated its Community Standard[s].” Compl. ¶¶ 35-37. First, on or around September 15, 2020, Facebook issued a warning regarding a July 2020 post in which Hart described a video as depicting “cops defending” a statue of Christopher Columbus in Chicago from “hundreds of ‘peaceful' protestors throw[ing] bottles, cans, canes, and rocks” as part of a “BLM/SJW rally.” Id. ¶ 35. Hart alleges that Facebook's warning claimed that “[f]alse information about COVID-19 [was] found in your post.” Id. On September 25, Facebook banned Hart for 30 days from advertising on Facebook and from “live” communication with his followers after he posted “‘Spotify seems like a great place to work!' - Joseph Goebbels.” Id. ¶ 36.
On April 23, 2021, Facebook restricted Hart from posting or commenting for 24 hours, stating that three of Hart's posts from earlier in April violated its Community Standards:
If you ever want to know where your BLM donation is going - the co-founder ‘trained Marxist' Patrisee Cullars - just bought this amazing home in LA. (Id. ¶ 37(a))
This is the truth: Covid is almost gone in America. Hospitals are literally empty. Every willing senior has already been vaccinated. In a few weeks every willing adult can be... (Id. ¶ 37(c))
(Hart alleges that the third post “was removed from Facebook” but does not allege anything about its content. Id. ¶ 37(b).)
Finally, on July 13, 2021, Hart posted an infographic on his personal Facebook page entitled “Masking Children is Impractical and Not Backed by Research or Real World Data.” Id. ¶ 1. The post argued, among other things, that masking “can often cause headaches and fatigue,” that “[s]ome masks contain toxic chemicals,” that “[d]eaf & disabled children struggle to learn with masks,” and that masking could “cause a wide variety of . . . health issues.” Id. ¶ 2. Hart alleges that the graphic is “science-based” and contained footnotes to scientific evidence supporting the claims. Id. ¶ 3. Facebook flagged the post with the following notice:
You can't post or comment for 3 days.
This is because you previously posted something that didn't follow our Community Standards.
This post goes against our standards on misinformation that
could cause physical harm, so only you can see it.
Learn more about updates to our standards.
Id. ¶ 4.
Hart has a “valid employment contract” with Donorbureau, LLC, a Virginia-based limited liability company, as Administrator of its Facebook account. Id. ¶¶ 91-92. He alleges that he was unable to fulfill his contractual duties because Facebook suspended his account. Id. ¶¶ 92, 96-97.
3. Allegations as to Twitter
On or around July 18, 2021, Hart published the following Tweet on his Twitter account @justinhart:
So the CDC just reported that 70% of those who came down with #COvId19 symptoms had been wearing a mask. We know that masks don't protect you . . . but at some point you have to wonder if they are PART of the problem.
Id. ¶ 5. That same day, Twitter locked Hart's account and provided him with notice that he violated the Covid-19 Misleading Information Policy:
Hi Justin Hart,
Your Account, @justinhart has been locked for violating the Twitter Rules.
Specifically for: Violating the policy on spreading misleading and potentially harmful information related to COVID-19.
Id. ¶ 6.
4. Statements by Federal Officials
On July 15, 2021, two days after Facebook's final disciplinary action against Hart and three days before Twitter locked his account, the Biden Administration announced a focus on COVID-19-related misinformation on social media. See id. ¶¶ 7-8. At a White House press conference, Surgeon General Murthy stated: “We're asking [our technology companies] to consistently take action against misinformation super-spreaders on their platforms.” Id. ¶ 8. Hart alleges that “a team of government employees are actively researching and tracking social media posts with which it disagrees and relaying those posts to social media companies with instructions to take them down.” Id. ¶ 9. White House Press Secretary Jen Psaki stated: “We've increased disinformation research and tracking within the Surgeon General's office. We're flagging problematic posts for Facebook that spread disinformation.” Id. ¶ 10. Psaki also said that “we are in regular touch with these social media platforms, and those engagements typically happen through members of our senior staff, but also members of our COVID-19 team.” Id. ¶ 12.
Hart alleges that Biden and Murthy “directed” social media platforms to make four changes: (1) to “measure and publicly share the impact of misinformation on their platform”; (2) to “create a robust enforcement strategy that bridges their properties and provides transparency about the rules”; (3) to “take faster action against harmful posts” because “information travels quite quickly on social media platforms”; and (4) to “promote quality information in their feed algorithm.” Id. ¶¶ 14-17. Hart also alleges that Biden directed Murthy to create a 22-page advisory with “instructions on how social media companies should remove posts with which Murthy and Biden disagree.” Id. ¶ 18. Finally, Hart alleges that Biden “threatened” social media companies who do not comply by “publicly shaming and humiliating them, stating, ‘They're killing people.'” Id. ¶ 19.
C. Procedural History
On July 22, 2021, one week after the White House press conference, Hart submitted a FOIA request to HHS and OMB. Compl. ¶ 67; see Ex. A to Ex. 1 (dkt. 78-1) at 2 (FOIA request for “[a]ll records of communications . . . between the White House or HHS and any social media company related to Justin Hart or his social media posts”). Neither agency responded within the 20-day statutory deadline. Compl. ¶ 69; see 5 U.S.C. § 552(a)(6)(A)(i).
On August 31, 2021, Hart filed this lawsuit in the Southern District of California against President Biden, Surgeon General Murthy, Facebook, Twitter, HHS, and OMB. See Compl. On February 2, 2022, a district judge granted Facebook and Twitter's motions to transfer, holding that the forum-selection clauses in both companies' Terms of Service were valid and enforceable and applicable to all claims in this suit. See Order Granting Transfer (dkt. 45).
Facebook and Twitter moved to dismiss under Federal Rule of Civil Procedure 12(b)(6) for failure to state a claim. See Twitter Mot. (dkt. 70); Facebook Mot. (dkt. 73). Twitter and Facebook also moved to strike under California's anti-SLAPP statute. Twitter anti-SLAPP (dkt. 72); Facebook Mot. at 17-19. The Federal Defendants moved to dismiss under Rule 12(b)(1) for lack of subject matter jurisdiction. Gov't Mot. (dkt. 69).
II. LEGAL STANDARD
A. 12(b)(6) Motion to Dismiss
Under Rule 12(b)(6) of the Federal Rules of Civil Procedure, a complaint may be dismissed for failure to state a claim for which relief may be granted. Fed.R.Civ.P. 12(b)(6). Rule 12(b)(6) applies when a complaint lacks either a “cognizable legal theory” or “sufficient facts alleged” under such a theory. Godecke v. Kinetic Concepts, Inc., 937 F.3d 1201, 1208 (9th Cir. 2019). Whether a complaint contains sufficient factual allegations depends on whether it pleads enough facts to “state a claim to relief that is plausible on its face.” Ashcroft v. Iqbal, 556 U.S. 662, 678 (2009) (quoting Bell Atlantic Corp. v. Twombly, 550 U.S. 544, 570 (2007)). A claim is plausible “when the plaintiff pleads factual content that allows the court to draw the reasonable inference that the defendant is liable for the misconduct alleged.” Id. at 678. When evaluating a motion to dismiss, the Court “must presume all factual allegations of the complaint to be true and draw all reasonable inferences in favor of the nonmoving party.” Usher v. City of Los Angeles, 828 F.2d 556, 561 (9th Cir. 1987). The Court “must consider the complaint in its entirety, as well as other sources courts ordinarily examine when ruling on Rule 12(b)(6) motions to dismiss, in particular, documents incorporated into the complaint by reference, and matters of which a court may take judicial notice.” Tellabs, Inc. v. Makor Issues & Rights, Ltd., 551 U.S. 308, 322 (2007).
If a court dismisses a complaint for failure to state a claim, it should “freely give leave” to amend “when justice so requires.” Fed.R.Civ.P. 15(a)(2). A court has discretion to deny leave to amend due to “undue delay, bad faith or dilatory motive on the part of the movant, repeated failure to cure deficiencies by amendment previously allowed, undue prejudice to the opposing party by virtue of allowance of the amendment, [and] futility of amendment.” Leadsinger, Inc. v. BMG Music Pub., 512 F.3d 522, 532 (9th Cir. 2008).
B. 12(b)(1) Motion to Dismiss
“The doctrine of standing limits federal judicial power.” Or. Advocacy Ctr. v. Mink, 322 F.3d 1101, 1108 (9th Cir. 2003). The question of whether plaintiffs have standing “precedes, and does not require, analysis of the merits.” Equity Lifestyle Props., Inc. v. Cnty. of San Luis Obispo, 548 F.3d 1184, 1189 n.10 (9th Cir. 2008). To have standing, plaintiffs must establish (1) that they have suffered an injury in fact, (2) that their injury is fairly traceable to a defendant's conduct, and (3) that their injury would likely be redressed by a favorable decision. See Lujan v. Defs. of Wildlife, 504 U.S. 555, 560-61 (1992). Each of these elements must be supported “with the manner and degree of evidence required at the successive stages of the litigation.” Id. at 561. And plaintiffs “must have standing to seek each form of relief requested in the complaint.” Town of Chester v. Laroe Estates, Inc., 137 S.Ct. 1645, 1651 (2017).
Under Rule 12(b)(1), a defendant may move to dismiss for lack of standing and thus lack of subject matter jurisdiction. See White v. Lee, 227 F.3d 1214, 1242 (9th Cir. 2000). Rule 12(b)(1) attacks on standing can be either facial, confining the court's inquiry to allegations in the complaint, or factual, permitting the court to look beyond the complaint. Id.; Safe Air for Everyone v. Meyer, 373 F.3d 1035, 1039 (9th Cir. 2004). For facial attacks, courts accept the jurisdictional allegations in the complaint as true. See, e.g., Whisnant v. U.S., 400 F.3d 1177, 1179 (9th Cir. 2005). When addressing a factual attack, however, courts may consider evidence like declarations submitted by the parties, and the party opposing the motion to dismiss has the burden of establishing subject matter jurisdiction by a preponderance of the evidence. See, e.g., Leite v. Crane Co., 749 F.3d 1117, 1121 (9th Cir. 2014).
III. DISCUSSION
For the reasons set out below, the Court grants the motions to dismiss by Facebook, Twitter, and the Federal Defendants.
A. Facebook and Twitter
Hart's sole federal claim against Facebook and Twitter is a First Amendment claim that requires alleging that they engaged in state action. The Court dismisses this claim because Hart fails to do so plausibly. The Court declines to exercise jurisdiction over the state-law claims and does not reach the motions to strike.
1. State Action
The Ninth Circuit recently reaffirmed that “a private entity hosting speech on the Internet is not a state actor” subject to the Constitution. See Prager Univ. v. Google LLC, 951 F.3d 991, 995 (9th Cir. 2020) (“Despite YouTube's ubiquity and its role as a public-facing platform, it remains a private forum, not a public forum subject to judicial scrutiny under the First Amendment.”). The Supreme Court also recently explained that “merely hosting speech by others is not a traditional, exclusive public function and does not alone transform private entities into state actors subject to First Amendment constraints.” Manhattan Cmty. Access Corp. v. Halleck, 139 S.Ct. 1921, 1930 (2019).
However, in rare cases, action by a private party can constitute state action. See Pasadena Republican Club v. W. Justice Ctr., 985 F.3d 1161, 1167 (9th Cir. 2021) (noting the four different tests that the Supreme Court has employed to determine if a private party engaged in state action). Hart argues that Facebook and Twitter engaged in state action under either of two theories: a “joint action” theory and a “governmental compulsion or coercion” theory. See Opp. at 3; see Pasadena, 985 F.3d at 1167. Hart does not come close to pleading state action under either theory.
a. Joint Action
Under the joint action test, state action occurs where “the state has ‘so far insinuated itself into a position of interdependence with [the private entity] that it must be recognized as a joint participant in the challenged activity.'” Gorenc v. Salt River Project Agr. Imp. & Power Dist., 869 F.2d 503, 507 (9th Cir. 1989) (quoting Burton v. Wilmington Parking Auth., 365 U.S. 715, 725 (1961)). But “a bare allegation of such joint action will not overcome a motion to dismiss.” DeGrassi v. City of Glendora, 207 F.3d 636, 647 (9th Cir. 2000). The Supreme Court has explained:
[A] State normally can be held responsible for a private decision only when it has exercised coercive power or has provided such significant encouragement, either overt or covert, that the choice must in law be deemed to be that of the State. Mere approval of or acquiescence in the initiatives of a private party is not sufficient to justify holding the State responsible for those initiatives.
Blum v. Yaretsky, 457 U.S. 991, 1004-05 (1982). This circuit has required “substantial cooperation” or that the private entity and government's actions be “inextricably intertwined.” Brunette v. Humane Society of Ventura Cnty., 294 F.3d 1205, 1211 (9th Cir. 2002). “A conspiracy between the State and a private party to violate constitutional rights may also satisfy the joint action test.” Id. However, the private and governmental actors must have had a “meeting of the minds” to “violate constitutional rights.” Fonda v. Gray, 707 F.2d 435, 438 (9th Cir. 1983).
First, the Court emphasizes that Facebook and Twitter made contemporaneous statements that they took action because they concluded that Hart had violated company policy. See, e.g., Compl. ¶ 4 (Facebook: “This post goes against our standards on misinformation that could cause physical harm, so only you can see it. . . Learn more about updates to our standards.”); id. ¶ 6 (Twitter: “Your Account, @justinhart has been locked for violating the Twitter Rules. Specifically for: Violating the policy on spreading misleading and potentially harmful information related to COVID-19.”). Both companies' Terms of Service establish that (1) they have misinformation policies; and (2) they enforce them. See Facebook Community Standards (stating that the platform disallows “False News”); Facebook Terms of Service (users may not share anything “[t]hat is . . . misleading”); id. (Facebook “can remove or restrict access to content that is in violation of these provisions”); Twitter Terms of Service (Twitter “reserve[s] the right to remove Content that violates the User Agreement”); id. (directing Twitter users to the website for more details); Twitter Covid-19 Misleading Information Policy (Twitter “will label or remove false or misleading information” about personal protective equipment, “such as claims about the efficacy and safety of face masks to reduce viral spread,” and penalties may include account locks). On their own, Facebook and Twitter's contemporaneous statements plausibly explain Hart's injury. See Bell Atlantic Corp. v. Twombly, 550 U.S. 544, 557 (2007) (allegations of secret illegal conduct are insufficient where they are consistent with plausible legal explanations).
Next, the Court rejects Hart's joint action argument as to most of Facebook's conduct because it occurred long before the administration made any statements at all. Mysteriously, Hart believes that the Federal Defendants and Facebook took joint action against his posts months before he alleges that the Federal Defendants even began communicating with Facebook about misinformation. See, e.g., Compl. ¶ 37 (Facebook disciplined Hart in April 2021). Even more mysteriously, Hart apparently believes that the Federal Defendants and Facebook committed joint state action against him in September 2020. See Compl. ¶¶ 35, 36. Yet neither Biden nor Murthy was in government until January 2021. Hart fails to explain how Facebook took joint action with governmental actors from the future. Instead, he argues that Press Secretary Psaki's use of a present tense verb on July 15, 2021 indicates that state actors had already been acting jointly with Facebook and Twitter for months. See Opp. at 8-9 (quoting Compl. ¶ 13). The Court disagrees. See Fed. Agency of News LLC v. Facebook, Inc., 432 F.Supp.3d 1107, 1125 (N.D. Cal. 2020) (allegations “do little to demonstrate joint action in the instant case, as most of [them] post-date the relevant conduct that allegedly injured Plaintiffs”); Children's Health Def. v. Facebook Inc., 546 F.Supp.3d 909, 930 (N.D. Cal. 2021) (similar); see also Fonda, 707 F.2d at 438 (joint action requires a “meeting of the minds”).
Hart also alleges that Facebook “tak[es] its directives” from President Biden in promulgating its “constantly shifting” policies on misinformation. See Compl. ¶¶ 39, 40. Even if true, this allegation would not imply that Facebook's conduct towards Hart was state action, for the reasons discussed in the next paragraph of the order. In any case, the claim is conclusory and implausible, unless it is reinterpreted as an anodyne allegation that Facebook considers currently known information and the statements of governmental and public health authorities in designing its policies. And (as noted below) neither the government's communication of information nor a private party's use of that information transforms private action into state action. Finally, Hart's allegation that Facebook applies its misinformation policy oddly, see id. ¶ 35 (alleging that Facebook flagged a post about police clashes with protestors as involving COVID-19 misinformation), suggests only that Facebook applies its misinformation policy oddly and has nothing to do with state action.
Despite occurring relatively close in time to two alleged instances of joint action, the Federal Defendants' statements are far too vague and precatory to suggest joint action. The four “key changes” the administration suggested to social media platforms are vague and unenforceable. See, e.g., Compl. ¶¶ 14-15 (recommending that they “measure and publicly share the impact of misinformation on their platform” and “create a robust enforcement strategy”). And Surgeon General Murthy's “22-page advisory” document is, well, advisory. Id. ¶ 18. Hart cites no authority for the proposition that vague government advisory documents transform private action into state action. These documents are issued annually by the thousands and do not secretly transform large swathes of the private sector into state actors.
Furthermore, the administration's statements have no particularized connection to Facebook or Twitter's actions toward Hart. See DeGrassi, 207 F.3d at 647 (noting that a “bare allegation of [] joint action” is insufficient). The two alleged actions that are close in time are: (1) Facebook's flagging of Hart's July 13, 2021 post about the purported dangers of masking (Compl. ¶¶ 1, 4); and (2) Twitter's temporary locking of his account because of his July 18, 2021 tweet (Compl. ¶ 5). Yet Hart makes no allegation that the Federal Defendants ever knew about his July 13 Facebook post, his July 18 tweet, or even his existence. Because the Federal Defendants did not know about Hart's post or tweet, they could not have had a “meeting of the minds” as to the disciplinary action those companies took. See Children's Health Def., 546 F.Supp.3d at 930 (no joint action as to plaintiff on the basis of vague allegations that the government and Facebook were “working together” to fight misinformation); Fed. Agency of News, 432 F.Supp.3d at 1126 (no joint action where the allegations as to government policy were “unconnected” to the harm to the plaintiff). In his opposition, Hart insists that the government policy “called on the private party to take the precise action at issue.” Opp. at 7. This is flatly belied by the complaint. The fact that the White House communicated with Facebook and Twitter about the general topic does not transform into state action their decisions about one post or tweet.
Finally, even if the White House had specifically communicated with these companies about Hart's post or tweet, their enforcement of their own policies as to that post or tweet would still not be joint action. One party supplying information to another party does not amount to joint action. See Lockhead v. Weinstein, 24 Fed.Appx. 805, 806 (9th Cir. 2001) (“[M]ere furnishing of information to police officers does not constitute joint action”); Fed. Agency of News, 432 F.Supp.3d at 1124 (“supplying information to the state alone [does not amount] to conspiracy or joint action”) (alteration added). The one-way communication alleged here falls far short of “substantial cooperation.” See Brunette, 294 F.3d at 1212. After all, the Federal Defendants did not “exert[] control over how [Facebook or Twitter] used the information [it] obtained.” See Deeths v. Lucile Slater Packard Children's Hospital at Stanford, 2013 WL 6185175, at *10 (E.D. Cal. Nov. 26, 2013). Indeed, even if the Federal Defendants communicated with Facebook or Twitter about Hart's post and later agreed with those companies' decisions, approval or acquiescence does not make the State responsible for their actions. See Blum, 457 U.S. at 1004-05.
The Court easily concludes that the Federal Defendants did not “so far insinuate[] [themselves] into a position of interdependence” with Facebook and Twitter that they “must be recognized as joint participant[s]” in their decisions to enforce their policies against Hart. See Gorenc, 869 F.2d at 507.
b. Government Coercion
Hart also fails to plead state action on the theory that the Federal Defendants coerced Facebook or Twitter into taking action as to his accounts. As noted above, state action may be found where a State “exercised coercive power or has provided such significant encouragement, either overt or covert, that the choice must in law be deemed to be that of the State.” Blum, 457 U.S. at 1004-05. Hart vaguely alleges that the Federal Defendants “directed [] Facebook and Twitter to remove Hart's social media posts.” Compl. ¶ 20. He also alleges that President Biden “threatened social media companies who do not comply with his directives by publicly shaming and humiliating them, stating, ‘They're killing people.'” Id. ¶ 19. Hart argues that, in light of “public battles over the future of Section 230 [of the Communications Decency Act] legislation and ongoing antitrust investigations, including by executive agencies,” such a statement amounts to a “threat.” Opp. at 4.
None of these conclusory allegations come remotely close to coercion. Indeed, Hart admits that this theory is even weaker than the joint action theory, see Opp. at 4, and he does not try to argue that any case finding government coercion is factually analogous, see Opp. at 3-4. For obvious reasons, the government's vague recommendations and advisory opinions are not coercion. Nor can coercion be inferred from President Biden's comment that social media companies are “killing people.” Compl. ¶ 19. A President's one-time statement about an industry does not convert into state action all later decisions by actors in that industry that are vaguely in line with the President's preferences. And Hart has not alleged any connection between any (threat of) agency investigation and Facebook and Twitter's decisions. See Zhou v. Breed, No. 21-15554, 2022 WL 135815, at *1 (9th Cir. Jan. 14, 2022) (“The mere fact that Breed or other public officials criticized a billboard or called for its removal, without coercion or threat of government sanction, does not make that billboard's subsequent removal by a private party state action.”). Finally, even if Hart had plausibly pleaded that the Federal Defendants exercised coercive power over the companies' misinformation policies, he still fails to specifically allege that they coerced action as to him. See Children's Health Def., 546 F.Supp.3d at 933 (rejecting a government coercion argument because there was no allegation that a state actor ordered Facebook to “take any specific action with regard to [plaintiff] or its Facebook page”).
It is still more difficult to understand how general legislative debates, such as those surrounding Section 230, could provide a President with coercive power over a private company sufficient to confer state action.
As Hart fails to plead state action under either a joint action or government coercion theory, his First Amendment claim against Facebook and Twitter fails as a matter of law.
2. California Claims
Hart also alleges four California claims against Facebook and Twitter. The first two (against both defendants) involve the Free Speech Clause of the California Constitution and a promissory estoppel theory. Compl. at 17-19. The next two (against Facebook only) allege intentional interference with contract and negligent interference with prospective economic advantage (as to the disruption of Hart's business relationship with Donorbureau). See id. at 19-21. Having dismissed the sole federal claim against these defendants, the Court declines to exercise supplemental jurisdiction over these claims.
A district court may decline to exercise supplemental jurisdiction where it has dismissed all claims over which it has original jurisdiction. See 28 U.S.C. § 1367(c)(3); Oliver v. Ralphs Grocery Co., 654 F.3d 903, 911 (9th Cir. 2011) (not error to decline supplemental jurisdiction where “balance of the factors of ‘judicial economy, convenience, fairness, and comity' did not ‘tip in favor of retaining the state-law claims' after dismissal of the [federal] claim”); Acri v. Varian Assocs., Inc., 114 F.3d 999, 1000 (9th Cir. 1997) (“[S]tate law claims ‘should' be dismissed if federal claims are dismissed before trial”) (emphasis in original). The claims under the California Constitution and various contract or tort law theories involve novel issues that are best addressed, in the first instance, by a state court.
Accordingly, the Court grants Facebook and Twitter's motions to dismiss.
3. Motions to Strike
Both Twitter and Facebook bring motions to strike under California's anti-SLAPP statute. See Twitter Anti-SLAPP; Facebook Mot. at 17-19. That statute facilitates “the early dismissal of unmeritorious claims filed to interfere with the valid exercise of the constitutional rights of freedom of speech and petition.” Club Members for an Honest Election v. Sierra Club, 45 Cal.4th 309 (2008); see Cal. Code Civ. Proc. § 425.16(b)(1).
Analysis of a motion to strike pursuant to the anti-SLAPP statute consists of two steps. The defendant must first show that the statute applies because the defendant was “engaged in conduct (1) in furtherance of the right of free speech; and (2) in connection with an issue of public interest.” See Doe, 730 F.3d at 953. If the defendant makes this showing, the court then considers whether the plaintiff has demonstrated “a reasonable probability” of prevailing on the merits of his claims. In re NCAA Student-Athlete Name & Likeness Licensing Litig., 724 F.3d 1268, 1273 (9th Cir. 2013) (quoting Batzel v. Smith, 333 F.3d 1018, 1024 (9th Cir. 2003)).
Because the Court does not exercise supplemental jurisdiction over the California claims, the Court cannot determine whether Hart has demonstrated a reasonable probability of prevailing on the merits. Accordingly, the Court does not reach either anti-SLAPP motion.
B. Federal Defendants
As explained above, Hart has not plausibly pleaded that any action by President Biden or Surgeon General Murthy was causally related to Facebook and Twitter's decisions to enforce their misinformation policies against Hart. Thus, even if Hart has pleaded an injury cognizable for the requested relief (which is doubtful), Hart lacks Article III standing to bring his claims against the Federal Defendants for two independent reasons: (1) the injury is not “fairly traceable” to their conduct, and (2) even if it were, that injury could not be redressed by a favorable decision. See Lujan, 504 U.S. at 560-61.
To obtain injunctive relief (all that is available under the First Amendment), Hart must plausibly plead that he faces a “real and immediate threat of repeated injury” from the Federal Defendants. Fortyune v. Am. Multi-Cinema, Inc., 364 F.3d 1075, 1081 (9th Cir. 2004). The intimations of future injury in Hart's complaint are vague and based entirely on innuendo.
Another district court reached similar conclusions on both causation and redressability grounds in a case brought by an activist and a nonprofit organization skeptical of vaccines against Congressman Adam Schiff. See Ass'n of Am. Physicians & Surgeons v. Schiff, 518 F.Supp.3d 505 (D.D.C. 2021), aff'd, 23 F.4th 1028 (D.C. Cir. 2022). The plaintiffs there alleged injury from censorship by technology companies and claimed that Congressman Schiff caused that censorship in violation of the First Amendment. First, the court explained why this causation theory was insufficient for Article III standing:
Plaintiffs cannot satisfy the causation element of standing because all the alleged harms stem from the actions of [social media companies], not from Congressman Schiff.... The open letters and public statements made by Congressman Schiff do not mention AAPS, do not advocate for any specific actions, and do not contain any threatening language. Despite this, Plaintiffs allege that, through the open letters and public comments, Congressman Schiff coerced several companies to take specific actions against AAPS.... These allegations are not plausible and ignore the innumerable other potential causes for the actions taken by the technology companies.
Id. at 515-16 (citations omitted).
Next, on redressability, the district court concluded that “[i]t [was] pure speculation that any order directed at Congressman Schiff . . . would result in the [technology] companies changing their behavior” towards the plaintiffs. Ass'n of Am. Physicians, 518 F.Supp.3d at 516. That is, even if there were a causal link between Congressman Schiff's actions and the plaintiffs' injury, it was highly implausible that enjoining Congressman Schiff would have any effect whatsoever on the policies of the technology companies, much less how those policies are enforced against the plaintiffs. See id.
On appeal, the D.C. Circuit affirmed the district court's causation analysis and did not reach the redressability issue. The court explained that the vague allegations failed to plausibly allege a causal relationship: “[I]t is far less plausible that the companies' actions were a response to [Congressman Schiff's] inquiry than that they were a response to widespread societal concerns about online misinformation.” 23 F.4th at 1034-35. Along the way, it noted that “several of the [] adverse actions by the technology companies occurred before” Congressman Schiff took any action at all. Id. at 1034.
The above analysis is apt here. Hart makes only vague and implausible allegations connecting the Federal Defendants' conduct to his injury. That injury has no causal relationship with the Federal Defendants' actions, and no court order as to the Federal Defendants could redress it. See Lujan, 504 U.S. at 560-61. Accordingly, the Court grants the Federal Defendants' motion to dismiss for lack of jurisdiction.
C. Leave to Amend
A court should “freely give leave” to amend “when justice so requires.” Fed.R.Civ.P. 15(a)(2). However, a court has discretion to deny leave to amend where amendment would be futile. See Leadsinger, 512 F.3d at 532.
The Court finds that leave to amend would be futile. Hart does not come close to alleging that Facebook and Twitter's enforcement of their misinformation policies against him was state action. Thus, the only federal claim against Facebook and Twitter fails, and this Court lacks subject matter jurisdiction over the claim against the Federal Defendants. Nor can the complaint be amended to advance some other theory of federal jurisdiction.
Hart, Facebook, and Twitter are all residents of California, so Hart cannot amend his complaint to establish diversity jurisdiction over the state claims.
However, Hart still has a FOIA claim against HHS and OMB as to his request for information about the Federal Defendants' supposed communications with Facebook and Twitter about his accounts. See Compl. ¶¶ 66-74; Ex. A to Ex. 1 (dkt. 78-1). If Hart prevails and learns facts that plausibly suggest that “the state has so far insinuated itself into a position of interdependence with [Facebook and Twitter] that it must be recognized as a joint participant” in enforcing their company policies, see Gorenc, 869 F.2d at 507, the Court will permit amendment.
IV. CONCLUSION
For the foregoing reasons, the Court GRANTS the motions to dismiss without leave to amend, but without prejudice to Hart bringing his state claims in state court. The Court does not reach the motions to strike.
IT IS SO ORDERED.