ELLIOT WINTER, et al., Plaintiffs,
v.

FACEBOOK, INC., TIKTOK INC., TIKTOK PTE LTD, BYTEDANCE LTD., BYTEDANCE INC., and MONICA DOLAN, Defendants.

No. 4:21-CV-01046 JAR

United States District Court, E.D. Missouri, Eastern Division

November 22, 2021


MEMORANDUM AND ORDER

JOHN A. ROSS, UNITED STATES DISTRICT JUDGE

On July 9, 2021, Plaintiffs Elliot Winter and Alexandria Hurlburt filed this action against Defendants Facebook, Inc. (“Facebook”); TikTok Inc., TikTok PTE Ltd., Bytedance Ltd., and Bytedance Inc. (“TikTok”)[1]; and Monica Dolan (“Dolan”), in the Circuit Court of St. Louis City, Missouri. See Elliot Winter, et al. v. Facebook, Inc., et al., Case No. 2122-CC08817 (22nd Jud. Cir.). Plaintiffs allege that Dolan and others associated with her “engaged in a pattern of behavior that resulted in the harassment of the Plaintiffs on her social media accounts” and that “[a]s a direct and proximate cause of Facebook and TikTok's failure to take down [] false abusive posts and/or posts containing [P]laintiffs [sic] personal identifying information for the purposes of stalking and harassment, [P]laintiffs sustained damages in an amount in excess of $500,000.00.” (Compl. at ¶¶ 9, 97). Plaintiffs assert claims against Dolan for defamation - slander (Count I); defamation - libel (Count II); tortious interference with business contract and expectancy (Count III); invasion of privacy (Count IV); false light (Count V); intentional infliction of emotional distress (Count VI); stalking (Count VII); harassment (Count VIII); and unlawful posting of certain information over the internet (Count IX); and a single claim of “gross negligence” against Facebook and TikTok (Count X).

Facebook, with TikTok's consent, removed the case to this Court on August 20, 2021 based on diversity jurisdiction. (Doc. No. 1). Both Facebook and TikTok have moved to dismiss the case. (Doc. Nos. 10, 12). The motions are fully briefed and ready for disposition. Because the arguments raised in support of and in opposition to dismissal are largely the same, the Court has addressed the motions together.

Legal Standard

The purpose of a Rule 12(b)(6) motion to dismiss for failure to state a claim is to test the legal sufficiency of a complaint to eliminate those actions “which are fatally flawed in their legal premises and destined to fail, thereby sparing the litigants the burden of unnecessary pretrial and trial activity.” Young v. City of St. Charles, 244 F.3d 623, 627 (8th Cir. 2001). To survive a Rule 12(b)(6) motion to dismiss, a complaint must contain “enough facts to state a claim to relief that is plausible on its face.” Ashcroft v. Iqbal, 556 U.S. 662, 678 (2009) (quoting Bell Atl. Corp. v. Twombly, 550 U.S. 544, 570 (2007)).

A plaintiff need not provide specific facts in support of his allegations, Erickson v. Pardus, 551 U.S. 89, 93 (2007) (per curiam), but “must include sufficient factual information to provide the ‘grounds' on which the claim rests, and to raise a right to relief above a speculative level.” Schaaf v. Residential Funding Corp., 517 F.3d 544, 549 (8th Cir. 2008) (citing Twombly, 550 U.S. at 555 & n.3). This obligation requires a plaintiff to plead “more than labels and conclusions, and a formulaic recitation of the elements of a cause of action will not do.” Twombly, 550 U.S. at 555. A complaint “must contain either direct or inferential allegations respecting all the material elements necessary to sustain recovery under some viable legal theory.” Id. at 562 (internal citation omitted). This standard “simply calls for enough facts to raise a reasonable expectation that discovery will reveal evidence of [the claim or element].” Id. at 556. The plausibility of the plaintiff's claim is reviewed “as a whole, not plausibility of each individual allegation.” Zoltek Corp. v. Structural Polymer Grp., 592 F.3d 893, 896 n.4 (8th Cir. 2010) (internal quotation marks and citation omitted).

On a motion to dismiss, the Court accepts as true all of the factual allegations contained in the complaint, even if it appears that “actual proof of those facts is improbable,” Twombly, 550 U.S. at 556, and reviews the complaint to determine whether its allegations show that the pleader is entitled to relief. Id. at 555-56. The principle that a court must accept as true all of the allegations contained in a complaint is inapplicable to legal conclusions. Iqbal, 556 U.S. at 678-79. Although legal conclusions can provide the framework for a complaint, they must be supported by factual allegations. Id. at 679.

If a motion to dismiss is granted, a court should normally grant leave to amend unless it determines that the pleading could not possibly be cured by allegations of other facts. Cornelia I. Crowell GST Tr. v. Possis Med., Inc., 519 F.3d 778, 783-84 (8th Cir. 2008) (citing Wisdom v. First Midwest Bank, 167 F.3d 402, 409 (8th Cir. 1999)).

Discussion

Facebook and TikTok argue that Plaintiffs' negligence claim against them should be dismissed for two reasons. First, the claim is barred by § 230 of the Communications Decency Act (“CDA”), which immunizes internet service providers (“ISPs”) against liability arising from content created by third parties. 47 U.S.C. § 230(c)(1); Johnson v. Arden, 614 F.3d 785, 791 (8th Cir. 2010). Second, Plaintiffs have failed to plead a claim for negligence, and specifically, facts establishing the existence of a duty that Facebook and TikTok owed them.

(1) Communications Decency Act

In enacting the CDA, Congress made it the “policy of the United States” to “promote the continued development of the Internet,” and “preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation[.]” Klayman v. Zuckerberg, 753 F.3d 1354, 1355-56 (D.C. Cir. 2014) (quoting 47 U.S.C. § 230(b)(1), (2)). To that end, “Congress decided not to treat providers of interactive computer services like other information providers such as newspapers, magazines or television and radio stations, all of which may be held liable for publishing obscene or defamatory material written or prepared by others.” M.A. ex rel. P.K. v. Village Voice Media Holdings, LLC, 809 F.Supp.2d 1041, 1047-48 (E.D. Mo. 2011) (citation and internal quotation marks omitted).

The CDA provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” § 230(c)(1), and expressly preempts any state law to the contrary, § 230(e)(3). Johnson v. Arden, 614 F.3d 785, 790-91 (8th Cir. 2010). The Act defines an “information content provider” as “any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.” § 230(f)(3). “Read together, these provisions bar plaintiffs from holding ISPs legally responsible for information that third parties created and developed.” Johnson, 614 F.3d at 791 (citation omitted); see also East Coast Test Prep LLC v. Allnurses.com, Inc., 971 F.3d 747, 752 (8th Cir. 2020). “Congress thus established a general rule that providers of interactive computer services are liable only for speech that is properly attributable to them.” Johnson, 614 F.3d at 791 (quoting Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc., 591 F.3d 250, 254 (4th Cir. 2009)).

An ancillary goal of the CDA was to “encourage service providers to self-regulate the dissemination of offensive material over their services,” by granting them immunity from liability for material published by third parties regardless of whether the interactive computer service provider took an active role in regulating the content therein. Zeran v. Am. Online, Inc., 129 F.3d 327, 331 (4th Cir. 1997). Accordingly, “§ 230 forbids the imposition of publisher liability on a service provider for the exercise of its editorial and self-regulatory functions.” Id.

The majority of federal circuits, including the Eighth Circuit, have interpreted the CDA “to establish broad ‘federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service.'” Johnson, 614 F.3d at 791 (quoting Almeida v. Amazon.com, Inc., 456 F.3d 1316, 1321 (11th Cir. 2006)). See, e.g., Klayman, 753 F.3d at 1359 (affirming dismissal of a claim against Facebook alleging its delay in removing an offensive page created by another information content provider constituted intentional assault and negligence); Universal Commc'n Sys., Inc. v. Lycos, Inc., 478 F.3d 413, 419 (1st Cir. 2007) (affirming dismissal of a claim brought by a publicly traded company against an internet message board operator for allegedly false and defamatory postings by pseudonymous posters); Batzel v. Smith, 333 F.3d 1018, 1032-33 (9th Cir. 2003) (holding that even if the operator of internet services could have reasonably concluded that the information was sent for internet publication, he was immunized from liability for the defamatory speech as a “provider or user of interactive computer services” under the CDA); Green v. Am. Online, 318 F.3d 465, 471 (3d Cir. 2003) (holding that under the CDA the defendant ISP is not liable for failing to monitor, screen, or delete allegedly defamatory content from its site); Ben Ezra,...
