Doe v. Twitter, Inc.

Decision Date: 19 August 2021
Docket Number: Case No. 21-cv-00485-JCS
Citation: 555 F.Supp.3d 889
Parties: John DOE, et al., Plaintiffs, v. TWITTER, INC., Defendant.
Court: U.S. District Court — Northern District of California

Christen Michelle Price, Pro Hac Vice, Peter A. Gentala, Pro Hac Vice, Benjamin Wyman Bull, Pro Hac Vice, Danielle M. Pinter, Pro Hac Vice, National Center on Sexual Exploitation, Washington, DC, Lisa D. Haba, Pro Hac Vice, The Haba Law Firm, PA, Longwood, FL, Paul Anthony Matiasic, Hannah E. Mohr, The Matiasic Firm, P.C., San Francisco, CA, for Plaintiff A Minor.

Hannah E. Mohr, Paul Anthony Matiasic, The Matiasic Firm, P.C., San Francisco, CA, for Plaintiff B.M.

Hannah E. Mohr, The Matiasic Firm, P.C., San Francisco, CA, for Plaintiff John Doe # 2.

Michael Graham Rhodes, Kathleen R. Hartnett, Kyle Christopher Wong, Cooley LLP, San Francisco, CA, Jamie D. Robertson, Linh Khanh Nguyen, Cooley LLP, San Diego, CA, for Defendant.


Re: Dkt. No. 48

JOSEPH C. SPERO, Chief Magistrate Judge


Plaintiffs John Doe #1 and John Doe #2 allege that when they were thirteen years old they were solicited and recruited for sex trafficking and manipulated into providing to a third-party sex trafficker pornographic videos ("the Videos") of themselves through the social media platform Snapchat. A few years later, when Plaintiffs were still in high school, links to the Videos were posted on Twitter. Plaintiffs allege that when they learned of the posts, they informed law enforcement and urgently requested that Twitter remove them, but Twitter initially refused to do so, allowing the posts to remain on Twitter, where they accrued more than 167,000 views and 2,223 retweets. According to Plaintiffs, it was not until the mother of one of the boys contacted an agent of the Department of Homeland Security, who initiated contact with Twitter and requested the removal of the material, that Twitter finally took down the posts, nine days later.

Plaintiffs assert state and federal claims against Twitter based on its alleged involvement in and/or enabling of sex trafficking and the distribution of the child pornography containing their images. Twitter, however, contends that even after Congress's enactment of the Fight Online Sex Trafficking Act and Stop Enabling Sex Traffickers Act in 2018, the conduct alleged by Plaintiffs is shielded from liability under Section 230 of the Communications Decency Act ("CDA"). Thus, Twitter brings a Motion to Dismiss First Amended Complaint ("Motion") seeking dismissal of all of Plaintiffs' claims on the basis that it is immune from liability under the CDA. In the Motion, Twitter also contends that Plaintiffs fail to state viable claims as to many of their causes of action. A hearing on the Motion was held on August 6, 2021. For the reasons stated below, the Motion is GRANTED in part and DENIED in part.1

A. First Amended Complaint

Plaintiffs’ First Amended Complaint ("FAC"), which is the operative complaint, contains detailed allegations describing: 1) Twitter's platform, business model and content moderation policies and practices (FAC ¶¶ 23-51); 2) the ways Twitter allegedly permits and even aids in the distribution of child pornography on its platform and profits from doing so (FAC ¶¶ 52-84); 3) how pornographic content featuring John Doe #1 and John Doe #2 was created and eventually ended up on Twitter's platform (FAC ¶¶ 85-100); and 4) Twitter's response to requests that the pornographic photos and videos containing Plaintiffs’ images be removed from Twitter (FAC ¶¶ 101-132).

Based on these allegations, Plaintiffs assert the following claims:

1) violation of the Trafficking Victims Protection Reauthorization Act ("TVPRA"), 18 U.S.C. §§ 1591(a)(1) and 1595(a) based on the allegation that "Twitter knew, or was in reckless disregard of the fact, that through monetization and providing, obtaining, and maintaining [child sexual abuse material ("CSAM")] on its platform, Twitter and Twitter users received something of value for the video depicting sex acts of John Doe #1 and John Doe #2 as minors." FAC ¶¶ 133-143 (Claim One);

2) violation of the TVPRA, 18 U.S.C. §§ 1591(a)(2) and 1595(a), based on the allegation that Twitter "knowingly benefited, or should have known that it was benefiting, from assisting, supporting, or facilitating a violation of 1591(a)(1)." FAC ¶¶ 144-155 (Claim Two);

3) violation of the duty to report child sexual abuse material under 18 U.S.C. §§ 2258A and 2258B. FAC ¶¶ 156-163 (Claim Three);

4) civil remedies for personal injuries related to sex trafficking and receipt and distribution of child pornography under 18 U.S.C. §§ 1591, 2252A, and 2255, based on the allegations that Twitter was "notified of the CSAM material depicting John Doe #1 and John Doe #2 as minors on its platform and still knowingly received, maintained, and distributed this child pornography after such notice[,]" causing Plaintiffs to suffer "serious harm and personal injury, including, without limitation, physical, psychological, financial, and reputational harm." FAC ¶¶ 164-176 (Claim Four);

5) California products liability based on the allegedly defective design of the Twitter platform, which is "designed so that search terms and hashtags utilized for trading CSAM return suggestions for other search terms and hashtags related to CSAM" and through use of "algorithm(s), API, and other proprietary technology" allows "child predators and sex traffickers to distribute CSAM on a massive scale" while also making it difficult for users to report CSAM and not allowing for immediate blocking of CSAM material once reported pending review. FAC ¶¶ 177-190 (Claim Five);

6) negligence based on allegations that Twitter had a duty to protect Plaintiffs, had actual knowledge that CSAM containing their images was being disseminated on its platform and failed to promptly remove it once notified. FAC ¶¶ 191-197 (Claim Six);

7) gross negligence based on the same theory as Plaintiffs’ negligence claim. FAC ¶¶ 198-203 (Claim Seven);

8) negligence per se based on the allegation that Twitter's conduct violated numerous laws, including 18 U.S.C. §§ 1591 and 1595 (benefiting from a sex trafficking venture), 18 U.S.C. § 2258A (failing to report known child sexual abuse material), 18 U.S.C. § 2252A (knowingly distributing child pornography), Cal. Civ. Code § 1708.85 (intentionally distributing non-consensually shared pornography), and Cal. Penal Code § 311.1 (possessing child pornography). FAC ¶¶ 204-206 (Claim Eight);

9) negligent infliction of emotional distress. FAC ¶¶ 207-212 (Claim Nine);

10) distribution of private sexually explicit materials, in violation of Cal. Civ. Code § 1708.85, based on the allegation that "[b]y refusing to remove or block the photographic images and video depicting him after Plaintiff John Doe #1 notified Twitter that both he and John Doe #2 were minors, Twitter intentionally distributed on its online platform photographic images and video of the Plaintiffs." FAC ¶¶ 213-218 (Claim Ten);

11) intrusion into private affairs, based on the allegation that "Twitter intentionally intruded into Plaintiffs’ reasonable expectation of privacy by continuing to distribute the photographic images and video depicting them after John Doe #1 notified Twitter that Plaintiffs were minors and the material had been posted on its platform without their consent." FAC ¶¶ 219-223 (Claim Eleven);

12) invasion of privacy under the California Constitution, Article 1, Section 1. FAC ¶¶ 224-228 (Claim Twelve); and

13) violation of California Business and Professions Code § 17200 ("UCL") based on allegations that "Twitter utilized and exploited Plaintiffs for its own benefit and profit" and "Plaintiffs, to their detriment, reasonably relied upon Twitter's willful and deceitful conduct and assurances that it effectively moderates and otherwise controls third-party user content on its platforms." FAC ¶¶ 229-234 (Claim Thirteen).

Plaintiffs seek compensatory and punitive damages, injunctive relief, restitution, disgorgement of profits and unjust enrichment and attorneys’ fees and costs.

B. Statutory Background
1. The CDA

The CDA was enacted as part of the Telecommunications Act of 1996. It contains a "Good Samaritan" provision that immunizes interactive computer service ("ICS") providers from liability for restricting access to certain types of materials or giving users the technical means to restrict access to such materials, providing as follows:

(c) Protection for "Good Samaritan" blocking and screening of offensive material
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

47 U.S.C. § 230(c).

"This grant of immunity dates back to the early days of the internet when concerns first arose about children being able to access online pornography." Enigma Software Grp. USA, LLC v. Malwarebytes, Inc. , 946 F.3d 1040, 1046 (9th Cir. 2019), cert. denied, ––– U.S. ––––, 141 S. Ct. 13, 208 L.Ed.2d 197 (2020). At that time, "[p]arents could not program their computers to block online pornography, and this was at least partially due to a combination of trial court decisions in New York that had deterred the creation of online-filtration efforts." Id. Under the New York cases, "if a provider remained...
