United States v. Clark

Decision Date: 18 May 2023
Docket Number: 22-cr-40031-TC
Parties: UNITED STATES OF AMERICA v. JEREMY DAYTON CLARK
Court: U.S. District Court — District of Kansas
MEMORANDUM AND ORDER

Toby Crouse, United States District Judge

The United States charged Jeremy Dayton Clark with five counts of receiving, possessing, and distributing child pornography in violation of 18 U.S.C. § 2252(a). Doc. 1. Clark moves to suppress all evidence and information derived from the search of his home. Doc. 29. For the following reasons, Clark's motion is denied.

I
A

The Fourth Amendment protects “[t]he right of the people to be secure in their persons, houses, papers, and effects against unreasonable searches and seizures.” U.S. Const. amend. IV. “The basic purpose of this Amendment as recognized in countless decisions of the Supreme Court is to safeguard the privacy and security of individuals against arbitrary invasions by government officials.” United States v. Mathews, 928 F.3d 968, 975 (10th Cir. 2019) (citing Camara v. Mun. Ct. of S.F., 387 U.S. 523, 528 (1967) (cleaned up)).

The Fourth Amendment has been interpreted to proscribe two types of searches: first, when the government trespasses on a person's property; second, when it violates a person's reasonable expectation of privacy. Kyllo v. United States, 533 U.S. 27, 31-33 (2001); Mathews, 928 F.3d at 975. Under the trespass theory, the Fourth Amendment proscribes intrusion “on a constitutionally protected area,” which usually means a person's property. United States v. Jones, 565 U.S. 400, 407 (2012). In addition to this “traditional property-based understanding,” the Fourth Amendment has been interpreted under a privacy-based approach. Byrd v. United States, 138 S.Ct. 1518, 1526 (2018) (quoting Florida v. Jardines, 569 U.S. 1, 11 (2013)). An expectation of privacy is protected if the defendant has exhibited an actual, subjective expectation of privacy and that expectation is one that society is prepared to recognize as objectively reasonable. Smith v. Maryland, 442 U.S. 735, 740 (1979). If the individual has neither a property nor a privacy interest, or if the search is performed by a private actor, then the Fourth Amendment does not apply. United States v. Benoit, 713 F.3d 1, 6 (10th Cir. 2013).

Searches and seizures of people, their homes, and their personal property are presumed unreasonable when conducted without a warrant. United States v. Karo, 468 U.S. 705, 717 (1984). For a warrant to be valid, it must be supported by probable cause and must “particularly describ[e] . . . the persons or things to be seized.” United States v. Leary, 846 F.2d 592, 600, 605 (10th Cir. 1988) (quoting U.S. Const. amend. IV) (alteration in original). Searches that exceed a valid warrant's scope are invalid. Cf. Marron v. United States, 275 U.S. 192, 196 (1927) (prohibiting “the seizure of one thing under a warrant describing another”); Bivens v. Six Unknown Named Agents of Fed. Bureau of Narcotics, 403 U.S. 388, 394 n.7 (1971); Leary, 846 F.2d at 600.

The remedy for an illegal search is exclusion. The exclusionary rule generally forbids the Government from using evidence obtained from an illegal search. Herring v. United States, 555 U.S. 135, 139 (2009). But that relief is not granted reflexively; the rule applies only where it “result[s] in appreciable deterrence” for law enforcement. Id. at 137, 141 (quoting United States v. Leon, 468 U.S. 897, 909 (1984)). Suppression of evidence is therefore not “an automatic consequence of a Fourth Amendment violation.” Id. at 137.

B

This prosecution arises from Clark's use of a peer-to-peer website known as Omegle. The Government alleges that Clark shared child sex abuse material (CSAM) with another user via a single Omegle-hosted video stream. At issue is whether the Fourth Amendment precludes the government's warrantless review of 25 of the 26 screenshots from that video chat that Omegle captured and reported to law enforcement, and whether the warrant later obtained based on those 26 screenshots was legally sufficient to support the search of Clark's home.

1. It is helpful to understand how Omegle works. Omegle is a website that randomly connects one anonymous user with another anonymous user for one-on-one video and text chats. Doc. 34 at 3;[1] Doc. 29 at 2. Video chats are conducted peer-to-peer via users' web browsers. Doc. 34 at 3.

Omegle does not require users to create a profile, register, or provide any identifying information. Doc. 34 at 3; United States v. Wilbert, No. 16-CR-6084, 2018 WL 6729659, at *2 (W.D.N.Y. Aug. 20, 2018). The only thing that Omegle captures from each user is their respective IP address. United States v. DiTomasso, 56 F.Supp.3d 584, 588 (S.D.N.Y. 2014).[2] Upon entering the site, and before starting a chat, Omegle informs its users that all content on the platform is monitored and may be disclosed to law enforcement. Doc. 34-1 at 8; Wilbert, 2018 WL 6729659, at *2; DiTomasso, 56 F.Supp.3d at 588. It also informs users that the service requires them to share their IP address with Omegle. DiTomasso, 56 F.Supp.3d at 596.

Omegle uses a two-step process to monitor video chats for inappropriate content like CSAM. Doc. 34 at 3. First, Omegle runs software on each user's computer during a chat that captures periodic screenshots of those chats. Id. at 4; Doc. 34-1 at 5; United States v. Powell, 925 F.3d 1, 4 (1st Cir. 2018). That software applies an algorithm to review the captured screenshots and discards those that are likely to be appropriate. Doc. 34-1 at 6. The remaining images, which are more likely to be inappropriate, are sent to Omegle's human content moderators. Id. at 6-7; Doc. 29-1 at 3. Second, Omegle's content moderation team scans a large grid of the algorithm-flagged images and, based on a reviewer's training and experience, flags photos in the grid for distinct content and terms of service violations, including CSAM. Doc. 34-1 at 6-7; see Powell, 925 F.3d at 4.
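
The two-step screening described above, an automated pre-filter followed by human review of the retained images, can be illustrated with a short Python sketch. The class names, scoring function, and threshold below are assumptions for illustration only; the record does not describe Omegle's actual software.

```python
# Illustrative sketch of a two-step screenshot-moderation pipeline of the kind
# described above. All names, the scorer, and the 0.5 threshold are assumptions.
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class Screenshot:
    session_id: str          # identifies the chat the frame came from
    timestamp: float         # capture time within the session
    risk_score: float = 0.0  # filled in by the automated pre-filter


@dataclass
class ModerationQueue:
    pending: list[Screenshot] = field(default_factory=list)  # awaiting human review
    flagged: list[Screenshot] = field(default_factory=list)  # confirmed violations

    def prefilter(self, shots: list[Screenshot],
                  score: Callable[[Screenshot], float],
                  threshold: float = 0.5) -> None:
        # Step 1: score each capture; discard frames that are likely appropriate
        # and keep the rest for the human moderation grid.
        for shot in shots:
            shot.risk_score = score(shot)
            if shot.risk_score >= threshold:
                self.pending.append(shot)

    def human_review(self, is_violation: Callable[[Screenshot], bool]) -> None:
        # Step 2: a moderator reviews the retained frames and flags violations.
        self.flagged = [s for s in self.pending if is_violation(s)]


if __name__ == "__main__":
    shots = [Screenshot("session-1", t) for t in range(26)]
    queue = ModerationQueue()
    # Placeholder scorer and reviewer stand in for the real classifier and moderator.
    queue.prefilter(shots, score=lambda s: 0.9 if s.timestamp % 2 == 0 else 0.1)
    queue.human_review(is_violation=lambda s: s.timestamp == 0)
    print(len(queue.pending), "sent to moderators;", len(queue.flagged), "flagged")
```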

When an Omegle content moderator flags a screenshot as containing CSAM, Omegle reports the screenshot to the National Center for Missing & Exploited Children (NCMEC). Doc. 34-1 at 8; Doc. 29-1 at 3. “Multiple files may be attached” to the report, in which case “the first file is the one which was specifically flagged.” Doc. 29-1 at 3. The rest of the attached files come from “the same IP address and/or ID cookie” and were “in the moderation system at the time the first file was flagged.” Id.
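
The bundling rule quoted above (the flagged file first, followed by other files from the same IP address and/or ID cookie that were in the moderation system) can likewise be sketched as follows. The field names and the build_report function are hypothetical and are not drawn from Omegle's or NCMEC's actual systems.

```python
# Hypothetical sketch of the attachment-bundling rule described in the report:
# the specifically flagged file is listed first, followed by any other files
# sharing the sender's IP address and/or ID cookie.
from dataclasses import dataclass


@dataclass
class ModeratedFile:
    file_id: str
    ip_address: str
    id_cookie: str


def build_report(flagged: ModeratedFile,
                 in_moderation: list[ModeratedFile]) -> dict:
    related = [f for f in in_moderation
               if f is not flagged
               and (f.ip_address == flagged.ip_address
                    or f.id_cookie == flagged.id_cookie)]
    return {
        "ip_address": flagged.ip_address,
        "attachments": [flagged.file_id] + [f.file_id for f in related],
    }


# Example: one other file from the same IP address was in moderation when the
# first file was flagged, so both are attached and the flagged file comes first.
flagged = ModeratedFile("file-1", "203.0.113.7", "cookie-a")
other = ModeratedFile("file-2", "203.0.113.7", "cookie-b")
print(build_report(flagged, [flagged, other]))
# -> {'ip_address': '203.0.113.7', 'attachments': ['file-1', 'file-2']}
```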

NCMEC has “two primary authorizing statutes-18 U.S.C. § 2258A and 42 U.S.C. § 5773(b)-[which] mandate its collaboration with federal (as well as state and local) law enforcement.” United States v. Ackerman, 831 F.3d 1292, 1296 (10th Cir. 2016). One of those duties is “to operate the CyberTipline as a means of combating Internet child sexual exploitation.” Id. (citing 42 U.S.C. § 5773(b)). “Electronic communication service providers” with “actual knowledge” of a child pornography violation must report to NCMEC's CyberTipline. 18 U.S.C. § 2258A(a); 18 U.S.C. § 2258E(6); Ackerman, 831 F.3d at 1296. NCMEC must then forward that report to law enforcement. Ackerman, 831 F.3d at 1296 (citing 18 U.S.C. § 2258A(c)).

2. The video chat at issue in this case occurred in May 2019. Doc. 29 at 2. Omegle's monitoring system detected apparent inappropriate content during a video chat session. Id. In that session, a user sent various CSAM videos and images to another user. Doc. 34 at 5. That same day, Omegle filed a CyberTipline Report with NCMEC. Doc. 34 at 5; see Doc. 29-1.

Omegle's report to NCMEC included the sender's IP address and 26 screenshots of the video chat session. Doc. 29 at 2; Doc. 34 at 6. Based on the timestamps and testimony at the hearing, all of the files appear to be from the same chat session. Doc. 29-1. Each of those images had been flagged by the algorithm and then sent to content moderators as likely containing inappropriate content. Based on Omegle's report, one of those 26 images was specifically marked as having been reviewed by a content moderator to confirm the presence of CSAM. Doc. 29 at 5-6. NCMEC confirmed that the screenshot Omegle flagged contained CSAM and traced the IP address to Kansas. Doc. 34 at 6. Based on that location, NCMEC forwarded its report to the Kansas Internet Crimes Against Children (ICAC) taskforce in July 2019. Id.

Wichita Police Detective Jennifer Wright, assigned to the Kansas ICAC taskforce, reviewed all 26 screenshot files and confirmed that each contained CSAM. Doc. 34 at 7; Doc. 29-4 at 3. She also determined, using publicly available information, that the internet service provider associated with the IP address was Midcontinent Communications (Midco). Doc. 34 at 7; Doc. 29 at 7-8.

In August 2019, Detective Wright applied for a warrant to search Midco's records for the subscriber information associated with that IP address. Doc. 34 at 7. The same day, Midco told Detective Wright that Lisa Lee was the subscriber with a listed address in Lawrence, Kansas. Id. Detective Wright then transferred the case to Lawrence Police Officer Lindsay Bishop based on the location.[3] Id.

Officer Bishop reviewed the CyberTipline Report, Detective Wright's report, and all of the screenshots Omegle captured. Doc. 29 at 9; Doc. 29-2 at 2; Doc. 34 at 7. Officer Bishop then used open-source records to identify three adults, including Clark, as residents of the address. Doc. 34 at 7. Officers surveilled the residence and, among other things, confirmed the Wi-Fi was not publicly accessible. Id. at 7-8.

After consulting with a prosecutor at the Douglas County District Attorney's Office, Officer Bishop applied for a search warrant of the residence on January 23, 2020. Doc. 34 at 8. Officer Bishop's affidavit for the warrant stated that she viewed Omegle's...
