The Private Search Doctrine and Computerized Hashing Technology
- The Private Search Doctrine
- Google’s Computerized Hashing Technology
- Child Pornography and the Internet
A law enforcement officer who, without a search warrant, opens a pornography suspect’s e-mail attachments conducts an illegal search. The fact that an electronic communication service provider has already concluded, via an electronic hashing system and without actually viewing the attachments, that they contain child pornography does not make the subsequent law enforcement search lawful. In such a case, the “private search doctrine” is inapplicable.
Defendant Luke Noel Wilson was into child pornography, as evidenced by his habit of uploading images of young girls in sexually compromising situations into his Google Gmail account as e-mail attachments. Google discovered four instances of child pornography in defendant’s e-mails, resulting in his federal prosecution in this case. (He was also prosecuted and convicted in state court; see People v. Wilson (Oct. 21, 2020) 56 Cal.App.5th 128 [review denied by the California Supreme Court at 2021 Cal. LEXIS 485; Jan. 20, 2021], and “Note,” below.)
Although electronic communication service providers (aka “Internet Service Providers,” or “ISPs”), such as Google, are not required by law to “affirmatively search, screen, or scan” for violations of federal child pornography laws, many do so anyway as a means of “reduc(ing) . . . and prevent(ing) the online sexual exploitation of children.” Should an ISP discover child pornography sent through its system, however, that ISP is directed by federal law, “as soon as reasonably possible after obtaining actual knowledge” of “any facts or circumstances from which there is an apparent violation of . . . child pornography [statutes],” to “mak[e] a report of such facts or circumstances” to the National Center for Missing and Exploited Children (“NCMEC”). (18 U.S.C. § 2258A(a).) NCMEC is then required to forward that information (in what is known as a “CyberTip”) to the appropriate law enforcement agency for possible investigation. (18 U.S.C. §§ 2258A(a)(1)(B)(ii), (c).) This is what happened in this case.
Google has a computerized screening system, as described below, that was triggered when defendant uploaded four images of apparent child pornography into his Gmail account. No one at Google had to open or view defendant’s e-mails in order to determine that he was receiving child pornography. This is because Google has developed a “proprietary hashing technology” that identifies apparent child pornography without anyone having to individually open or look at a customer’s actual e-mail. In a nutshell, this is how it works: First, a team of Google employees is trained by experts on the federal statutory definition of child pornography and how to recognize it. As part of their training, these employees have viewed actual images of child pornography, and each confirmed image is assigned a specific “hash value,” which is added to a repository of hashes electronically stored by Google. A hash value is generated by a computerized algorithm and consists of a short alphanumeric sequence that is considered unique to the computer file. Google apparently (the record was not clear on this point) stores only the hash values of images identified as apparent child pornography, not the actual images. The various electronic service providers, including Google, have also established four categories of child pornography: “A1” for a sex act involving a prepubescent minor; “A2” for a lascivious exhibition involving a prepubescent minor; “B1” for a sex act involving a pubescent minor; and “B2” for a lascivious exhibition involving a pubescent minor. Should someone, such as defendant, then upload child pornography, Google’s system automatically compares the previously stored hashes against the hashes generated from the customer’s uploaded attachments. If this hash comparison indicates that an uploaded e-mail contains child pornography, Google is automatically notified of the match and of which category it falls into. (The state reported decision in People v. Wilson, supra, 56 Cal.App.5th 128, contains a more detailed description of this system for detecting child pornography. See California Legal Update, Vol. 25, #14 [Dec. 30, 2020].)
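Google’s actual algorithm is proprietary and undisclosed; the following is only a minimal sketch of the matching step the opinion describes, using SHA-256 from Python’s standard library as a stand-in hash function and invented placeholder values. The point to notice is that nothing in the pipeline examines or reveals the image’s visual content:

```python
import hashlib
from typing import Optional

# Hypothetical repository of hash values previously assigned by trained
# reviewers, keyed to the four industry category labels described above.
# (Google's real hash function and storage format are proprietary;
# SHA-256 and these placeholder entries are stand-ins for illustration.)
KNOWN_HASHES: dict[str, str] = {
    "<hash of a reviewed A1 image>": "A1",  # sex act, prepubescent minor
    "<hash of a reviewed B2 image>": "B2",  # lascivious exhibition, pubescent minor
}

def hash_value(file_bytes: bytes) -> str:
    """Generate the short alphanumeric sequence treated as unique to the file."""
    return hashlib.sha256(file_bytes).hexdigest()

def screen_attachment(file_bytes: bytes) -> Optional[str]:
    """Compare an uploaded attachment's hash against the stored repository.

    Returns the category label ("A1", "A2", "B1", or "B2") on a match, or
    None. No image is opened or viewed; only hash strings are compared.
    """
    return KNOWN_HASHES.get(hash_value(file_bytes))
```

Because a cryptographic hash of this kind flags only bit-for-bit duplicates, a match asserts that the uploaded file is identical to one a trained employee previously viewed and classified; it tells the system nothing else about the image, which is why no one at Google needed to open defendant’s attachments.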
In defendant’s case, no one at Google actually opened or viewed defendant’s e-mail attachments; the determination that they contained child pornography was based solely upon the automated assessment that the images defendant uploaded were the same as images the previously trained Google employees had earlier viewed and classified as Category A1 child pornography (i.e., a sex act involving a prepubescent minor), as noted above. Google then made a report to NCMEC, which, also without opening or viewing the files, forwarded a CyberTip to the San Diego Internet Crimes Against Children Task Force (“ICAC”), it having been determined that the IP address of the computer receiving the pornography was located in the San Diego region. At the Task Force, Agent Thompson opened defendant’s e-mail attachments for the first time, verifying that defendant’s Gmail e-mails did in fact contain child pornography. This was done without a search warrant. Based upon this, Agent Thompson applied for two warrants, in which the pornographic attachments were described in detail, to search both defendant’s e-mail account and his home. The searches turned up large amounts of child pornography, leading to the instant federal prosecution. Defendant’s motion to suppress the seized child pornography was denied by the federal district court trial judge. Convicted of possession and distribution of child pornography and sentenced to 11 years in prison plus 10 years of supervised release, defendant appealed.
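Neither Google nor NCMEC ever opened the four files, so everything the task force received before Agent Thompson’s search was metadata. A hypothetical sketch of what such a report might carry (the opinion does not give the actual CyberTip format, and these field names are invented for illustration) helps frame the “scope” analysis that follows:

```python
from dataclasses import dataclass

@dataclass
class CyberTip:
    """Hypothetical shape of the report forwarded to law enforcement.

    Field names are illustrative only; the actual NCMEC report format
    is not described in the opinion.
    """
    provider: str              # the reporting ESP, here "Google"
    matched_hashes: list[str]  # hash values that hit the repository
    category: str              # "A1": sex act, prepubescent minor
    uploader_ip: str           # used to route the tip to a regional task force

# The four matches in this case, as the opinion describes them:
tip = CyberTip(
    provider="Google",
    matched_hashes=["<hash 1>", "<hash 2>", "<hash 3>", "<hash 4>"],
    category="A1",
    uploader_ip="<address geolocated to the San Diego region>",
)
# Conspicuously absent: the images themselves, or any description of their
# contents beyond the category label. The gap between what this record
# reveals and what opening the files reveals is the crux of the Ninth
# Circuit's holding below.
```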
The Ninth Circuit Court of Appeals reversed. The issue on appeal was whether the so-called “private search doctrine” excused Agent Thompson’s warrantless opening of defendant’s e-mails, in which he viewed defendant’s child pornography for the first time. The lower federal district court, in denying defendant’s motion to suppress, ruled that Agent Thompson acted lawfully. (See United States v. Wilson (June 26, 2017) 2017 U.S. Dist. LEXIS 98432.) In a parallel prosecution, California’s Fourth District Court of Appeal also ruled that the private search doctrine excused Agent Thompson’s warrantless viewing of defendant’s e-mails. (People v. Wilson, supra.) Other federal circuits have likewise found the private search doctrine to apply in similar situations, justifying an investigator’s later warrantless opening of the files. (See United States v. Ringland (8th Cir. 2020) 966 F.3d 731; United States v. Reddick (5th Cir. 2018) 900 F.3d 636; and United States v. Miller (6th Cir. 2020) 982 F.3d 412.) The Ninth Circuit here disagreed with all of the above, citing one case that agreed with its analysis: United States v. Ackerman (10th Cir. 2016) 831 F.3d 1292.
Under the “private search doctrine,” when a private party (i.e., someone not in law enforcement) opens a container and views material (typically, contraband) under circumstances that would have constituted a Fourth Amendment search had the private party been a law enforcement officer, and that container is then passed on to the government, which merely replicates the opening and viewing already performed by the private party, the Fourth Amendment is not violated. So long as law enforcement restricts its inspection and viewing to what has already been seen by the private party (i.e., “does not exceed the scope” of the prior viewing), the action comes within the private search doctrine and is lawful. (See Coolidge v. New Hampshire (1971) 403 U.S. 443; and United States v. Jacobsen (1984) 466 U.S. 109.)
The question here was whether Google’s (the private party’s) hash identification system, operating without anyone from Google actually opening and viewing the child pornography contained in defendant’s e-mail attachments, was sufficient to trigger the private search doctrine, thus allowing a law enforcement officer to later open those same e-mails and view their contents without a search warrant. The Ninth Circuit, noting that the government bears the burden of proof on this issue, ruled that it was not: Agent Thompson’s warrantless opening of defendant’s e-mail attachments exceeded the scope of Google’s private computerized hash-system search (such as it was) of those e-mails. In rejecting the Government’s argument that defendant had lost any expectation of privacy when he received and sent pornography via e-mail, the Court ruled as follows: “First, the government search exceeded the scope of the antecedent private search because it (i.e., Agent Thompson’s search) allowed the government to learn new, critical information that it used first to obtain a warrant and then to prosecute Wilson.
Second, the government search also expanded the scope of the antecedent private search because the government agent viewed Wilson's email attachments even though no Google employee—or other person—had done so, thereby exceeding any earlier privacy intrusion.” Finding a “large gap” between what Google and NCMEC knew about the contents of defendant’s e-mails and the specific detail Agent Thompson was able to learn upon opening them, the Court held the private search exception inapplicable to this situation. Defendant’s Fourth Amendment rights were therefore violated, and his motion to suppress should have been granted.