The Private Search Doctrine and Computerized Hashing Technology

CAC00054
Rules

A law enforcement officer's warrantless opening of a child pornography suspect's e-mail attachments is an illegal search.  The fact that an electronic communication service provider has already concluded, via an electronic hashing system and without actually viewing the attachments, that an e-mail attachment contains child pornography does not make the subsequent law enforcement search lawful.  In such a case, the "private search doctrine" is inapplicable.

Facts

Defendant Luke Noel Wilson was into child pornography, as evidenced by his habit of uploading images of young girls in sexually compromising situations into his Google Gmail account as e-mail attachments.  Google discovered four instances of child pornography in defendant's e-mails, resulting in his federal prosecution in this case.  (He was also prosecuted and convicted in state court; see People v. Wilson (Oct. 21, 2020) 56 Cal.App.5th 128 [review denied by the California Supreme Court at 2021 Cal. LEXIS 485; Jan. 20, 2021], and "Note," below.)

Although electronic communication service providers (aka "Internet Service Providers," or "ISPs"), such as Google, are not required by law to "affirmatively search, screen, or scan" for violations of federal child pornography laws, many do so anyway as a means of "reduc(ing) . . . and prevent(ing) the online sexual exploitation of children."  Should an ISP discover child pornography sent through its system, however, that ISP is directed by federal law, "as soon as reasonably possible after obtaining actual knowledge" of "any facts or circumstances from which there is an apparent violation of . . . child pornography [statutes]," to "mak[e] a report of such facts or circumstances" to the National Center for Missing and Exploited Children ("NCMEC").  (18 U.S.C. § 2258A(a).)  NCMEC is then required to forward that information (in what is known as a "CyberTip") to the appropriate law enforcement agency for possible investigation.  (18 U.S.C. §§ 2258A(a)(1)(B)(ii), (c).)  This is what happened in this case.

Google has a computerized screening system, described below, that was triggered when defendant uploaded four images of apparent child pornography into his Gmail account.  No one at Google had to open or view defendant's e-mails in order to determine that he was receiving child pornography.  This is because Google has developed a "proprietary hashing technology" that identifies apparent child pornography without anyone having to individually open or look at a customer's actual e-mail.  This, in a nutshell, is how it works:  First, a team of Google employees is trained by experts on the federal statutory definition of child pornography and how to recognize it.  As part of their training, these employees have viewed actual images of child pornography, with each confirmed image assigned a specific "hash value" that is added to a repository of hashes electronically stored by Google.  A hash value is generated by a computerized algorithm and consists of a short alphanumeric sequence considered unique to the computer file.  Google "apparently" (the record was not clear) stores only the hash values of images identified as apparent child pornography, not the actual images.  The various electronic service providers, including Google, have also established four categories of child pornography: "A1" for a sex act involving a prepubescent minor; "A2" for a lascivious exhibition involving a prepubescent minor; "B1" for a sex act involving a pubescent minor; and "B2" for a lascivious exhibition involving a pubescent minor.  Should someone, such as defendant, then upload child pornography, Google's system automatically compares the previously stored hashes against the hash values generated from the customer's uploaded attachments.  If that comparison determines that an uploaded attachment contains child pornography, Google is automatically notified and the matching category is identified.
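For illustration only, the following Python sketch models the hash-matching step just described.  Google's actual hashing algorithm is proprietary and was not disclosed in the record; a generic cryptographic hash (SHA-256) stands in for it here, and the repository, function names, and sample data are all hypothetical.

    import hashlib

    # Hypothetical stand-in for Google's proprietary hashing algorithm:
    # a hash value is a short alphanumeric sequence treated as unique
    # to a given computer file.
    def hash_value(file_bytes: bytes) -> str:
        return hashlib.sha256(file_bytes).hexdigest()

    # Repository of hash values (not the images themselves), built as
    # trained reviewers classify confirmed images into the four
    # categories described above ("A1", "A2", "B1", "B2").
    repository: dict[str, str] = {}

    def add_reviewed_image(image_bytes: bytes, category: str) -> None:
        repository[hash_value(image_bytes)] = category

    def screen_upload(attachment_bytes: bytes) -> str | None:
        # Compare the upload's hash against the stored hashes.  No one
        # opens or views the attachment; a match returns its category.
        return repository.get(hash_value(attachment_bytes))

    # Hypothetical usage: an image reviewed and classified once ...
    add_reviewed_image(b"example image bytes", "A1")
    # ... is automatically flagged when later uploaded by anyone.
    assert screen_upload(b"example image bytes") == "A1"

Note that an exact-match hash of this kind flags only byte-identical copies of a previously reviewed file; whether Google's proprietary technology also matches altered or visually similar images is not addressed in the record.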
(The state reported decision in People v. Wilson, supra, 56 Cal.App.5th 128, contains a more detailed description of this system for detecting child pornography.  See California Legal Update, Vol. 25, #14 [Dec. 30, 2020].)  In defendant's case, no one at Google actually opened or viewed defendant's e-mail attachments; the determination that they contained child pornography was based solely upon the automated assessment that the images defendant uploaded were the same as images the previously trained Google employees had earlier viewed and classified as Category A1 child pornography (i.e., a sex act involving a prepubescent minor), as noted above.  Google then made a report to NCMEC, which, also without opening or viewing the files, forwarded the CyberTip to the San Diego Internet Crimes Against Children Task Force ("ICAC"), it having been determined that the IP address of the computer receiving the pornography was located in the San Diego region.  At the Task Force, Agent Thompson opened defendant's e-mail attachments for the first time, verifying that defendant's Gmail e-mails did in fact contain child pornography.  This was done without a search warrant.  Based upon this, Agent Thompson applied for two warrants, in which the pornographic attachments were described in detail, to search both defendant's e-mail account and his home.  A search of both resulted in the seizure of large amounts of child pornography and in the instant prosecution in federal court.  Defendant's motion to suppress the seized child pornography was denied by the federal district court trial judge.  Convicted of possession and distribution of child pornography and sentenced to 11 years in prison with 10 years of supervised release, defendant appealed.