Private Person Searches
So long as a warrantless search conducted by a government entity is preceded by a private person search, the government search does not implicate the Fourth Amendment, provided it does not exceed the scope of the initial private search. Google's use of its proprietary hashing procedures to identify child pornography sent by its subscribers constitutes a private person search.
Defendant Luke Noel Wilson, a San Diego resident, liked to use a scam where he would lure young women with acting and modeling aspirations into photoshoots, finding his prospective candidates on a website where the women advertised their availability. Once he got their attention, his plan was to begin the photo sessions with his targets fully clothed, progressing to partially nude, to totally nude, to “sexually explicit,” to overt pornography, sometimes with himself participating. The transition was eased through the use of alcohol and monetary payments. One such connection was an 18-year-old woman who soon introduced defendant to her younger sister, 15-year-old J.A. Defendant gradually led J.A. through the above progression, a relationship that lasted for several years until she was a young adult, continuing even after she became pregnant via a boyfriend at age 17. Not yet satisfied, or perhaps just broadening his repertoire of pornographic photographs, defendant eventually coaxed J.A. into photographing herself sexually abusing her infant daughter as well as a five-year-old cousin, paying her for the photos. Much of the exchange of photographs was accomplished via the Internet, with defendant using his Google Gmail account. J.A. eventually began to feel guilty about committing acts that she knew were wrong (i.e., abusing her daughter and cousin), telling defendant that she was done, even though she continued to sit for photoshoots of herself for him. But then she got busted by the F.B.I. and charged with felony child abuse (per P.C. § 273a(a)), and, under a plea bargain, was sentenced to 10 years of probation. Meanwhile, Google became aware that defendant was transmitting child pornography via his Gmail account. Google uses a screening process that employs a “proprietary ‘hashing’ technology” to identify apparent child sexual abuse images on its services.
This is how it works: (This is all Greek to me, so I pretty much plagiarized this from the Court’s decision itself.) Since 2008, Google has used a computerized “hashing technology” to assist in this process. It starts with trained Google employees using software to generate a “hash” value for any image file they find depicting child pornography. At least one Google employee reviews an offending child pornography image before it is assigned a unique hash value, or “digital fingerprint.” This hash value is generated by a computer algorithm and consists of a short alphanumeric sequence that is considered unique to the computer file. The resulting hash values are then added to Google’s repository of hash values. The repository therefore contains hash values, not the actual child pornography images. When a user uploads new content to Google’s services, Google automatically scans and generates hash values for the uploaded files and compares those hash values to all known hash values in the repository. If Google’s system detects a match between a hash value for uploaded content and a hash value in the repository for a file which was previously identified as containing apparent child pornography, the system generates a report to be sent to the National Center for Missing and Exploited Children (NCMEC) in the form of a “Cybertip.” NCMEC is statutorily obligated to serve as a national clearinghouse and maintain a tip line for Internet service providers to report suspected child sexual exploitation violations. (18 U.S.C. § 2258A(c)) Also by statute, NCMEC is obligated to forward those reports to federal law enforcement. It may also (as it did in this case) forward the reports to state and local law enforcement. In the process, Google may or may not open the image file for manual review to confirm it contains apparent child pornography.
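For readers who want the mechanics made concrete, the hash-and-compare process described above can be sketched in a few lines. This is only an illustrative stand-in: Google's actual hashing algorithm is proprietary, so SHA-256 (a standard cryptographic hash) is used here purely as an example, and the function and variable names are hypothetical.

```python
import hashlib

# Hypothetical repository of previously identified hash values.
# Note: only hash values are stored here, never the images themselves.
known_hashes = set()

def hash_file(data: bytes) -> str:
    """Generate a short alphanumeric 'digital fingerprint' for a file.

    SHA-256 is a stand-in for Google's proprietary algorithm; the same
    input always yields the same value, and different files are, for
    practical purposes, guaranteed to yield different values.
    """
    return hashlib.sha256(data).hexdigest()

def add_reviewed_image(data: bytes) -> None:
    """After human review confirms an offending image, its hash value
    (not the image) is added to the repository."""
    known_hashes.add(hash_file(data))

def scan_upload(data: bytes) -> bool:
    """Automatically hash newly uploaded content and compare against the
    repository; a match would trigger a Cybertip report, with or without
    any employee opening the file."""
    return hash_file(data) in known_hashes
```

The key point the sketch makes visible is that the automated comparison step operates entirely on fingerprints: no human (and no image) is involved at match time unless the provider opts to manually review the flagged file.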
In June 2015, Google’s system identified four image files, each with hash values matching values for apparent child pornography images in its repository, attached to an e-mail created by a Gmail account later identified as belonging to defendant. Google generated a Cybertip report to NCMEC identifying and forwarding the four image attachments. The report included only the four image files, not the e-mail body text or any other information specific to the e-mail. Google classified the images, using a common categorization matrix, as “A1,” indicating they depicted prepubescent minors engaged in sex acts. The report reflected that no Google employee manually reviewed these specific files between the time they were flagged by Google’s hashing technology and the time they were sent to NCMEC. Upon determining that the Gmail account at issue was associated with San Diego, NCMEC forwarded the report on to the “San Diego Internet Crimes Against Children” (ICAC) task force. This task force is composed of law enforcement personnel from multiple agencies, including the San Diego Police Department (SDPD). When the ICAC received the report, an administrative assistant with SDPD printed the report with the attached electronic images and provided them to two ICAC investigators. These investigators opened the files, viewed the images, and determined that the images warranted an investigation. An ICAC sergeant conducted his own review and agreed with that recommendation. Using the information contained in the report and based on his own review of the images, ICAC Investigator William Thompson obtained a search warrant requiring Google to provide all content and user information associated with the identified Gmail address. The warrant resulted in the discovery of defendant’s e-mails offering to pay J.A. to molest and exploit children. Thompson also reviewed e-mails in which defendant distributed child pornography to others.
This led to another search warrant authorizing the search of defendant’s apartment and vehicle, the execution of which resulted in the seizure of computer equipment, storage devices, and other effects, including a thumb drive containing thousands of images of child pornography. Additional images were found on devices in defendant’s apartment. Charged in state court with a whole pile of child pornography-related offenses, defendant filed a motion to suppress, which the trial court denied. He therefore went to trial, and the jury convicted him on all counts. Sentenced to an indeterminate term of 45 years to life, defendant appealed.