Private Person Searches 

CAC00014
CASE LAW

The Private Search Doctrine

Google’s Computerized Hashing Technology

Child Pornography and the Internet

RULES

A warrantless search conducted by a government entity that is preceded by a private person search does not implicate the Fourth Amendment so long as it does not exceed the scope of the initial private search.  Google’s use of its proprietary hashing procedures to identify child pornography sent by its subscribers constitutes a private person search.

FACTS

Defendant Luke Noel Wilson, a San Diego resident, ran a scam in which he lured young women with acting and modeling aspirations into photoshoots, finding his prospective candidates on a website where the women advertised their availability.  Once he got their attention, his plan was to begin the photo sessions with his targets fully clothed, then progress to partially nude, to totally nude, to “sexually explicit,” to overt pornography, sometimes with himself participating.  The transition was eased through the use of alcohol and monetary payments.  One such connection was an 18-year-old woman who soon introduced defendant to her younger sister, 15-year-old J.A.  Defendant gradually led J.A. through the above progression, a relationship that lasted for several years until she was a young adult, and even after she became pregnant via a boyfriend when she was 17.  Not yet satisfied, or perhaps just broadening his repertoire of pornographic photographs, defendant eventually coaxed J.A. into photographing herself sexually abusing her infant daughter as well as a five-year-old cousin, paying her for the photos.  Much of the exchange of photographs was accomplished via the Internet, with defendant using his Google Gmail account.  J.A. eventually began to feel guilty about committing acts that she knew were wrong (i.e., abusing her daughter and cousin), telling defendant that she was done, even though she continued to sit for photoshoots of herself for him.  But then she got busted by the F.B.I. and charged with felony child abuse (per P.C. § 273a(a)), and, under a plea bargain, was sentenced to 10 years of probation.

Meanwhile, Google became aware that defendant was transmitting child pornography via his Gmail account.  Google uses a screening process that employs a “proprietary ‘hashing’ technology” to identify apparent child sexual abuse images on its services.
This is how it works:  (This is all Greek to me, so I pretty much plagiarized this from the Court’s decision itself.)  Since 2008, Google has used a computerized “hashing technology” to assist in this process.  It starts with trained Google employees using software to generate a “hash” value for any image file they find depicting child pornography.  At least one Google employee reviews an offending child pornography image before it is assigned a unique hash value, or “digital fingerprint,” which is then stored in Google’s repository of hash values.  This hash value is generated by a computer algorithm and consists of a short alphanumeric sequence that is considered unique to the computer file.  The repository therefore contains hash values, not the actual child pornography images.  When a user uploads new content to Google’s services, Google automatically scans and generates hash values for the uploaded files and compares those hash values to all known hash values in the repository.  If Google’s system detects a match between a hash value for uploaded content and a hash value in the repository for a file previously identified as containing apparent child pornography, the system generates a report, in the form of a “Cybertip,” to be sent to the National Center for Missing and Exploited Children (NCMEC).  NCMEC is statutorily obligated to serve as a national clearinghouse and maintain a tip line for Internet service providers to report suspected child sexual exploitation violations.  (18 U.S.C. § 2258A(c).)  Also by statute, NCMEC is obligated to forward those reports to federal law enforcement.  It may also (as it did in this case) forward the reports to state and local law enforcement.  In the process, Google may or may not open the image file for manual review to confirm it contains apparent child pornography.
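For the technically curious, the process the Court describes can be sketched in a few lines of code.  This is only a simplified illustration: Google’s actual hashing technology is proprietary, so the SHA-256 algorithm used here is an illustrative stand-in, and all function names are hypothetical.

```python
# Simplified sketch of hash-based image flagging, assuming an
# exact-match scheme.  Google's real system is proprietary and
# may use more sophisticated (e.g., perceptual) hashing.
import hashlib

# The repository stores only hash values, never the images themselves.
known_hashes: set[str] = set()

def hash_file(data: bytes) -> str:
    """Generate a short alphanumeric 'digital fingerprint' for a file."""
    return hashlib.sha256(data).hexdigest()

def add_reviewed_image(data: bytes) -> None:
    """After a human reviewer confirms an image, store only its hash."""
    known_hashes.add(hash_file(data))

def scan_upload(data: bytes) -> bool:
    """Hash an uploaded file and compare against the repository.
    A match would trigger a Cybertip report, without any person
    needing to re-open the file."""
    return hash_file(data) in known_hashes

# Example: a previously reviewed file is flagged; a new file is not.
add_reviewed_image(b"previously reviewed offending file")
assert scan_upload(b"previously reviewed offending file")  # match -> report
assert not scan_upload(b"some innocuous file")             # no match
```

Note that an exact-match hash like this only catches bit-identical copies of a previously reviewed file; it says nothing about files the reviewers have never seen, which is consistent with the Court’s observation that the repository contains only hash values of previously identified images.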
In June 2015, Google’s system identified four image files, each with hash values matching values for apparent child pornography images in its repository, attached to an e-mail created by a Gmail account later identified as belonging to defendant.  Google generated a Cybertip report to NCMEC identifying and forwarding the four image attachments.  The report included only the four image files, not the e-mail body text or any other information specific to the e-mail.  Google classified the images, using a common categorization matrix, as “A1,” indicating they depicted prepubescent minors engaged in sex acts.  The report reflected that no Google employee manually reviewed these specific files after they were flagged by Google’s hashing technology and before they were sent to NCMEC.  Upon determining that the Gmail account at issue was in San Diego, NCMEC forwarded the report on to the San Diego Internet Crimes Against Children (ICAC) task force.  This task force is composed of law enforcement personnel from multiple agencies, including the San Diego Police Department (SDPD).  When the ICAC received the report, an administrative assistant with SDPD printed the report with the attached electronic images and provided them to two ICAC investigators.  These investigators opened the files, viewed the images, and determined that the images warranted an investigation.  An ICAC sergeant conducted his own review and agreed with that recommendation.  Using the information contained in the report and based on his own review of the images, ICAC Investigator William Thompson obtained a search warrant requiring Google to provide all content and user information associated with the identified Gmail address.  The warrant resulted in the discovery of defendant’s e-mails offering to pay J.A. to molest and exploit children.  Thompson also reviewed e-mails in which defendant distributed child pornography to others.
This led to another search warrant authorizing the search of defendant’s apartment and vehicle, the execution of which resulted in the seizure of computer equipment, storage devices, and other effects, as well as a thumb drive containing thousands of images of child pornography.  Additional images were found on devices in defendant’s apartment.  Charged in state court with a whole pile of child pornography-related offenses, defendant filed a motion to suppress, which the trial court denied.  He therefore went to trial, the jury convicting him on all counts.  Sentenced to an indeterminate term of 45 years to life, defendant appealed.

HELD

The Fourth District Court of Appeal (Div. 1) affirmed defendant’s conviction.  Among the many issues decided on appeal was the lawfulness of the search of defendant’s thumb drive, computers, etc., defendant arguing that it was all the product of an illegal warrantless search of his e-mail account conducted by Google, with law enforcement using that allegedly illegally seized information as a basis for the warrants obtained by ICAC Investigator Thompson.  Defendant also argued that when Investigator Thompson initially looked at his (defendant’s) photographs without a search warrant, he did so illegally.  At defendant’s motion to suppress, Thompson testified about his investigation, acknowledging that neither Google nor NCMEC had opened the image files attached to defendant’s e-mail, and that he himself did not obtain a search warrant before first viewing the attachments.  The general rule is that searches and seizures are presumed to be illegal (i.e., in violation of the Fourth Amendment) absent a search warrant or some other legally recognized exception.  “Private searches” (i.e., searches conducted by private persons) are exempt from this rule; the suppression requirements of the Fourth Amendment do not apply to searches by private persons.  Taking it a step further, the United States Supreme Court has held that “if a government search is preceded by a private search, the government search (also) does not implicate the Fourth Amendment as long as it does not exceed the scope of the initial private search.”  (Italics added; United States v. Jacobsen (1984) 466 U.S. 109, 115–117.)  In other words, so long as law enforcement, in making a warrantless search, does not view anything a private person or entity hasn’t already seen, the Fourth Amendment is not implicated.
Typically, this involves some private party opening a container and viewing contraband, then notifying law enforcement of his or her discovery, with law enforcement then repeating what the private person has already done.  Jacobsen covers this type of situation, holding that this is all legal, having been initiated by a “private person search.”  In this case, the Court held that Google’s actions of scanning user content, assigning hash values to that content, comparing user content to a repository of hash values, flagging offending images with hash values that match previously reviewed child pornography images, and then sending the apparent child pornography to NCMEC, is substantially the same, constituting in effect private action that was not performed at the direction of the government.  Therefore, Investigator Thompson did not violate the Fourth Amendment when he opened and viewed the four photographs that Google (as, in effect, a private person) had flagged as pornography via its hashing procedure.  Because Thompson viewed only what Google had already flagged as pornography, his viewing of the images and his subsequent search warrants were lawful, as was the use of the child pornography eventually discovered on defendant’s computers and thumb drive as evidence against him at trial.

AUTHOR NOTES

The Court’s written analysis is a lot more complicated than my summary, above.  And it took me (not being the most brilliant person when it comes to computer technology) several readings to make sense of it all so that I could simplify it enough to get it down to its basics.  If you’re still confused, just know that when Google sends you (“you” being law enforcement) something that it says its “hashing” system has identified as pornography, you can take a look at what Google sent you without a warrant to see if you agree, and if so, then get your warrants for the identified suspect’s computers.  That’s really all the above legal mumbo jumbo and computer gobbledygook says. 
