Bad people make good case law. That’s just how our criminal justice system works. And so it is here in this decision, which flows from criminal charges that, in turn, flow from proactive efforts meant to thwart the sharing of child sexual abuse material.
In this case, Ryan Maher was convicted of CSAM possession. Having been informed of the chain of events leading to his arrest, Maher challenged the warrant used by law enforcement, claiming it was tainted by previous unconstitutional intrusions.
Like almost every service provider, Google scans files in users’ emails for hash values that match known CSAM images. It passes those matches on to NCMEC (the National Center for Missing and Exploited Children), which then decides whether or not to forward the information to law enforcement. Google can also pass this information along on its own, something it tells email users it might do if it detects illegal content.
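For readers who want a concrete picture of what that kind of algorithmic matching involves, here is a minimal Python sketch. Everything in it is an illustrative assumption: the hash set, the function names, and the use of plain SHA-256 digests are hypothetical, and Google’s actual pipeline matches against non-public hash lists (and, reportedly, perceptual hashes for near-duplicates) whose details aren’t disclosed.

```python
import hashlib

# Hypothetical hash set, for illustration only. Real provider pipelines
# match against non-public lists of hash values for known CSAM images,
# and often use perceptual hashes rather than plain cryptographic digests.
KNOWN_CSAM_HASHES = {
    "0" * 64,  # placeholder digest, not a real entry
}

def sha256_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def should_report(attachment: bytes) -> bool:
    """Flag an attachment when its digest matches a known hash value.

    Nothing here ever "looks at" the image. The only thing learned is
    whether two digests are identical, which is the full extent of the
    private search described in the Second Circuit's opinion.
    """
    return sha256_digest(attachment) in KNOWN_CSAM_HASHES
```

The detail the court fastens on follows directly from this mechanic: a hash match reveals only that two digests are identical. Learning what the flagged file actually depicts requires opening it, and that is the step investigators took without first getting a warrant.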
That’s what happened here, as the Second Circuit recounts at the opening of its decision [PDF]. But the government overstepped when it decided to perform a further search without acquiring a warrant. (h/t Volokh Conspiracy)
No one at Google visually examined the contents of the Maher file before reporting it to the National Center for Missing and Exploited Children (the “NCMEC”) as “apparent child pornography.” Rather, that report was based on a computer-conducted algorithmic search of the Maher file, which identified a match between the hash value for the image contained in the Maher file (the “Maher file image”) and the hash value of an image (the “original file image”) that Google had earlier located in another file (the “original file”). Thus, when law enforcement authorities visually examined the contents of the Maher file, they went beyond the scope of Google’s private algorithmic search in that they learned more than the hash value for the Maher file image; they learned exactly what was depicted in that image.
This extra step, taken prior to obtaining a warrant, meant the government couldn’t claim it was just the innocent beneficiary of a private search. Google never actually viewed the image flagged by the hash match. No one viewed it at all until investigators opened the file to verify its contents.
As the court goes on to note, the original tip might have provided probable cause, but New York State Police investigators should have used that probable cause to obtain a warrant, rather than deciding they didn’t need one until after they had performed their own search.
In these circumstances, Google’s hash match may well have established probable cause for a warrant to allow police to conduct a visual examination of the Maher file. But, for reasons stated in this opinion, we conclude that neither the private search doctrine relied on by the district court nor the Google Terms of Service agreement cited by the government authorized the police to open the Maher file and to conduct such a visual examination of its contents without a warrant.
Normally, courts will just allow the “private search” assumption to carry the day and refuse to suppress the fruits of unlawful searches performed by the government. In this case, that argument falls flat, mainly because the government, for whatever reason, chose to attack the suppression motion from a rather novel angle.
The government didn’t argue there was no reasonable expectation of privacy in the contents of Maher’s emails — an argument that would have persuaded no judge anywhere, even if Maher disclaimed ownership of the CSAM image discovered in his email account. Instead, it pitched its own interpretation of the Third Party Doctrine: one that posits that because Google informs users the company might “review content” in Gmail accounts, the government is free to do pretty much the same thing without a warrant.
Not so, says the Appeals Court. What someone may agree to share with a private company in order to use a service is not a blank check for similar intrusion by the government. The court doesn’t go so far as to draw a bright line, but it says enough to indicate the existing lines are bright enough that the government should know better than to cross them.
We need not here draw any categorical conclusions about how terms of service affect a user’s expectation of privacy as against the government. On this appeal, it suffices that we conclude that Google’s particular Terms of Service—which advise that Google “may” review users’ content—did not extinguish Maher’s reasonable expectation of privacy in that content as against the government.[…]
Nor is a different conclusion compelled by the fact that Google’s Terms of Service also warn users that the company “will share personal information outside of Google if . . . reasonably necessary to[] . . . [m]eet any applicable law.” (emphasis added). As noted supra at 7 n.5, federal law requires electronic service providers such as Google to file a report with the NCMEC when they have “actual knowledge” of child pornography on their platforms. 18 U.S.C. § 2258A(a)(1)(A), (B). But the same law specifically does not require Google “affirmatively [to] search, screen, or scan” for such material. Not surprisingly then, Google does not tell users that it will engage in the sort of content review for illegality that could trigger disclosure obligations under § 2258A(a)(1)(A), (B). Rather, it tells users only that it “may” engage in such review. Indeed, in the next sentence, Google emphasizes that it “does not necessarily . . . review content,” and tells users, “please don’t assume that we do.” (emphasis added). Such qualified language is hardly a per se signal to Google users that they can have no expectation of privacy in their emails, even as against the government.
Which is a long way of stating what should be obvious: that agreeing to share things with a private third party is nowhere near the same as agreeing to share those same things with the government whenever it wants access to content or communications. And yet, that’s pretty much how the Third Party Doctrine operates. Fortunately, it has never been read so broadly as to cover private communications, but that definitely appears to be the argument the New York State Police tried to advance while pushing back against this challenge to its warrantless search.
It probably shouldn’t have bothered. The search is saved by the “good faith exception” and the conviction remains in place. But even though none of that does much for Maher, it does at least make it clear to law enforcement operating in the Second Circuit where the Third Party Doctrine ends and the Fourth Amendment begins. And that dividing line can’t simply be ignored just because, somewhere in the ToS boilerplate, users are “informed” (lol) that the company they’re entrusting their communications to may occasionally share them with the government.