[Editor’s Note: This article has been republished with permission. It was originally published November 2, 2023 on the eDiscovery Assistant Blog]
In Episode 122, CEO and Founder of eDiscovery Assistant Kelly Twigger discusses the first time a court has considered the constitutionality of a reverse keyword warrant to search the IP addresses of accounts that had run a search related to criminal activity in State v. Seymour.
Background
Each week on our Case of the Week series, I choose a recent decision in ediscovery and talk to you about the practical implications of that judge’s ruling. This week’s decision covers an issue of first impression — whether a reverse keyword warrant to search Google’s database for IP addresses of accounts that had searched a single address within 15 days before an alleged arson violated the Fourth Amendment and the Colorado Constitution.
This is our third state court decision in a row, this time from the Colorado Supreme Court, my home state. It's another example of how ediscovery case law is increasing in the state courts.
We often see new and novel issues with ESI stemming from criminal cases — the first standard on authenticating social media evidence came from a criminal matter. The decision we're discussing today is a bit different in that the information has to come from Google rather than from a party in a civil action, but it is very instructive on the value of search history and some of the issues that we will start to see on the civil side.
This decision comes to us in State v. Seymour from the Colorado Supreme Court, and it is dated October 16, 2023. The decision is authored by Colorado Supreme Court Justice William Hood. The issues here are search and seizure, scope of warrant, criminal, and privacy.
Facts
This decision stems from an underlying criminal prosecution against the defendant, Seymour, for arson. Two months after a fire at a Denver home that killed five people, an exhaustive investigation by the Denver Police Department (DPD) had not generated any leads. Footage from a neighbor’s home security surveillance showed three masked individuals with what appeared to be a gas canister at the house when the fire started, leading investigators to believe that the arsonist had targeted this address.
In pursuing the theory of the targeted address, the investigators inferred that the perpetrators would have researched the property before burning it down, or at the very least, looked up directions to get there. Investigators sought and obtained a series of reverse keyword warrants, which required Google to identify users who had searched the address within a specified period.
The first warrant requested a list of any users who had searched one of nine variations of the address in question in the 15 days before the fire. Via affidavit, a Google policy specialist explained that, to protect its users' legal and privacy interests, Google uses a staged process to respond to warrants. The company first provides law enforcement with a “de-identified or anonymized list of responsive searches.” Then, if law enforcement concludes that any of the anonymous results are relevant to the investigation, Google will identify the users if court-ordered to do so. Because the first warrant here ordered Google to immediately produce identifying information, complying with the warrant would have violated Google’s policies, so the company refused to produce the requested records and that first warrant was withdrawn.
On its second try, the DPD obtained a warrant seeking the same list of users. Instead of requesting personally identifying information like names and birthdates, the DPD requested two days of location data for each account. Google again refused to comply because of privacy concerns, and investigators again withdrew the warrant.
The third time was indeed a charm here. The DPD then obtained a warrant requesting that Google produce an anonymized list containing “an Identifier assigned by Google representing each device along with the associated IP address for any Google accounts that searched the address during the 15-day period before the fire while using Google services.”
Because this warrant complied with Google’s policies, Google complied and produced a spreadsheet of 61 searches made by eight accounts. Five of the eight accounts had Colorado-based IP addresses. The DPD then successfully retrieved the names and other personal information associated with those five Colorado accounts through another warrant, with which Google complied without objection. One individual was eliminated as a suspect because she was related to the alleged victims. The DPD also sent warrants to social media platforms and internet service providers to obtain information about the remaining four people. The defendant was one of three remaining suspects and was eventually charged with numerous felonies, including multiple counts of first-degree murder, arson, and burglary.
At trial, Seymour moved to suppress all evidence resulting from the search executed under the reverse keyword warrant, arguing that it was unconstitutional because it was not adequately particularized and lacked individualized probable cause. The trial court denied Seymour’s motion, and the matter then came before the Colorado Supreme Court for review.
Analysis
Since we are on appeal, the standard of review here is critical, and whether evidence should be suppressed is a mixed question of law and fact. The Court starts its analysis by looking at the constitutionally protected search and seizure interests to determine whether Seymour had standing to assert his claim here. To establish that standing, Seymour must have a reasonable expectation of privacy in the place to be searched or a possessory interest in the property seized. The courts use a two-prong test to determine whether a claimed privacy interest warrants constitutional protection: first, whether the individual exhibited an actual expectation of privacy, and second, whether that expectation is one that society is prepared to recognize as reasonable. The second, more objective prong typically dictates whether constitutional protections apply.
The Court noted here that an individual’s Google search history “holds for many Americans the privacy of life. A user’s online search history may indicate an individual’s interest in a specific religion or research into sensitive medical conditions, information that could reveal intimate details about an individual’s private life. The contents of an individual’s internet search reveal as much, if not, in many instances, more about one’s private life than other records in which we previously held individuals have a reasonable expectation of privacy.” But then the Court addressed the fact that Google’s multistep process here first yielded only nominally anonymized IP addresses, which identified computer devices or potentially networks from which users initiated the relevant searches.
Google provided law enforcement no names or other personally identifying information until investigators obtained a subsequent warrant asking Google to link the IP addresses to the names of the Colorado users. According to the Court, even if it determines that individuals have a privacy interest in their internet search histories, the Court still has to consider whether that privacy interest encompasses those histories when viewed simply in connection with an anonymized IP address. Essentially, the Court is asking: if there’s no specific user tied to the information, is the privacy interest still the same?
The Court went through and analyzed multiple federal cases that have held that a user does not have a privacy interest in data that is shared with third parties, but noted that this is where the Colorado Constitution differs from the Fourth Amendment to the US Constitution. The Colorado Constitution has been interpreted to provide greater protections than the Fourth Amendment, and the Court has expressly rejected the third-party doctrine. Therefore, according to the Court, Seymour had a reasonable expectation of privacy in his Google search history under Article II, Section 7 of the Colorado Constitution.
The Court next turned to whether Seymour had a possessory interest in his search history, noting that a seizure occurs when the government meaningfully interferes with an individual’s possessory interests in property. The Court here points to case law and to Google’s licensing agreement that makes clear that Google does not own its users’ content. Instead, Google states that users own their own content on Google, which, according to testimony from a Google policy specialist, includes those search histories.
Law enforcement’s copying of Seymour’s Google search history meaningfully interfered with his possessory interest in that data and constituted a seizure subject to constitutional protection, according to the Court. Coupled with the right to privacy, the Court found that Seymour had standing to challenge the warrant. The Court next agreed with the trial court that the warrant was adequately particularized and sought a very precise and narrow subset of information, before turning to the issue of probable cause. We’ve already established that Seymour has standing to challenge the warrant here; now we have to look at whether the warrant was sufficiently particularized and supported by probable cause.
We’re at probable cause, and the Court’s analysis here is much less weighty than its analysis of the other issues. The Court essentially punted and found that it need not resolve whether a search of Google’s database requires probable cause as to a single account holder, meaning whether the DPD needed a reason to search Seymour’s history on the database in particular.
This is important because this is really the key principle at issue here: does a search of Google’s database require probable cause as to a single account holder? Instead of resolving that issue, the Court decided that it did not need to, because the evidence would still be admissible under the good faith exception to the exclusionary rule.
The exclusionary rule generally bars the use at trial of evidence obtained through an unlawful search and seizure. Under the good faith exception, if officers had a reasonable, good faith belief that they were acting according to legal authority, such as by relying on a search warrant that is later found to have been legally defective, the illegally seized evidence remains admissible.
The Court found a basis for the good faith exception — that until today’s ruling,
“[N]o court had established that individuals have a constitutionally protected privacy interest in their Google search history. And absent precedent, the DPD reasonably believed that it only needed to show a nexus between the alleged crime and a subset of Google’s database. In an eight-page affidavit, which the reviewing judge called one of the most detailed he had seen in a long time, the affiant explained why, based on his training and experience, he believed the fire had been planned, why he believed the perpetrators searched the address online beforehand, and why the record of the search could be found in Google’s database. To the extent investigators reasonably believed it was legally necessary, the affidavit demonstrated the requisite minimum nexus between the alleged arson and Google’s database.”
In conclusion, the Court agreed with the trial court that the deterrent purpose of the exclusionary rule was not served by suppressing the warrant, several iterations of which were approved by two judges who “authorized a relatively new and previously unchallenged investigative technique.”
This decision from the Court allowing the reverse keyword warrant to search Google’s database includes both a concurring opinion and a dissent, and both are worth reading. Essentially, both the trial court and the Supreme Court are saying that the good faith exception applies to this warrant and that there is no basis to suppress the evidence. But both the dissent and the concurrence argue that probable cause did not exist and that the Court’s carve-out here “turns Fourth Amendment jurisprudence on its head.”
But the concurrence by Justice Berkenkotter agrees with the majority that the good faith exception applies because there was no well-established law concerning the constitutionality of keyword searches, and therefore the DPD had no reason to know “that individuals have a constitutionally protected interest in their Google search history or that the department might have needed to demonstrate a connection between the alleged crime and Seymour’s individual Google account.”
The dissent, which is written by Justice Marquez, argues that the reverse keyword warrants “permit exactly what the Fourth Amendment forbids” by authorizing law enforcement to rummage through the private search histories of a billion individuals for potential evidence of criminal activity.
Justice Marquez lays out three or four different points that she feels are pertinent to the Fourth Amendment discussion, but she doesn’t really address the Court’s use of the good faith exception. What she says here is that the majority fails to acknowledge that the search here is a scan of Google’s entire database of user records, which means that the search violated the privacy and possessory interests of every other Google user. Because those interests exist, law enforcement must have probable cause to scan an individual user’s search history. But here, according to Justice Marquez, the DPD used a reverse keyword search precisely because it lacked probable cause with respect to any individual Google user associated with an IP address. The dissent also goes into a discussion that I really recommend you read, because I’m sure this decision is only the tip of the iceberg on this issue and it will impact the availability of search history on the civil side as well.
Takeaways
The reverse keyword warrants at issue in Seymour should be concerning to everyone listening and to those who aren’t. The volume of information maintained about every single one of us by companies like Google, Meta, and other social media platforms is extensive. Every click, every search, every like, how long you spend watching a video, whether you click on an ad: all of that information is being tracked. One of the reasons the US government is trying to ban TikTok is the volume of information being tracked about children, who are predominant users of the TikTok platform. This is something that’s widely recognized, and it’s really becoming an issue in discovery.
This decision from the Colorado Supreme Court, which essentially sidesteps Fourth Amendment law on the ground that no court had yet ruled on the constitutionality of reverse keyword warrants so as to put law enforcement on notice, opens up a Pandora’s box. It is not yet clear how this decision will impact civil discovery, but we know from years of watching the evolution of eDiscovery case law that many of these concepts begin on the criminal side. The last two years of case law alone on the civil side have seen more challenges to privacy as a limitation on relevance than ever before. As we wade further and further into ESI and discovery that may contain private information, this issue is going to continue to come up.
There’s a specific article — it’s included in our eDiscovery Academy — from Judge Francis in which he wrote about the element of privacy and whether or not it should be included in the proportionality analysis. His conclusion was that it should not be. There are other articles out there which advocate the position that privacy is part of the proportionality analysis. In any event, whichever side is correct, I think this issue is going to become much more prominent in civil discovery as we move forward.
Past and upcoming episodes of #CaseoftheWeek can be accessed via the ACEDS YouTube channel.