Forensic Focus Legal Update December 2019 – Part II: Search And Seizure

by Christa Miller

In cooperation with the National White Collar Crime Center (NW3C) and SEARCH, The National Consortium for Justice Information and Statistics, Forensic Focus offers a quarterly roundup of the latest legal developments around digital forensic evidence.

Comprising major legislation and case law from around the country, this guide is intended to help our readers get a better understanding of how different countries’ laws are adapting to the rapid pace of technological change.

Part 1 of this inaugural guide covered data privacy laws, bilateral sharing of cloud-based evidence, whether evidence collection could violate a company’s terms of service, and fallout from the Coalfire pen testing arrests.

In Part 2, we cover United States case law regarding technology — facial recognition, pole cams, geofencing, and third-party DNA databases — when it comes to search and seizure, along with the “reasonable suspicion” standard when it comes to border searches.



The material published here is for general public education. It is not legal advice for any specific situation. If the reader needs specific legal advice, the reader should consult a qualified lawyer.

Search & Seizure

In the United States, a number of news articles highlight the interplay between digital technology and the Fourth Amendment of the US Constitution, “the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.” At the end of 2019, technology for review includes facial recognition, pole-mounted surveillance cameras, third-party DNA databases, and geofencing.

Facial Recognition

A bipartisan US Senate bill, the Facial Recognition Technology Warrant Act, seeks to impose some limits on the way federal and other law enforcement agencies use facial recognition technology for ongoing surveillance. To track an individual for longer than 72 hours, whether in real time or through historical records, investigators would need to obtain a search warrant.

Law enforcement in the US is currently unfettered when it comes to surveilling the general public via facial recognition technology. The new bill likens this capability to “other forms of intrusive searches and surveillance, such as searching cellphones, conducting a wiretap, collecting cellphone location information, or installing a location tracking device,” according to a bill summary.

On the other hand, the bill doesn’t cover use of the technology for identification purposes and, moreover, addresses an aspect of the technology that isn’t currently in use, according to one US law professor.

Still, federal law enforcement agencies would be required to create testing procedures in conjunction with the National Institute of Standards and Technology. That part of the bill addresses the considerable evidence that facial recognition struggles to accurately identify people depending on their gender, age, or ethnicity.

Pole-Mounted Surveillance Cameras

Law enforcement’s use of surveillance cameras mounted to utility poles outside private residences is nothing new, but recent court decisions set important precedents at both state and federal levels in the United States, building on key Supreme Court decisions in Carpenter v. US, US v. Jones, and Riley v. California. 

In December, Colorado’s state appeals court ruled that four months of “warrantless and continuous pole camera surveillance of defendant’s fenced-in back yard was unreasonable and violated his reasonable expectation of privacy under the Fourth Amendment.” People v. Tafoya, 2019 COA 176, 2019 Colo. App. LEXIS 1799 (Nov. 27, 2019).

That decision followed one from June, when a federal district court in Massachusetts ruled that eight months’ worth of “constant pole camera digital recording of all comings and goings from defendants’ house… violated their reasonable expectation of privacy under Carpenter and chilled freedom of association.” United States v. Moore-Bush, 2019 US Dist. LEXIS 92631 (D. Mass. June 3, 2019). 

TechDirt’s Tim Cushing pointed out that although a 2014 Washington state case, US v. Vargas, resulted in a ruling that just six weeks of video-cam surveillance was unconstitutional, a Sixth Circuit Court of Appeals judge had held in 2016 that ten-week surveillance was no different from an eight-hour period.

The facts of these cases matter: in both Colorado and Massachusetts, cameras controllable in real time were in use, not just passive surveillance, wrote Cushing. In addition, the Colorado camera “also did something the average passerby couldn’t do (I mean, in addition to staring at someone’s house for 13 weeks straight): it could see above the suspect’s six-foot privacy fence to the end of the driveway near the house’s garage and entrance.”

Geofencing

Another test of law enforcement investigative technology is underway in Virginia, where a motion to suppress was filed in October in US v. Chatrie. Police obtained a “geofence” warrant “to obtain the cell phone location information of 19 Google users who happened to be in the vicinity of a bank robbery on a Monday afternoon in Richmond,” according to the motion.

Chatrie argues that the warrant is not only overbroad, but also lacks the particularity required by the Fourth Amendment — the enumeration of “places to be searched and things to be seized.”

The motion alleges that the warrant failed to establish probable cause to search Chatrie’s data in Sensorvault — Google’s “large cache of deeply private data” — because “there are no facts to indicate that the bank robber used either [Google phones and services], whether ever or at the time of the robbery…. Instead, based only on Google’s popularity and the prevalence of cell phones generally, law enforcement searched a trove of private location information belonging to 19 unknown Google users who happened to be near a local bank on a Monday evening.”

This was problematic because “[n]ot only can this data reveal private activities in daily life, but it can also show that someone is inside a constitutionally protected space, such as a home, church, or hotel—all of which are in the immediate vicinity of the bank that was robbed in Richmond.”
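
For readers unfamiliar with the mechanics, a geofence request essentially asks a provider to filter its stored location history by a geographic boundary and a time window, returning every account with at least one record inside both. The Python sketch below is a hypothetical illustration of that filtering step only; the record structure, coordinates, radius, and times are placeholders and are not drawn from the Chatrie warrant or from Google’s actual systems.

    # Hypothetical sketch of the filtering step behind a geofence request:
    # given stored location records, return every account seen inside a
    # radius of a point during a time window. Field names, coordinates,
    # and the radius below are illustrative, not Google's actual schema.
    from dataclasses import dataclass
    from datetime import datetime
    from math import radians, sin, cos, asin, sqrt

    @dataclass
    class LocationRecord:
        account_id: str
        timestamp: datetime
        lat: float
        lon: float

    def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
        """Great-circle distance between two coordinates, in meters."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371000 * asin(sqrt(a))

    def accounts_in_geofence(records, center_lat, center_lon, radius_m, start, end):
        """Return account IDs with at least one record inside the fence and window."""
        return {
            r.account_id
            for r in records
            if start <= r.timestamp <= end
            and haversine_m(r.lat, r.lon, center_lat, center_lon) <= radius_m
        }

    # Example: a 150-meter fence around a hypothetical bank for one hour.
    hits = accounts_in_geofence(
        records=[],  # provider-held location history would go here
        center_lat=37.5407, center_lon=-77.4360,  # placeholder coordinates
        radius_m=150,
        start=datetime(2019, 5, 20, 16, 0),  # placeholder window
        end=datetime(2019, 5, 20, 17, 0),
    )

Every account returned by a filter like this belongs to someone who merely passed through the area during the window, which is the crux of the overbreadth and particularity objections raised in the motion.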

Third-Party DNA Databases

Law enforcement use of databases like GEDmatch and 23andMe grabbed attention last year, after police matched DNA found at crime scenes with familial DNA profiles to solve the Golden State Killer cold cases. This year, with GEDmatch’s acquisition by crime scene DNA sequencing company Verogen, questions about government access to sensitive genetic data have become more pressing.

For Slate’s Future Tense, Aaron Mak wrote: “It’s not clear whether the database users have standing to challenge a warrant because they aren’t technically the subject of the criminal investigation. But the suspects themselves also might not have standing because it isn’t technically their DNA that law enforcement is trying to access in the database.” 

The New York Times’ Kashmir Hill and Heather Murphy also highlighted the tension between consumer DNA sites’ privacy policies and government requests for information. 

Finally, a Twitter thread about 23andMe’s data retention and deletion practices questions whether existing federal and state laboratory and data retention regulations might in fact conflict with the CCPA.

(Text © Ashton Meyers, NW3C. Used with permission.)

In 2018, when the General Data Protection Regulation (GDPR) became effective in Europe, 23andMe had to adjust in order to comply with the new rules. The company not only updated its privacy statement and terms of service but also added enhanced security measures and privacy protections. However, further adjustment will likely be necessary to comply with the California Consumer Privacy Act (CCPA), which takes effect on January 1, 2020.

23andMe already has a head start on these adjustments compared to companies that don’t need to comply with the GDPR: to meet the GDPR’s requirements, it already offers users the right to access and delete their data, as well as data portability in an accessible format. Before January 1, 23andMe will likely also make minor adjustments to its consent forms and take steps to ensure that it only works with third-party service providers who also comply with the CCPA.

However, 23andMe will have to comply not only with the CCPA but also with federal laws already in place, such as the Clinical Laboratory Improvement Amendments (CLIA) and the Health Insurance Portability and Accountability Act (HIPAA). This means the right to deletion will only apply to things like account information and research involving personal data. To remain compliant with the CLIA, a third-party laboratory will still have to retain personal genetic information, date of birth, and sex.

In the absence of a federal data privacy law, there is little the CCPA can do about this; in general, the CCPA shies away from applying to certain health data altogether through exceptions laid out in its provisions. Furthermore, if a federal law is ever passed, it will likely include similar exceptions deferring to regulations already in place, like HIPAA and the CLIA.

All of this should prompt consumers to stamp a “buyer beware” on companies like 23andMe: their data will not be fully deleted, at least not for some time after the request.

DNA is an even bigger issue in Kenya, where the National Integrated Identity Management System (NIIMS) could link biometric data “to everything from identity cards to access to education, health, and social services.” Quartz Africa reports that the move is a response to a corrupt citizenship process and escalating terrorist attacks. Now that Kenya has enacted its own GDPR-style regulation, it remains to be seen how the government will protect its own citizens’ personal data.

Reasonable Suspicion At The US Border

Border searches of digital devices have made headlines in the US several times this year. In August, the 9th Circuit Court of Appeals ruled that officials could conduct suspicionless manual mobile device searches, but that reasonable suspicion was required for a forensic examination. This was backed up by a US District Court in Massachusetts in November. 

(Text © Robert Peters, Esq. Used with permission.)

Historically, border searches have not required probable cause or a search warrant, and many courts have held that even the lower threshold of reasonable suspicion is not required for border searches, where individuals’ personal privacy interests are weighed against government interests in contraband interdiction, among others. United States v. Touset, 890 F.3d 1227, 1229 (11th Cir. 2018).

For example, one court upheld the warrantless forensic preview search of multiple electronic devices on the grounds that this was a routine border search, and no individualized suspicion was required. United States v. Feiten, 2016 WL 894452, at *4 (E.D. Mich. 2016).

Reasonable suspicion is a “commonsense, nontechnical” concept involving “factual and practical considerations.” Illinois v. Gates, 462 US 213, 231 (1983). Courts requiring a reasonable suspicion threshold typically engage in a factual analysis of the circumstances surrounding the search. In this line of cases, manual searches of a

  • computer (United States v. Ickes, 393 F.3d 501, 503, 505–06 (4th Cir. 2005)) 
  • camcorder (United States v. Linarez–Delgado, 259 F. App’x 506, 508 (3d Cir. 2007)) 
  • laptop (United States v. Arnold, 533 F.3d 1003, 1005, 1008 (9th Cir. 2008))
  • or even a floppy disk (United States v. Bunty, 617 F. Supp.2d 359, 363–65 (E.D.Pa. 2008)) 

all constitute routine border searches and are permissible, as are forensic previews (United States v. Stewart, 729 F.3d 517, 521, 525 (6th Cir. 2013)).

The invasiveness, or lack thereof, of a particular search conducted at the border receives significant judicial attention. In Feiten, the court upheld a border search in part because of the less invasive nature of a forensic preview, and held that the particular forensic tool, OS Triage, implicated fewer privacy concerns than a manual search, given the thumbnail preview feature and the ability to match file names with known child pornography.

That court also found that OS Triage’s ability to maintain file integrity was relevant, particularly since manual searches “can actually alter key evidentiary aspects of each file inspected.” The court therefore characterized law enforcement’s use of OS Triage as “an exercise of electronic restraint” well within the pertinent legal boundaries.
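
For context, the kind of triage described in Feiten generally works by comparing files on a device against a reference set of known files without writing to the evidence. The sketch below is a generic, hypothetical illustration of that hash-matching step, not a description of OS Triage itself; the paths and the reference hash set are placeholders, and real preview tools typically run against a write-blocked device or forensic image using curated hash sets of known files.

    # Hypothetical illustration of "known file" matching during a preview:
    # hash each candidate file and compare it to a reference set of known
    # hashes. Not OS Triage's implementation; paths and hashes are placeholders.
    import hashlib
    from pathlib import Path

    def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
        """Hash a file in fixed-size chunks without loading it all into memory."""
        digest = hashlib.sha256()
        with path.open("rb") as f:  # opened read-only; file contents are never modified
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def flag_known_files(root: Path, known_hashes: set[str]) -> list[Path]:
        """Return files under root whose SHA-256 values appear in the reference set."""
        return [
            path
            for path in root.rglob("*")
            if path.is_file() and sha256_of(path) in known_hashes
        ]

    # Example: scan a mounted (ideally write-blocked) volume against a hash list.
    known = {"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"}  # placeholder
    matches = flag_known_files(Path("/mnt/evidence"), known)

This is the same restraint the Feiten court credited: the examiner learns whether known material is present while the underlying files are read rather than altered, in contrast to a manual inspection that “can actually alter key evidentiary aspects of each file inspected.”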

A third 2019 case, United States v. Wanjiku, 919 F.3d 472 (7th Cir. 2019), demonstrates one approach in jurisdictions that tend to apply a reasonable suspicion analysis to border searches. In Wanjiku, government actors conducted a forensic preview on the defendant’s phone based on reasonable suspicion, and no warrant was obtained.

The American Civil Liberties Union (ACLU) and the Electronic Frontier Foundation (EFF) filed an amicus brief advocating for the court to impose a probable cause standard for border searches of mobile devices. 

Both the amicus brief and Wanjiku on appeal emphasized the extremely personal content of mobile devices in the modern world, and suggested that warrantless mobile searches functionally permitted dragnet investigations, drawing on language from key decisions such as Riley v. California, 134 S.Ct. 2473 (2014) and Carpenter v. United States, 138 S. Ct. 2206 (2018).

The reasonable suspicion in Wanjiku developed not only from the initial screening criteria (Wanjiku was a US male, between 18 and 60, with a prior arrest, traveling alone from a country with high rates of sex tourism), but also from government databases and publicly available social media data, which created an understandably greater interest in him:

  • His prior arrest was for contributing to the delinquency of a minor.
  • This was his third trip in two years to the Philippines.
  • He lacked any business or family ties to the Philippines.
  • His prior flight was booked with the email address “Mr. DONGerous”.
  • His Facebook photo was masked, with the profile having “very young” Facebook friends.

The confluence of these facts resulted in Wanjiku being directed to a secondary inspection site, where his own behavior — bolting from the inspection line, visible nervousness, and “vague and evasive answers” to questions about his spending or travel habits — and additional items found on his person, including hotel receipts, syringes, condoms, and injectable testosterone, added to border agents’ reasonable suspicion.

Importantly, the Seventh Circuit discussed the specific forensic tools deployed and their non-invasive functionality in its analysis. (The court specifically noted that the preview processes were “non-destructive.”) Wanjiku provided the passcode to his phone when the agent claimed authority to search at the border, and the agent manually scrolled through the phone, encountering a few suspect images before turning the phone over to HSI. Cellebrite and XRY were utilized, resulting in the discovery of child pornography images.

Given little attention by the ACLU, EFF, and Wanjiku, but directly addressed by the Seventh Circuit, is the fact that significant evidence of child sexual abuse material was discovered not only on the mobile device, but also on Wanjiku’s hard drive (discovered by an EnCase forensic preview) and laptop (which underwent a full forensic examination in a lab after contraband was discovered elsewhere).

This fact was relied on by the court to distinguish the case in some ways from Riley and Carpenter; the Seventh Circuit conceded that the US Supreme Court has “recently granted heightened protection to cell phone data,” but held that Riley and Carpenter do not apply to border searches where governmental interests “are at their zenith.”

The Seventh Circuit transparently declined to address whether reasonable suspicion or probable cause applied to border searches, choosing to “avoid entirely the thorny issue” since law enforcement acted in good faith when they searched the devices, reasonably relying on Supreme Court precedent “that required no suspicion for non-destructive border searches of property, and nothing more than reasonable suspicion for highly intrusive border searches of property.” However, the Court did find that the circumstances of the search, including Wanjiku’s unique behavior, clearly constituted reasonable suspicion.

The border search issue is unlikely to go away anytime soon, owing to recent developments such as border agents’ newly granted access to intelligence databases, including social media data. It may become more significant if data from device searches — including those in which visa applicants are required to provide their account credentials — starts being added to these databases, and it will be one to watch.

Attorneys from different parts of the world who would like to participate in this project are welcome to contact Forensic Focus’ content manager at christa@forensicfocus.com.
