Forensic Focus Legal Update December 2020: Refining Search & Seizure; New Laws & Guidance

The fourth quarter of 2020 has seen some significant legal developments when it comes to digital evidence. In this issue (which also happens to be the anniversary of the inaugural Forensic Focus Legal Update), we highlight:

  • Search warrant reform in the United Kingdom
  • Court decisions at various levels of the United States federal court system, including the U.S. Supreme Court
  • New federal and state laws
  • A roundup of forthcoming and recently published articles

U.K. search warrants are being reformed

In response to a variety of problems — defective warrants, inefficiency, insufficient powers to access evidence on remote servers, and inadequate safeguards — the United Kingdom’s Home Office in 2016 commissioned a review of current search warrant law and practices.

In October this year, the Law Commission made its final report available to Parliament, which has yet to respond. Among the report’s 64 key recommendations are several that apply specifically to digital data and forensics:

  • Law enforcement powers should be updated to apply more clearly to electronic devices and data, allowing more effective acquisition of digital evidence without either inhibiting investigators or imposing “unreasonable demands on law enforcement agencies.”
  • This would include on-premise search and acquisition of electronic devices and, potentially, remotely stored data.
  • To protect property and privacy rights, unneeded data would need to be “swiftly deleted, and devices returned as soon as is practical.”
  • The ability to apply for and execute search warrants would be extended to English and Welsh fraud authorities. Although they would still rely on police to conduct some searches, they’d also be subject to similar (reformed) safeguards.

Acknowledging that “The problems relating to electronic material transcend search warrants,” the Law Commission further recommends “a wider review of the law governing the acquisition and treatment of electronic material in criminal investigations.”

In U.S. courts: Computer access, geofencing, ALPRs, and biometric data

The U.S. Supreme Court heard oral arguments on Monday, November 30 in Van Buren v. United States, a case that’s expected to help define the scope of the 1986 Computer Fraud and Abuse Act (CFAA). The “pre-eminent anti-hacking law in the United States,” the CFAA covers both civil and criminal cases involving federal computers, as well as computers used in interstate or international commerce.


Specifically at issue: whether someone who’s authorized to access information on a computer for certain purposes — in this case, a law enforcement officer searching state motor vehicle records — has violated the CFAA’s Section 1030(a)(2) if they access the information for an improper purpose.

U.S. circuit courts of appeals are divided over what the CFAA’s “exceeds authorized access” language means in practice. According to Politico, the justices questioned the meaning of words like “authorized” and “obtain” to try to determine how far to go in narrowing the statute. A ruling likely won’t be handed down until early in the new year, but the case has implications for independent and corporate digital forensics analysts as well as criminal investigators:

  • Writing for Dark Reading in September, Robert Lemos noted: “…security researchers and technology companies are concerned with the potential for the case to turn independent vulnerability research into unauthorized access and, thus, a prosecutable offense.”
  • During oral arguments, attorney Jeffrey Fisher, representing Van Buren, referred to hypothetical scenarios such as an employee violating a company’s acceptable-use policy by using their Zoom account to talk to distant relatives during a holiday.
  • Less innocuous, according to one labor and employment law firm in its blog, are examples of trade-secret or intellectual-property theft by insiders who are authorized to obtain the data, but only for specific business-related purposes.

Authorized access to information, even for its intended purpose, has also been under scrutiny by courts this quarter. That’s because increasingly, data can be used to track or define people’s private movements.

Motor vehicle data was also the subject of a recent challenge to government use of automated license plate readers (ALPRs) to collect and store motor vehicle tags. In October, the Virginia Supreme Court ruled that ALPRs don’t collect sufficiently personal data to qualify as an “information system” under the state’s Data Act. 

In the Northern District of Illinois, Eastern Division, meanwhile, geofencing “reverse” search warrants served on Google have come up three times this year. Most recently, in October, a U.S. magistrate judge in that court granted six search warrants — in contrast to two colleagues, who in July and August had rejected geofence warrants they deemed overbroad:

  • In July, a warrant was denied because its three geofences each circumscribed a 100-meter radius encompassing business and residential properties in “a densely populated city” for a 45-minute period on a specific date. In his decision, the U.S. magistrate judge outlined how to constrain both the geofence and the phone numbers of interest.
  • A second warrant filed that same month in the same case was indeed more narrowly defined, but was denied anyway: it didn’t establish probable cause to seize users’ device information because it couldn’t even identify who was involved in the offense.
  • In August, a third search warrant in the same case was again denied, even though the applicants had dropped the requirement for Google to produce subscriber information. That change didn’t matter, noted Bloomberg Law: “Although the information will be anonymized, the warrant also puts no limitation on the government’s discretion on how it will go about selecting a suspect.”

October’s ruling involves a couple of crucial differences from the earlier rulings. First, the arsons were committed against vehicles in business parking lots. Additionally, the crimes took place in the wee hours of the morning, outside regular business hours.

Had the arsons been committed on or near residential buildings and/or during normal business activities, and/or had there been fewer than six areas of interest, the geofence warrant might have been viewed differently.

In general, says Katherine Hansen, a deputy district attorney and digital evidence specialist with Colorado’s Denver District Attorney’s Office: “A lack of understanding of geofencing technology and the warrant process related to geofencing seems to be contributing to the inconsistency in rulings and the content of the rulings themselves.”

Even Google seems uncertain on what’s required for each step of the geofencing procedure, adds Hansen, “making uniformity in warrant drafting difficult.” As a result, investigators seeking a good rule of thumb — or simply to win constitutional challenges — may be waiting for a while.
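In the meantime, the geometric constraint at the heart of these warrants is straightforward to reason about: a geofence is typically a center point, a radius, and a time window, and a location record either falls inside it or not. As a rough, hypothetical sketch (the coordinates and radius below are illustrative assumptions, not drawn from any actual warrant), a circular geofence test might look like this in Python:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(point, center, radius_m):
    """True if a (lat, lon) point lies within radius_m meters of the fence center."""
    return haversine_m(point[0], point[1], center[0], center[1]) <= radius_m

# Hypothetical 100-meter geofence centered in downtown Chicago.
center = (41.8781, -87.6298)
print(in_geofence((41.8785, -87.6295), center, 100))  # roughly 50 m away -> True
print(in_geofence((41.8881, -87.6298), center, 100))  # roughly 1.1 km away -> False
```

Every additional meter of radius or minute of window sweeps in more devices, which is exactly the overbreadth concern the Illinois judges raised.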

A new U.S. federal law tackles IoT security

Even as the U.S. Supreme Court mulls what constitutes computer fraud and abuse, the IoT Cybersecurity Improvement Act, passed by Congress and presented to the president for signing in late November, seeks to require two things from federal government contractors:

  • Comply with standard security requirements, set by NIST, in IoT device manufacturing.
  • Implement vulnerability disclosure policies.

The new law follows a pair of 2019 laws enacted in California and Oregon. Ideally, according to a CyberScoop article, the requirements will take hold across the industry in much the same way as the Energy Star rating. That could help reduce the risk of threats such as attackers using botnets like Mirai, or unauthorized access to devices.

“There has been uncertainty among researchers whether IoT vendors are disclosing vulnerabilities to their government customers,” said Jessica Hyde, Director of Forensics with Magnet Forensics and a noted expert on IoT forensics. 

She continued: “It is encouraging to see the IoT Cybersecurity Act put forward with the goal of improving the security of government agencies, and by extension citizens, in the digital age. The efforts of the National Institute of Standards and Technology to develop standards and guidelines for the disclosure of security vulnerabilities pertaining to IoT devices will allow researchers to feel confident that action will be taken when they identify critical vulnerabilities. The new guidelines should help ensure that this critical information is available to those who need it most in the public sector.”

The U.S. election brings new state laws

In Massachusetts, what’s being called a “right to repair” law will expand an existing law allowing vehicle owners in the state to repair their own vehicles. Starting with model year 2022, telematics systems will need to use a standardized open data platform that provides owners and shops with direct access.

The new law “covers the data that telematics systems collect and wirelessly transmit… [and] allows owners and independent mechanics to send commands to the vehicle for repair, maintenance and diagnostic testing.”

Compliance won’t be easy, according to CNET, and the measure may be headed to court. Even so, its impact could be felt much more broadly across the industry. From a digital forensics standpoint, access to the data via a mobile app could introduce a new investigative avenue, while vehicle datasets might become the latest to validate through manufacturer warrant returns.

In Michigan, Proposal 2 amends Article I, Section 11 of the state’s constitution to require law enforcement to obtain a search warrant every time they collect electronic data and communications. According to the Citizens Research Council of Michigan, the amendment will now read:

“The person, houses, papers, possessions, electronic data, and electronic communications of every person shall be secure from unreasonable searches and seizures. No warrant to search any place or to seize any person or things or to access electronic data or electronic communications shall issue without describing them, nor without probable cause, supported by oath or affirmation….”

The measure traces its roots back to 2008, reported Michigan Radio, when the American Civil Liberties Union became concerned that the Michigan State Police were searching mobile devices without a warrant. The organization filed a Freedom of Information Act request with the agency regarding its use of mobile forensics tools like Cellebrite’s UFED.

Legislators became interested, and ultimately, the amendment was the result. Government Technology reported that it follows similar measures passed in Missouri in 2014, and in New Hampshire in 2018.

The amendment is ambiguous, however, when it comes to defining “electronic data,” says Alicia Loy, Cyber and Economic Crime Attorney at the National White Collar Crime Center (NW3C). Furthermore, she adds, because it is a constitutional amendment rather than a piece of legislation, it doesn’t create comprehensive privacy protection on the level of, say, California’s 2015 Electronic Communications Privacy Act (CalECPA), which protects location data, content, metadata, and device searches, and allows these different types of data to be obtained through different means of legal process.

“Until future case law addresses whether a warrant is required for all electronic data or communications (as Senator Runestad explained to Michigan Radio would be required by this amendment),” says Loy, “investigators may be forced to rely on exceptions to the warrant requirement (exigency or consent, for example) to obtain electronic communications or data.”

Finally, the California Privacy Rights Act of 2020 (CPRA) expands and strengthens the provisions first introduced in the California Consumer Privacy Act (CCPA).

Previously in the Forensic Focus Legal Update, we covered what the CCPA meant for digital forensics examiners, particularly those working corporate investigations. 

The new CPRA won’t take effect until January 1, 2023, but forensic examiners are well advised to work proactively with investigators and legal teams to prepare for its new provisions:

  • “Sharing” (not just selling) personal information when it’s used for “cross-context behavioral advertising.”
  • The use of “sensitive personal information” such as precise location, race, religion, sexual orientation, health, etc.; think geofencing and other patterns of life.
  • Data collection, retention, and use. (The “use” portion might be another area where the Supreme Court’s interpretation of the CFAA in the upcoming Van Buren decision will be important.)

In addition, the new law replaces the attorney general’s office with a new California Privacy Protection Agency. It also alters the definition of “businesses” covered under the new law, reducing small business liability while expanding coverage.

Balancing privacy and public safety

Underpinning all of these cases and laws is the concept of how data is accessed and used, primarily by government authorities, but also by others. That was the focus of an October report, “Mass Extraction: The Widespread Power of U.S. Law Enforcement to Search Mobile Phones.”

Produced by Washington, D.C.-based nonprofit Upturn, the report doesn’t highlight much that will be surprising to anyone who has worked in this field for a long period of time. However, because it could influence policy — think of the new constitutional amendment in Michigan — its observations and conclusions are a glimpse into how common law enforcement practices could be interpreted and, potentially, altered.

That’s especially true as prosecutors and investigators increasingly turn to various forms of data to validate and authenticate evidence, as we showed in our story about pattern of life analysis. Geofencing and data stored on remote servers are both part of this trend.

A forthcoming law review article describes some of these issues. “Not an Ocean Away, Only a Moment Away: A Prosecutor’s Primer for Obtaining Remotely Stored Data”* seeks to fill a gap in legal guidance on collecting and introducing cloud-based evidence at trial.

Nearly three years since the enactment of the U.S. Clarifying Lawful Overseas Use of Data (CLOUD) Act, multiple digital forensics vendors now offer tools to extract digital data from the cloud. The National Institute of Standards and Technology (NIST) finalized its publication NISTIR 8006, “Cloud Computing Forensic Science Challenges,” and the Scientific Working Group on Digital Evidence followed up with “Best Practices for Digital Evidence Acquisition from Cloud Service Providers.”

“A Prosecutor’s Primer,” authored** by attorneys Robert Peters, Matthew Osteen, Alicia Loy, Joseph Remy, and Justin Fitzsimmons, uses real-life scenarios to illustrate legal considerations for obtaining remotely stored data:

  • An overview of the 1986 Stored Communications Act (SCA), along with the trajectory of Fourth Amendment jurisprudence since the SCA’s passage
  • Relevant CLOUD Act provisions
  • Bilateral agreements following the enactment of the CLOUD Act (notably the one established between the U.S. and U.K.)
  • Potential pitfalls in accessing remotely stored data, as well as possible solutions
  • Practices, including sample documents and templates, for obtaining data stored domestically, internationally (both with and without a CLOUD Act agreement and/or mutual legal assistance treaty (MLAT)), and even from extraterrestrial locations

“A Prosecutor’s Primer” will appear in the Mitchell Hamline Law Review in 2021.

* Forensic Focus’ Christa Miller volunteered editorial services for this article.

** The authors also serve as contributors and advisors to Forensic Focus in the development of the quarterly Legal Update.

Additional publications

Part 3 in a series contributed to Forensic Focus by New Delhi (India)-based digital forensics expert Santosh Khadsare drew on Indian law as well as crowdsourced answers on LinkedIn to cover several key questions:

  • For a digital forensics professional, is knowledge of cyber laws a must? If yes, why?
  • Can anyone without qualifications but with experience be deposed as an expert witness in a court?
  • If a digital forensics laboratory is notified by the government / notifying agency, but the expert who signed the forensic report has left or is not available, can the present incumbent depose on their behalf as an expert witness?
  • What are the statements that a digital forensics analyst/investigator/expert should use in a report and in court?

Attorney Gary Weingarden continued his Medium series, “Clarifying Legal Ideas from Technology Certifications,” in Part 3: Hearsay. Weingarden reminds readers that hearsay is a problem for a trial attorney to solve, but his examples focus on digital evidence, and as such might be useful for any digital forensic examiner who expects to testify at trial.

Two articles from attorney and forensic examiner Craig Ball highlight challenges that digital forensics and e-discovery vendors — and the attorneys they work for — might face:

  • The Metadata Vanishes is an in-depth description of image metadata; how it came, in one case, to disappear from responsive photos produced in discovery; and how to prevent that from happening. (Hint: use a ZIP container.)
  • The Case for Native, I Swear highlights the importance of native formats in modern-day production of electronically stored information (ESI). Although the focus here is on e-discovery, Ball’s description of outdated forms of production in discovery — TIFF images of responsive documents — isn’t unlike arguments about screenshots of social media posts.

Have a piece of legal analysis or other relevant material you’d like us to publish, or mention in our next quarterly legal update? Please email!

Christa Miller is a Content Manager at Forensic Focus. She specializes in writing about technology and criminal justice, with particular interest in issues related to digital evidence and cyber law.
