Forging Trust in Digital Forensics as Technology Evolves

February’s publications continued a trend in recent research: establishing reliability in forensic science as a whole, and in digital forensics in particular. While some works explored the decryption of encrypted data and the use of “hacked” data from encrypted devices, others examined privacy and method verification.

While the papers published in February represent, as always, a diversity of perspectives from around the globe, we begin with the United States’ recognition of Black History Month. The Leahy Center for Digital Forensics & Cybersecurity blog highlighted some of the most influential African-American leaders in cybersecurity and technology.

Read the whole post for more details about all of their many accomplishments.

Quality, reliability, and trust in digital forensic science

At the same time, according to the Equal Justice Initiative, Black Americans have been disproportionately affected by false convictions, even where science is involved. Forensic science has been undergoing a reckoning in recent years, as several of the papers published last month reflect.

Researchers from Australia’s University of Adelaide and University of Sydney asked: “What drives public beliefs about the credibility of a scientific field?” in their paper “Beyond CSI: Calibrating public beliefs about the reliability of forensic science through openness and transparency.”


Focusing on the “reproducibility crisis” affecting many forensic sciences, the paper offers three contributions:

  • A review of other research, which revealed “that forensic practices do not enjoy uniformly high reliability ratings from the public and these ratings are not calibrated with the scientific consensus.”
  • A review of three empirically-tested ways – acknowledging uncertainty and mistakes, and the role of epistemic trust; promoting openness and transparency; and aligning with public expectations for findings reports – in which other fields are dealing with their own crises.
  • Recommendations for “how forensic science can leverage transparency and openness to improve and maintain its long-term credibility” based on empirical evidence from the independent Houston Forensic Science Center.

Part of the problem is the explainability of some technology. Another paper questioned whether computational algorithms in particular promote greater scientific rigor in forensics, or instead make it harder to scrutinize the results they generate before those results are offered as admissible evidence.

In “Probabilistic reporting and algorithms in forensic science: Stakeholder perspectives within the American criminal justice system,” University of Lausanne researchers’ study of 15 criminal justice stakeholders – laboratory managers, prosecutors, defense attorneys, judges, and academic scholars – concluded with five major requirements that extend to digital forensics:

  • “…more robust research establishing stronger empirical foundations and scientific rigor for many pattern evidence disciplines, including a better characterization of the limitations of those methods.”
  • Better development and design of algorithms so that they and their evidence can be presented in court with minimal confusion for lay fact-finders.
  • Clarity around the types of algorithms – traditional rule-based programmed algorithms vs. artificial intelligence / machine learning-based algorithms – so stakeholders can understand their risks and benefits. 
  • Potential to recruit and select practitioners with scientific and mathematical, rather than forensic, expertise.
  • Policy safeguards, standards, and oversight for the development, validation, and application of all forensic science methods, including algorithmic tools.

Additionally, the paper articulated the need for “greater investments in foundational education and training for the forensic science and legal communities—specifically practitioners who will be expected to use the algorithms and judges who will be expected to assess the admissibility of the algorithms, as well as greater allocation of resources for forensic laboratories to support these investments while maintaining the caseload and throughput demanded of them.”

The discussion around bias in forensic analysis continued in “A practical tool for information management in forensic decisions: Using Linear Sequential Unmasking-Expanded (LSU-E) in casework.” Authors from Duke University School of Law, University College London, Towson University, and ForensicAid presented a practical worksheet “designed to bridge the gap between research and practice” in forensic decision-making.

The authors’ goal: to encourage the implementation of LSU-E, a framework that can “improve decision quality by increasing the repeatability, reproducibility, and transparency of forensic analysts’ decisions, as well as reduce bias” when evaluating information from multiple sources.

A note of caution: LSU-E hasn’t been fully tested in digital forensics, according to Nina Sunde, a lecturer at the Norwegian Police University College and a PhD candidate at the University of Oslo who studies cognitive bias. “It’s a promising measure,” she said, “however its applicability in digital forensics needs to be studied before we can conclude that it is an effective and recommended method.”

Researchers from the United Kingdom’s Open University made “The case for Zero Trust Digital Forensics,” arguing, “Some aspects of [digital forensics] investigations are inevitably contingent on trust, however this is not always explicitly considered or critically evaluated.” For example, they wrote, detection of digital artifact tampering is rarely performed, which in turn can introduce reasonable doubt about the evidence.

The authors took a page from the network security book, introducing the “Zero Trust” concept for digital forensics: “a strategy adopted by investigators whereby each aspect of an investigation is assumed to be unreliable until verified” using a “multifaceted” verification process. The paper includes a qualitative review of existing artifact verification techniques.
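
To make the idea concrete: the sketch below is a minimal illustration of the “assume unreliable until verified” principle, not the paper’s framework. An artifact is treated as untrusted until every independent check passes; the Artifact class, check names, and the cross-tool comparison referenced in the usage comment are hypothetical placeholders.

```python
# A minimal zero-trust sketch, assuming checks such as re-hashing and cross-tool
# comparison are available; all names here are hypothetical, not the paper's design.
import hashlib
from dataclasses import dataclass, field


@dataclass
class Artifact:
    path: str
    acquisition_sha256: str                      # hash recorded at acquisition time
    checks_passed: list = field(default_factory=list)


def verify_hash(artifact: Artifact) -> bool:
    """Re-hash the artifact and compare against the value recorded at acquisition."""
    with open(artifact.path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() == artifact.acquisition_sha256


def verify(artifact: Artifact, checks: dict) -> bool:
    """Treat the artifact as unreliable until every named verification check passes."""
    for name, check in checks.items():
        if not check(artifact):
            return False                          # fail closed: remain untrusted
        artifact.checks_passed.append(name)
    return True


# Hypothetical usage -- a second check might compare output from two independent parsers:
# trusted = verify(Artifact("UsrClass.dat", "ab12..."),
#                  {"hash": verify_hash, "cross_tool": cross_tool_agreement})
```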

The “Zero Trust” concept can be likened to the “forensic readiness” model, the topic of a paper by researchers from the University of South Africa, the University of the Western Cape (also in South Africa), and Schreiner University in Texas, USA. “An extended digital forensic readiness and maturity model” investigates the structure required to implement and manage digital forensic readiness (DFR) within an enterprise.

By extending the DFR Commonalities framework (DFRCF), the researchers designed a digital forensic maturity assessment model (DFMM) to enable organizations to assess their forensic readiness and security incident responses. Semi-structured interviews with forensic practitioners and academics then validated the DFMM.

Another aspect of trust in digital forensics centers on privacy. In “Defining principles for preserving privacy in digital forensic examinations,” Cranfield University’s Graeme Horsman (formerly of Teesside University) asks whether it’s possible to maintain digital data privacy in the course of a law enforcement investigation, and if so, how.

To that end, Horsman proposes 10 Privacy-Preserving Data Processing Principles (PPDPP), which “define conduct that is indicative of privacy-preserving” and encourage examiners to demonstrate adherence to the principles’ spirit. The principles are not dissimilar to existing good practice, but they add an extra dimension to it and offer a critical lens through which to view digital forensic work.

Trust and credibility are also at the root of “Evidence from hacking: A few tiresome problems,” a paper by De Montfort University’s Peter Sommer published in Forensic Science International: Digital Investigation. With some jurisdictions allowing digital data acquired via hacking to be used in court, ensuring that data’s forensic soundness is becoming ever more important.

Referring specifically to the multijurisdictional Encrochat investigation in 2020, Sommer described a dearth of “suggested good practice or standard operating procedures to cover the issues,” including issues with access, extraction, disclosure and discovery, auditing, and more.

Of course, as Sommer pointed out, intelligence can and does function “as a means of finding more conventional testable evidence.” But what’s introduced versus withheld in discovery “can create uncertainty in the minds of investigators and prosecutors…. The issue is not so much the grounds for withholding but the subsequent impact on the fairness of the ensuing trial.”

These issues are important to consider in light of the technically oriented papers published in February:

Mobile app acquisition and analysis

Three papers from Korean researchers focused on mobile apps. First, authors from the Korean National Police University, Korea University, and Saudi Arabia’s Naif Arab University for Security Science explored “Forensic analysis of instant messengers: Decrypt Signal, Wickr, and Threema.”

Their proposed methodology for analyzing a messenger app’s decryption algorithm allowed for the extraction and decryption of the encrypted databases, multimedia, logs, and preferences files from the three named apps on both unrooted and rooted devices, as well as static and dynamic analysis of that data.
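
By way of illustration only – not the authors’ methodology – once that kind of analysis has identified an app’s cipher, key, and IV, decrypting an extracted file can come down to a few lines of code. The sketch below assumes AES-256-CBC with PKCS#7 padding and a key already recovered during analysis; the file name, key, and IV are hypothetical, and the actual schemes used by Signal, Wickr, and Threema differ from app to app.

```python
# A minimal decryption sketch, assuming AES-256-CBC with PKCS#7 padding (a hypothetical
# scheme for illustration) and a key/IV already recovered during analysis.
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes


def decrypt_blob(ciphertext: bytes, key: bytes, iv: bytes) -> bytes:
    """Decrypt an extracted database or preferences blob with a recovered key and IV."""
    decryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
    padded = decryptor.update(ciphertext) + decryptor.finalize()
    unpadder = padding.PKCS7(algorithms.AES.block_size).unpadder()
    return unpadder.update(padded) + unpadder.finalize()


# Hypothetical usage: the key and IV would come from the app's keystore, preferences,
# or memory, as identified by the decryption-algorithm analysis described above.
# with open("messenger_db.enc", "rb") as f:
#     plaintext = decrypt_blob(f.read(), recovered_key, recovered_iv)
```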

Kookmin University’s Soram Kim, Giyoon Kim, Sumin Shin, and Jongsung Kim collaborated on two papers:

“Forensic analysis of note and journal applications” researched apps with security features; specifically, “the secret values used for locking, and the methods for storing user-created content” in 56 note and journal apps. As it turned out, 95 percent of the apps offering “security” features actually store users’ data insecurely, so that it can be accessed using a password from another app.

“Methods for recovering deleted data from the Realm database: Case study on Minitalk and Xabber,” coauthored with Byungchul Youn, Jian Song, and Insoo Lee, analyzed the structure of the Realm database and various deletion functions. Using the data structure and unallocated area, the researchers were able to recover deleted Realm database data for two apps.
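
The paper’s recovery method is structure-aware, parsing Realm’s internal format and its unallocated areas. As a much cruder stand-in, the sketch below simply scans a .realm file for printable remnants, which can surface deleted record content even without parsing the format; the file name and minimum run length are arbitrary assumptions.

```python
# A crude, generic carving sketch -- not the authors' structure-aware Realm recovery:
# scan the raw database file for printable runs that may include remnants of deleted
# records. The file name and minimum run length are illustrative assumptions.
import re


def carve_strings(path: str, min_len: int = 8) -> list:
    """Return printable byte runs of at least min_len characters found in the file."""
    with open(path, "rb") as f:
        data = f.read()
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return [match.group().decode("ascii") for match in re.finditer(pattern, data)]


# Hypothetical usage:
# for remnant in carve_strings("minitalk.realm"):
#     print(remnant)
```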

Peripheral to digital forensics

Often, digital forensics relies on other disciplines to lend context to the data obtained from a digital device. For example, call detail records can correlate metadata or fill in the blanks when data is missing, while video surveillance can provide necessary context and even evidence when crimes occur.

In “An investigation into the accuracy of follow-on GPRS/mobile data CDRs,” researchers from University College London and Forensic Analytics Ltd examined 3G and 4G follow-on GPRS (General Packet Radio Service)/mobile data CDRs (Call Detail Records) from three UK mobile network operators.

Their findings: these records are not consistently accurate, and practitioners relying on them should use “‘at or before’ the start time of the CDR” terminology when referring to them.

The impracticality of human monitoring for “smart city” surveillance cameras motivated “A semi-supervised deep learning based video anomaly detection framework using RGB-D for surveillance of real-world critical environments.”

Authored by researchers from India’s Visvesvaraya National Institute of Technology, the paper presented “a multi-modal semi-supervised deep learning based CNN-BiLSTM autoencoder framework to detect anomalous events in critical surveillance environments like Bank-ATMs.”
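
For readers unfamiliar with the architecture, the sketch below shows one way a CNN-BiLSTM autoencoder of this kind can be wired together in Keras: a per-frame convolutional encoder, a bidirectional LSTM over the frame sequence, and a deconvolutional decoder that reconstructs the clip. It is a generic illustration rather than the authors’ network; the input shape (16 frames of 64×64 RGB-D), filter counts, and layer sizes are assumptions.

```python
# A generic CNN-BiLSTM autoencoder sketch in Keras -- not the authors' exact network.
# Input shape, filter counts, and layer sizes below are illustrative assumptions.
from tensorflow.keras import layers, models

SEQ_LEN, H, W, CHANNELS = 16, 64, 64, 4          # 4 channels = RGB + depth (assumed)

inputs = layers.Input(shape=(SEQ_LEN, H, W, CHANNELS))

# Per-frame convolutional encoder
x = layers.TimeDistributed(layers.Conv2D(32, 3, strides=2, padding="same", activation="relu"))(inputs)
x = layers.TimeDistributed(layers.Conv2D(64, 3, strides=2, padding="same", activation="relu"))(x)
x = layers.TimeDistributed(layers.Flatten())(x)

# Temporal bottleneck: a bidirectional LSTM over the encoded frame sequence
x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)

# Per-frame decoder back to the input resolution
x = layers.TimeDistributed(layers.Dense(16 * 16 * 64, activation="relu"))(x)
x = layers.TimeDistributed(layers.Reshape((16, 16, 64)))(x)
x = layers.TimeDistributed(layers.Conv2DTranspose(32, 3, strides=2, padding="same", activation="relu"))(x)
outputs = layers.TimeDistributed(layers.Conv2DTranspose(CHANNELS, 3, strides=2, padding="same", activation="sigmoid"))(x)

autoencoder = models.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")

# Semi-supervised use: train only on footage of normal activity, then flag clips whose
# reconstruction error exceeds a chosen threshold as anomalous at inference time.
```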

Finally, in “A study on command block collection and restoration techniques through detection of project file manipulation on engineering workstation of industrial control system,” researchers from the Korean National Police University and Gachon University explored a method of detecting and restoring programmable logic controller (PLC) data that have been changed as a result of a cyber-attack, so that investigations can be carried out more readily.
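
As a loose illustration of the detection half of that workflow – not the paper’s specific technique – the sketch below hashes a project file’s command blocks and compares them against a known-good baseline, so that changed blocks can be flagged for restoration. The block-extraction step is hypothetical, since real PLC project formats are vendor-specific.

```python
# A generic manipulation-detection sketch -- not the paper's technique: hash each
# command block in an engineering-workstation project file and compare against a
# known-good baseline captured before the incident. extract_command_blocks() is a
# hypothetical, vendor-specific parser.
import hashlib
import json


def hash_blocks(blocks: dict) -> dict:
    """Map each named command block to the SHA-256 digest of its raw bytes."""
    return {name: hashlib.sha256(data).hexdigest() for name, data in blocks.items()}


def diff_against_baseline(blocks: dict, baseline_path: str) -> list:
    """Return names of command blocks whose digests differ from the stored baseline."""
    with open(baseline_path) as f:
        baseline = json.load(f)            # {"block_name": "sha256 hex digest", ...}
    current = hash_blocks(blocks)
    return [name for name, digest in current.items() if baseline.get(name) != digest]


# Hypothetical usage: flagged blocks could then be restored from the baseline copy.
# tampered = diff_against_baseline(extract_command_blocks("plant.project"), "baseline.json")
```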

Christa Miller is a Content Manager at Forensic Focus. She specializes in writing about technology and criminal justice, with particular interest in issues related to digital evidence and cyber law.
