The summer has been a busy one for standards-setting organizations in the United States and European Union:
- In the U.S., the National Institute of Standards and Technology (NIST) published its final guidelines for cloud forensics, along with a draft document on creating explainable artificial intelligence (AI) systems.
- In Europe, FORMOBILE published its “Novel Training in Mobile Forensics” curriculum and framework.
- In the United Kingdom, the National Police Chiefs’ Council (NPCC) published a new Digital Forensic Science Strategy that accounts for the expansion of digital forensic science onto the “front line” of policing, along with the associated challenges and opportunities.
New NIST publications on cloud forensics and explainable AI
First drafted in 2014, NISTIR 8006, “NIST Cloud Computing Forensic Science Challenges,” defines and discusses the challenges of achieving effective cloud forensics — not just in civil and criminal investigations, but also in incident response and in solution and tool development.
Ostensibly this means adapting the standard digital forensics process — identification, acquisition, preservation, examination, interpretation, and reporting — to the cloud environment. At the same time, though, the publication observes: “The cloud exacerbates many technological, organizational, and legal challenges already faced by digital forensic examiners.”
Focusing on the technical challenges while acknowledging that legal and organizational principles inform the technical, the publication categorizes nine major focus areas:
- Architecture challenges including variability between providers; proliferation of systems, locations, and endpoints that store data; and the need for accurate, secure provenance to maintain and preserve chain of custody.
- Data collection challenges such as locating forensic artifacts in large, distributed, and dynamic systems; locating and collecting volatile and deleted data; inability to image all forensic artifacts in the cloud; and accessing one tenant’s data without breaching others’ confidentiality.
- Analysis challenges including forensic artifact correlation across and within cloud providers, reconstruction of events from virtual images or storage, metadata integrity, and log data timeline analysis.
- Anti-forensic challenges such as obfuscation, malware, data hiding, or other techniques that compromise evidentiary integrity.
- Incident first responder challenges including confidence, competence, and trustworthiness of the cloud; difficulty in performing initial triage; and processing large volumes of collected forensic artifacts.
- Role management challenges such as identifying an account’s owner, decoupling cloud user credentials from physical users, ease of anonymity and creating fictitious identities online, determining exact data ownership, and authentication and access control.
- Legal challenges including jurisdictional access to data, in particular issuing subpoenas without knowledge of the data’s physical location; and lack of effective channels for international communication and cooperation during an investigation, including with cloud providers.
- Standards challenges such as a lack of even minimum or basic standard operating procedures, practices, and tools, test and validation procedures, or interoperability among cloud providers.
- Training challenges including a lack of cloud forensic training and expertise for both investigators and instructors, limited knowledge of cloud providers’ record-keepers about evidence, and even the misapplication of non-cloud-oriented digital forensic training materials.
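The timeline-analysis and correlation challenges above often come down to reconciling timestamps that different providers record in different formats. A minimal sketch (all log entries, field names, and formats hypothetical) of merging two providers’ logs into a single UTC-ordered timeline:

```python
from datetime import datetime, timezone

# Hypothetical log entries from two cloud providers, each using its own
# timestamp format -- a common obstacle in cross-provider timeline analysis.
provider_a_logs = [
    {"ts": "2021-08-02T14:03:11Z", "event": "vm_start"},
    {"ts": "2021-08-02T14:07:45Z", "event": "file_delete"},
]
provider_b_logs = [
    {"ts": "1627912102", "event": "login"},  # Unix epoch seconds
]

def normalize(entry, fmt):
    """Convert a raw log entry's timestamp to an aware UTC datetime."""
    raw = entry["ts"]
    if fmt == "iso":
        dt = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
    elif fmt == "epoch":
        dt = datetime.fromtimestamp(int(raw), tz=timezone.utc)
    else:
        raise ValueError(f"unknown timestamp format: {fmt}")
    return {"utc": dt, "event": entry["event"]}

def build_timeline(*sources):
    """Merge normalized entries from all sources into one UTC-ordered timeline."""
    merged = [normalize(e, fmt) for entries, fmt in sources for e in entries]
    return sorted(merged, key=lambda e: e["utc"])

timeline = build_timeline((provider_a_logs, "iso"), (provider_b_logs, "epoch"))
```

Real systems would also need to handle clock skew and missing timezone information, which is part of what makes metadata integrity a distinct challenge.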
NIST has also drafted “Four Principles of Explainable Artificial Intelligence,” an outcome of the organization’s research on building trust in the AI systems that are increasingly “involved in high-stakes decisions.”
The four principles for explainable AI are:
- AI systems should deliver accompanying evidence or reasons for their outputs;
- AI systems should provide meaningful and understandable explanations to individual users;
- Explanations should correctly reflect the AI system’s process for generating the output; and
- The AI system “only operates under conditions for which it was designed or when the system reaches a sufficient confidence in its output.”
(Note: these four criteria are similar to those Bollé et al. suggested in their paper, “The role of evaluations in reaching decisions using automated systems supporting forensic analysis,” recapped in our recent article.)
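The fourth principle in particular lends itself to a simple illustration. A minimal sketch (the wrapper, model, and threshold are all hypothetical) of a system that pairs every output with evidence and abstains when its confidence falls below a set bound:

```python
# Hypothetical sketch: a prediction wrapper that pairs each output with
# supporting evidence (principles 1-3) and abstains when the model's
# confidence falls below a threshold (principle 4). All names are invented.

CONFIDENCE_FLOOR = 0.8

def explainable_predict(model, sample):
    label, confidence, top_features = model(sample)
    if confidence < CONFIDENCE_FLOOR:
        # Principle 4: refuse to answer outside the system's reliable range.
        return {"output": None, "reason": "confidence below designed threshold"}
    return {
        "output": label,
        "confidence": confidence,
        # Principles 1-3: evidence that reflects the actual decision process,
        # stated in terms a user can understand.
        "evidence": [f"feature '{name}' contributed {w:+.2f}"
                     for name, w in top_features],
    }

# Toy stand-in for a trained model: returns a label, a confidence score,
# and the two features with the largest absolute weights.
def toy_model(sample):
    score = sum(sample.values())
    return ("flagged" if score > 1 else "clear",
            min(abs(score), 1.0),
            sorted(sample.items(), key=lambda kv: -abs(kv[1]))[:2])
```

The point is not the toy model but the contract: an output never travels without its evidence, and no output is produced at all outside the conditions the system was designed for.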
New curriculum and frameworks for mobile forensics training
The European Union’s FORMOBILE project has introduced its Novel Training in Mobile Forensics curriculum and framework, part of developing an end-to-end forensic investigation chain for mobile devices that includes a new standard for mobile forensics and a suite of new tools.
The framework covers six main phases of an investigation “from crime scene to court,” spanning the entire chain of a mobile forensic investigation, and addresses the major stakeholder groups:
- First responders and investigators
- Mobile forensic analysts and experts from common and specialized labs
- Law enforcement managers with little to no digital forensics expertise
- Prosecutors and judges
The training is based on FORMOBILE’s interactions with law enforcement agencies, which identified gaps and requirements. Through April 2022, the curriculum scope will include a minimum of seven “progressively challenging” non-confidential courses and two confidential courses. These courses consist of online materials, a physical campus week, webinars, and workshops. Role-based certifications will be available.
Digital forensic science: “a golden thread”
Finally, the United Kingdom’s NPCC published its Digital Forensic Science Strategy together with the Forensic Capability Network, the Association of Police and Crime Commissioners, and Transforming Forensics.
Building on and in support of the National Policing Digital Strategy as well as the Policing Vision 2025, the new strategy accounts for the expansion of digital forensic science onto the “front line” of policing, treating it “as the ‘golden thread’ running through the investigative process.”
The strategy addresses improvements to commercial practices and research and development; workforce development; meeting data challenges with infrastructure requirements for processing, analysing and sharing digital forensic data; knowledge management; and trust-building.
Acknowledging the ethical issues arising from unprecedented access to personal data, as well as the potential of technologies like the cloud and machine learning, the strategy also identifies three core challenges — the volume, complexity, and legitimacy of digital data — and the issues they cause, including (among others) a lack of support services, recruitment and retention difficulties, limited awareness of digital forensic science in policing, and limited strategic engagement with academia and industry partners on long-term solutions.