Should digital forensics be standardized? Can it be standardized successfully? These questions have been hotly debated for a number of years, perhaps never more so than in recent years as digital forensics has become increasingly important in both criminal and civil legal proceedings.
In 2017, the United Kingdom announced an ambitious initiative to achieve standards-based accreditation for all forensic labs — both public and private sector — responsible for work on criminal cases. The move came in the wake of a high-profile rape case in which mobile device evidence was mishandled.
A survey from Forensic Focus conducted in 2018 reflected the debate’s extent. Nearly 62 percent of respondents agreed that a formal means of standardization is necessary for the digital forensics community, but opinions remained divided on the best way to accomplish that.
Why standardize?
Standardizing technical organizations — including digital forensics laboratories — is all about credibility. Standards, such as those published by the International Organization for Standardization (ISO) in conjunction with the International Electrotechnical Commission (IEC), help labs to identify and address risks and opportunities through planning and implementation. They also facilitate cooperation between labs, including across borders.
In a digital forensics context, risks and opportunities come from a near-constant rate of change, variances in examiner skillsets and expertise, and a lab’s own resources. These variances can mean that disparate labs might all come up with their own methods based on their unique character, and never know whether results are truly reproducible. That can impact credibility, not just of individual labs, but also the profession as a whole.
That’s too big a risk in a landscape like digital forensics. For example, Gareth Davies, Senior Lecturer in Forensics and Security at the University of South Wales and a member of the Forensic Science Regulator’s (FSR) Digital Forensics Specialists Group, said not all stakeholders understand why parsing a given app may be impossible one month but possible the next. Because that kind of variance can impact case outcomes, stakeholder confidence could be shaken.
For laboratories generally, ISO/IEC 17025:2017 specifies competency requirements, including impartiality and consistency, that help to promote confidence in labs’ work. ISO/IEC 17020:2012 does the same for inspection bodies — or in forensics, those performing field sampling.
Still, one third of respondents to Forensic Focus’ survey disagreed that ISO 17025 covers all necessary aspects of digital forensics, while just seven percent agreed. Thirty-seven percent were neutral, and 23 percent didn’t know. “I believe having a standard is extremely important,” said one respondent, “however given how quickly digital forensics moves and all the different possibilities cases throw up, I’m not convinced ISO 17025 is the best fit for the field.”
Opinion on whether adhering to ISO 17025 would improve an organisation’s processes and prospects was also split. Of the 16 percent who agreed it would, one respondent wrote: “Been doing it for 8 years and what an improvement it makes with the quality of investigations and reports.”
At the same time, though, 42 percent of respondents remained neutral on that point, while 28 percent disagreed the standard would improve anything. Another 14 percent didn’t know.
Several respondents reflected that standardizing digital forensics would be too expensive. “This money could have been spent on training,” wrote one respondent. Another said, “It’s an incredibly time consuming & expensive process – there’s no central governance to go for help, or to share best practice. Everyone seems to be going it alone.”
“17025 was never designed in mind with it being applied to digital forensics,” said Davies. “We don’t have the same sort of physical tools as a traditional forensic science lab… a lot of the things that we do in terms of generating results are done in software, so there are different challenges.”
To bridge the gap between what Davies calls “the foundations of 17025” and the realities of digital forensics, and to help smooth adoption of the standard, the Forensic Science Regulator (FSR) introduced the Codes of Practice and Conduct for the discipline in 2017.
The broad-based quality standard helps to ensure consistency — and therefore confidence — in results across disparate forensic laboratories across the UK. As a result, said Davies, they may use different tools to achieve the same ends. “We’re still talking about the bits and bytes on that chip from the phone,” he explained. “And we should be able to come up with a standard result and our interpretation of it.”
How the UK’s standardization effort is going
In 2017, the FSR was tasked with meeting three requirements to improve quality:
- The equal application of quality standards across forensic science disciplines and delivery services.
- Services’ compliance to a point where procedures continually improve — including the correction of any failures.
- A shared understanding of quality and standards among stakeholders, permeating the U.K.’s Criminal Justice System (CJS).
In June last year, Forensic Focus covered a midyear update which Forensic Science Regulator Dr Gillian Tully delivered at the DFRWS EU conference.
The FSR’s annual report published in January 2021 reflects long-term improvement. Although “the level of compliance in the broader digital forensics field is increasing slowly,” the report reads, digital forensics and the work of digital media investigators are still considered “disciplines with low levels of compliance.”
The accreditation process has indeed been rocky, but Davies anticipates a reduction in overall issues noted from initial assessments. “Because we were brand new to this, there was a lot of learning on the job,” he explained.
Both managers and practitioners who had never before gone through a United Kingdom Accreditation Service (UKAS) assessment learned numerous things that they could use to improve prior to the next assessment.
“We all started paying more attention and taking more responsibility for things within our remits,” Davies recalled. “We ultimately all pulled together as a unit and started to understand the nature of the problem and how it was being assessed in terms of 17025 implementation and the Codes, and we had a clearer picture of what work we had set out in front of us, so we could attack it more effectively.”
What does the standardization process involve?
Davies said the different “layers” of digital forensics can compound the existing challenges in terms of what to prioritize, or accredit first — particularly as casework is ongoing throughout the process, and often as a function of resources, both human and monetary.
To reduce how much law enforcement labs have to do on their own, the Defence Science and Technology Laboratory (Dstl), an executive agency of the Ministry of Defence, has been working to develop digital forensic quality standards (DFQS). Forensic analyst Holly Duns said the DFQS are “primarily supporting law enforcement as they try to navigate accreditation issues.”
The DFQS apply to any police or commercial digital forensics lab that performs casework and is therefore subject to the accreditation requirement. The DFQS include two key elements:
- Ground truth datasets targeted to different parts of the digital forensics process.
- Method validation, for which the datasets can be used.
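The interplay of those two elements can be sketched in a few lines: validation compares what a tool extracts from a test exhibit against what is known to have been planted on it. This is only an illustration; the record structure and field names are assumptions, not part of the DFQS.

```python
# Sketch of method validation against a ground-truth dataset.
# A ground-truth dataset records what is *known* to be on the test exhibit;
# validation compares the tool's extraction against it.

def validate_extraction(ground_truth, extracted):
    """Return (missed, spurious, mismatched) record ID sets.

    ground_truth / extracted: dicts mapping a record ID to its fields.
    """
    missed = set(ground_truth) - set(extracted)    # planted data the tool failed to recover
    spurious = set(extracted) - set(ground_truth)  # data the tool reported that was never planted
    mismatched = {
        rid for rid in set(ground_truth) & set(extracted)
        if ground_truth[rid] != extracted[rid]     # recovered, but a field (e.g. timestamp) differs
    }
    return missed, spurious, mismatched

# Hypothetical example: one planted SMS is recovered with a wrong timestamp.
truth = {"sms-1": {"sender": "+447700900001", "ts": "2021-03-01T10:00:00Z"},
         "sms-2": {"sender": "+447700900002", "ts": "2021-03-01T10:05:00Z"}}
tool_output = {"sms-1": {"sender": "+447700900001", "ts": "2021-03-01T10:00:00Z"},
               "sms-2": {"sender": "+447700900002", "ts": "2021-03-01T09:05:00Z"}}

missed, spurious, mismatched = validate_extraction(truth, tool_output)
print(missed, spurious, mismatched)  # set() set() {'sms-2'}
```

The same comparison can be rerun against any tool or tool version, which is what makes shared ground-truth datasets such a workload saver.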
Davies said the DFQS started with a list of digital forensic processes that Dstl provided to all UK law enforcement agencies. That, he explained, helped labs not just to identify gaps or processes that had been left off the list, but also to begin to prioritize.
Another part of standardization includes validation requirements statements for tools, or as Davies describes, “confirmation that a method has been developed, tested, documented, and processed in accordance with the guidelines contained in the Codes of Practice.”
In other words: “You’re saying you get everything that’s available to a user [and] making sure that things are working as we expect them to work. And we can basically validate that that has completed successfully and most importantly, accurately.”
For example, Davies said the validation requirement statement for hard disk imaging would be to achieve the same result using any write blocker from any manufacturer plus any digital forensic tool of choice.
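In practice, that kind of reproducibility check typically comes down to cryptographic hashing: whichever write blocker and imaging tool are used, the acquired image should hash to the same value as the source. A minimal sketch, assuming SHA-256 and using temporary files to stand in for a drive and its image:

```python
import hashlib
import os
import tempfile

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so large disk images needn't fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def images_match(source_path, image_path):
    """True if the acquired image is bit-for-bit identical to the source."""
    return sha256_of(source_path) == sha256_of(image_path)

# Demonstration: two temporary files stand in for a source drive and its image.
data = os.urandom(4096)  # stand-in for raw disk contents
with tempfile.NamedTemporaryFile(delete=False) as src, \
     tempfile.NamedTemporaryFile(delete=False) as img:
    src.write(data)
    img.write(data)

print(images_match(src.name, img.name))  # True
```

Because the hash depends only on the bits recovered, a matching digest demonstrates the result is independent of the particular write blocker or imaging tool in the chain.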
On the other hand, mobile devices can complicate validation requirements. Updates to firmware, operating systems, and apps happen on a near-constant basis — and methods and tools need to be tested against each.
That’s because every update, whether on the device or app developer side or the forensic tool vendor side, can “break” the process, resulting in mixups between senders and recipients, date and time stamps, or other data that can have “huge impacts in terms of justice,” said Davies.
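One way to catch that kind of breakage is a regression check: re-parse a known test dataset after every update and flag any message whose sender, recipient, or timestamp has drifted from the baseline produced by the previously validated version. The message structure and field names below are illustrative assumptions, not any particular vendor's format:

```python
# Sketch of a post-update regression check: compare a fresh extraction of the
# same test data against the baseline from the previously validated tool version.

FIELDS = ("sender", "recipient", "timestamp")

def regression_diff(baseline, rerun):
    """Map message ID -> list of (field, old, new) for every drifted field."""
    drift = {}
    for msg_id, old in baseline.items():
        new = rerun.get(msg_id)
        if new is None:
            drift[msg_id] = [("missing", old, None)]  # update broke recovery entirely
            continue
        changed = [(f, old[f], new[f]) for f in FIELDS if old[f] != new[f]]
        if changed:
            drift[msg_id] = changed
    return drift

# Hypothetical failure mode: an update swaps sender and recipient.
baseline = {"m1": {"sender": "alice", "recipient": "bob",
                   "timestamp": "2021-05-01T12:00:00Z"}}
rerun    = {"m1": {"sender": "bob", "recipient": "alice",
                   "timestamp": "2021-05-01T12:00:00Z"}}

print(regression_diff(baseline, rerun))
# {'m1': [('sender', 'alice', 'bob'), ('recipient', 'bob', 'alice')]}
```

An empty diff doesn't prove the update is safe, but any non-empty diff is exactly the sender/recipient or timestamp mixup Davies warns about, caught before it reaches casework.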
At the same time, actually testing methods and tools can add workload. Reliance on ground truth datasets can help to reduce the load, allowing the examiner to focus on testing the tools and methods themselves. But this can create an opportunity cost to casework.
The same goes for variances in examiner expertise. Training and experience aren’t standard, and even the same tool can yield different results in different hands. That can be particularly true, said Davies, when it comes to devices an examiner has never seen before. “A competent member of staff should only be doing what they are able to do via their qualifications and their training,” he said, “and the fact that they’ve been certified up to that level of expertise.”
Another confounding factor comes from a certain tension between physical and digital forensics disciplines. When it comes to vehicles, Davies said, crime scene technicians need to collect fingerprints, DNA, and/or hair among other samples.
Frequently, that needs to happen before digital forensics examiners look at the data, because the other evidence could degrade. But it can also impact digital data. “They’re opening the doors. They’re climbing around,” he said. “It’s overwriting old data which we may or may not need, but from a crime scene point of view, preserving the original data becomes tricky. You’re creating all these new events.”
Meeting and maintaining a standard
Standardizing operations across labs of all sizes, capacities, and requirements has led to its own set of challenges. For one, larger agencies face different challenges than smaller ones, including case and evidence volumes.
Besides differences in resources to handle those issues, said Davies, lab managers’ purchasing decisions tend to be based on which tools work best for the work they see — routine versus complex work, for example, or specialized investigations such as crimes against children or counterterrorism.
That’s a result of the mandate to review cases independently — using validated methods and tools — to come up with investigative findings that can be compared and contrasted to another lab’s. “People go about solving objectives differently, in a different order,” Davies explained.
Furthermore, the labs are implementing ISO 17025 largely in isolation. That’s not solely because of the pandemic, though the pandemic doesn’t help. These challenges, together with resources and capacity, add up to an uneven standardization process.
For example, Davies explained, a smaller lab with lower staff turnover might have more experienced people and a lower work volume than a larger lab. “So it’s easier for them to concentrate on and get a handle on [accreditation],” he said. “In some ways the labs are challenged by the fact that implementing ISO and the Codes is very much a full time job for a number of people.”
That’s in part because the implementation is ongoing — not a one-off. “We’re all on different parts of the journey,” said Davies. “Some people have successfully implemented various parts of the standard and the Codes of Practice. So some people are perhaps further down that road than others.”
On the other hand, Davies said, law enforcement agencies have been working together to achieve different levels of accreditation. “[They’re] doing interoperation comparison results for testing of methods and tools… developing their own methods validation [and] sharing that with their local partners, enabling each other,” he explained.
Automation may be another way to clear this hurdle. Stating that wider changes — associated with the National Police Chiefs Council (NPCC)’s Digital Forensic Science Strategy — would need to be made for full compliance to be possible, the FSR’s report noted a first step: an automated digital forensic service for child sexual exploitation (CSE) cases.
This effort is being undertaken by the Forensic Capability Network (FCN), a new organization established to deliver on the UK’s Transforming Forensics (TF) Programme. The FCN is responsible for deliverables across multiple disciplines; one of its key deliverables is a centralized quality management system.
A culture of quality vs. a culture of bureaucracy
Observing “frequent complaints about form-filling and bureaucracy” — reflected also in Forensic Focus’ 2018 survey — the FSR’s report stated: “The requirements are not themselves bureaucratic, but the manner in which a particular organisation chooses to implement them may be bureaucratic” — and that, the report added, could make the difference for effective implementation.
Indeed: “At the moment valuable time is spent not processing case work but checking others’ work or following a tick box regime,” wrote one Forensic Focus survey respondent, “rather than empowering people to think for themselves, solving problems in a logical way appropriate to the investigation in hand.”
Another was concerned that the requirements had “massively increased the time taken to examine an exhibit, with little or no benefit in return,” while a third worried: “It is liable to create too much emphasis on having the accreditation, for which organisations are spending an obsessive amount of time on, in turn neglecting the core role of doing digital forensics.”
Davies believes this kind of feedback can be attributed to inadequate cultural change management. Senior managers have a responsibility, he said, “to ensure acceptance of a culture and expectation of adherence to process and standards.”
To do that, managers have to be willing to explore why employees feel the use of validated methods is slowing their process. “Sometimes we’re reluctant to change historical approaches and practices,” said Davies. However, he added, “A validated method may also be seen as an attempt to de-skill a particular job or process… so an investigator may feel a lack of being an individual at that level.”
Change management involves demonstrating the benefit or the opportunity to the employee, as well as providing opportunities to take advantage. “Having a defined approach to a regular problem may enable skill to be derived through what I’ve called muscle memory,” said Davies. “I would suggest validated methods are well-defined and we’ll have a minimum time for completion dependent on the experience and expertise of the investigator.”
The ability to produce standard documentation, operating procedures, and templates that help define a task’s competencies makes it possible to train employees on specific methods, helping to develop their confidence and competence even further — and in turn, helping to speed the process.
“One thing 17025 and the Codes have done is help examiners explore opportunities for training and up-skilling because it’s a fundamental requirement,” Davies said — something that hasn’t always been possible owing to resource limitations.
Because this level of professional development improves the overall process, larger problems such as backlogs and time pressures are reduced. In addition, said Davies, evidence that makes its way to “the right person… protects the lab in terms of the work and the results in producing their integrity and also the examiner, because, it’s unfair to give an examiner a job they can’t do well and expect them to do it to an evidential standard.”
That’s a result borne out in the FSR’s report, which noted that in spite of a certain “spectrum of maturity in culture,” there were also “signs of changing attitudes and it has been encouraging to note a few police forces improving the senior officer oversight of quality in the past year, in an aim to spread quality culture beyond the forensic services department.”
In one case, the report noted, standardized “workflows, methods and training… has resulted in fewer Crown Prosecution Service (CPS) requests for additional work, fewer defence challenges and fewer court appearances” with the result that examiners could “acquire and process more exhibits within their service level agreements, release staff to attend scenes and release staff into other areas of digital forensics.”
Accreditation vs. new and novel methods
“Much of traditional forensics work already has an established stable base and a slow moving evolution of areas of investigation,” said Davies, contrasting the “atomically predictable” human body with digital devices that look alike on the outside, but internally — not just in terms of components but also apps and customizations — can look very different.
“It complicates the problem so much more,” Davies said. “[Devices are] changing in terms of updates, functionality…. Things can be patched and then what we could do on a Friday, we come back into the lab on a Monday and a software update has broken that capability for us,” he explained.
“Tools that were used yesterday may no longer work today [or devices] might go away forever, and then you’ve got a whole new device coming in that you don’t have a [standard operating procedure] in terms of interpretation or evaluation to place against it.” Another complicating factor: cloud storage.
These trends will simply accelerate, said Davies, as the internet of things (IoT) continues to grow. “The Locard principle is changing the possibilities of downloading data from objects that have not previously contained information, been networked, or interacted with one another, because we’ve got devices talking to devices and passing information along,” he said, pointing to examples like a smartphone leaving “footprints” on a home’s wi-fi router or Bluetooth devices.
The rapid rate of change in digital technology, coupled with its complexity, means some methods will need to become part of forensic labs’ repertoire before they’ve been fully validated or accredited, although, Davies said, “the regulator has had the foresight to include… new and novel techniques” within the Codes of Practice.
That’s part of Dstl’s work on the DFQS. As Dstl works to deliver the highest priority ground truth datasets, said Duns, it is also looking ahead to challenges with novel techniques — those which U.K. labs will turn their focus to after accrediting more common and accepted techniques.
“As part of our datasets work it is the intention to provide the support so that they can in the future be formally validated,” said Duns. “When we are evaluating our research we always seek to understand the potential for data modification in accordance with the necessary [Association of Chief Police Officers (ACPO)] guidelines.”
But even that’s challenging, said Davies, to replicate for training purposes: it’s virtually impossible to break hard disks or chips consistently enough for trainees to repeat a particular technique — much less deal with a break “in the wild.”
Duns said labs can rise to these challenges by deploying novel techniques “with care in the hands of trained and competent practitioners, as they do with any other complex techniques.” These practitioners could undertake some local verification to provide confidence in casework results.
Davies said another way around these issues is to treat the data derived from novel methods as intelligence rather than evidence. That can be challenging for police as well as attorneys who are trying to build cases, but the alternative, Davies said, is to testify about systems the experts know so little about.
Davies offered vehicle infotainment and telematics systems as an example. “None of us are real experts in that area,” he said. “We’re still learning so much, and the systems differ so much, and they change and they update…. to talk about the accuracy and the integrity of the data we get out of said systems — well, I know very little about how they were programmed, engineered, and ultimately how and why they store this data.”
He compared it to the early mobile feature phones, when Nokia, Symbian, LG and Philips were among the names sharing the market before iOS and Android smartphones standardized and stabilized the landscape. “Everybody had their own operating system. Everybody had their own interface cable,” he recalled.
With vehicles, limited forensic tool sets for such a variety of systems make validation difficult. Moreover, said Davies, controlled testing has demonstrated that some systems record events incorrectly. “So how trustworthy are these systems, and the data they are providing us?” he asked.
These are questions that can complicate the prospect of testifying in court as an expert witness. Some vehicle data, such as braking or acceleration events and doors opening, can’t be corroborated against a paired mobile device’s data.
Even the similarities between legacy and novel systems are limited, making it difficult for specialists in one area to lend help to another. For example, vehicles rely heavily on NAND flash chips just like many mobile devices. However, parsing, interpreting and analyzing the data from a chipoff or ISP acquisition would be completely different, said Davies, because the file systems and the data types aren’t the same.
On the flip side are kiosks: data acquisition tools designed for non-specialists, such as patrol officers, for use at police stations. “Traditionally, in UKAS accreditation there is an emphasis on user competency,” said Duns.
“However, with kiosks this is reversed. [They] have to be deemed competent over the user, meaning there is a lot of additional work being undertaken on the validation and verification work to ensure the kiosks are fit for purpose.” To that end, said Duns, Dstl is working in conjunction with the FCN and Staffordshire Police to update the national kiosk validation package around quality assurance.
ISO 17025 isn’t all there is to standardization, of course. Evidence evaluation, tools, and training are all subsets, and some newer technologies — think blockchain and cloud collaborative portals — are being explored to help standardize cross-jurisdictional efforts. Whether full standardization can ever be achieved globally remains to be seen, but it is a conversation — and an effort — that is far from concluded.
Excellent article!
Things in principle don’t seem to have changed that much since the Survey.
Incidentally, it was 2017 not 2018 and although Forensic Focus gave it a great deal of coverage it didn’t commission it – it was the initiative and work of three experienced practitioners/teachers/trainers, of whom I was one.
The central problems of reliability of digital evidence are not fully solved by certificates and accreditations within “forensic science” but must include the role of expert evidence.
If you tell someone they are a regulator then the most likely thing they will do is issue a collection and hierarchy of regulations. If someone is tasked with assessment of competence they will look for compliance to standards. Neither of these routes guarantees the courts reliability in digital evidence.
To rehearse a few salient problems:
• ISO17025 is a laboratory calibration standard which suits only some aspects of forensic science activity – the sort which is dealt with by single tests and standard operating procedures
• The continuing vast rate of change in digital hardware, software and commercial/social usage means that fully tested tools and procedures will always lag the use/deployment of digital devices and the evidence they potentially produce in the outside world
• Most digital forensic practitioners rely on extensive, would-be comprehensive suites of investigatory tools which are subject to frequent updating so that there is never a point at which they can be properly tested
• Although there are a few situations in which the mere discovery of a file or artefact is sufficient – an obvious example is the strict liability requirement in the IIOC possession offence – in most situations the witness before the court has to explain the significance of a sequence of events – necessary, for example for “making” and “distribution” offences. Or, to show that particular files and dates and times demonstrate unauthorised access, or deliberate acquisition of terrorist material, or planning with others to acquire and distribute narcotics, or to commit a fraud. All these latter activities are really the province of expert opinion evidence
• Some of the problems of novel and reconstructive evidence can be resolved in meeting between experts – under CPR 19.6 (see also https://bit.ly/3xsyvnS).
The current sole reliance on forensic science regulation is doomed; it has a role in a limited number of circumstances. Accrediting laboratories seems a curious route when evidence is given in court by individual human beings. There is currently no system of accreditation for expert witnesses which applies to both prosecution and defence. (Police and prosecutors can make use of a semi-formal scheme run by the NCA.)
Two current cases in which reliability is important are before the courts. The Post Office civil case is substantially about reliability, though so is observance of disclosure protocols. In the on-going Operation Venetic cases the collector of evidence, French law enforcement, refused to provide any detail of their methods and violated all of the four ACPO Principles, yet evidence of the content of messages between (for the most part) suspected narcotics traffickers is currently being allowed without any ability to test.
Thanks for the comment Peter, that’s really useful!
Just to clarify, the survey we refer to in the article is the one Forensic Focus sent out to our readers in 2018, not the 2017 one about ISO 17025.