OSAC And Standards In The Digital Evidence World

by Steve Johnson, CLPE, CFA, Standards Ambassador – Organization of Scientific Area Committees (OSAC) for Forensic Science

In 1998, as the personal computer and cell phone industries were exploding, the Scientific Working Group on Digital Evidence (SWGDE) was formed to meet the emerging need for sound, scientific standards, practices, and guidelines in the burgeoning digital evidence realm. The federal government, through the National Institute of Justice (NIJ), had created the first Scientific Working Group (SWG) in 1988, for DNA, as that discipline became a more practical and reliable tool for criminal investigation. Many other SWGs were created over the next two decades as law enforcement agencies and forensic laboratories became more sophisticated and were increasingly relied upon to solve crimes. SWGDE is one of the oldest SWGs in existence and remains one of the few (along with the Facial Identification Scientific Working Group and the Scientific Working Group on DNA Analysis Methods) that continue to exist and meet regularly. Most of the other 22 SWGs lost their federal funding in 2013.

In order to fill the void created by the loss of these valuable teams of forensic experts, the Department of Commerce, through the National Institute of Standards and Technology (NIST), created and funded the Organization of Scientific Area Committees (OSAC) for Forensic Science. At its inception, OSAC comprised 23 forensic discipline-specific subcommittees, which did not include digital evidence. The forensic community soon recognized the need to add a Digital Evidence Subcommittee to represent the tens of thousands of forensic science service providers (FSSPs) working in this area.

OSAC works to strengthen forensic science through the development, promotion, and implementation of technically sound, consensus-based standards.

In the summer of 2022, SWGDE was officially incorporated as a standards development organization (SDO) and is recognized by OSAC in that capacity. This is important to the forensic digital evidence community inasmuch as approved digital evidence standards are now eligible to be listed on the OSAC Registry. SWGDE has dozens of members representing over 70 law enforcement, academic, commercial, and other stakeholders in the digital evidence community. Given its status as an SDO, SWGDE is expected to abide by certain regulations and “best practices” of operation. I am humbled and honored to have been asked to serve on an external audit committee that annually reviews SWGDE’s document development protocols, meeting and attendance requirements, and other mandated procedures to ensure the highest level of quality in creating these standards and guidelines. I’m happy to say that SWGDE is meeting every milestone and abiding by its mission.

Standards are only beneficial if they are used, and FSSPs are encouraged to implement the standards on the OSAC Registry. Given the importance of standards implementation, I want to take a moment to share a little more about that effort. Many (if not all) of us are aware of the deficiencies identified in the 2009 National Academy of Sciences (NAS) report “Strengthening Forensic Science in the United States: A Path Forward”. One of these deficiencies was a lack of standardization across a number of disciplines. Application of standards developed through a consensus-based process goes a long way toward addressing that gap. OSAC’s mission includes outreach, communication, and support to FSSPs to encourage them to implement standards on the OSAC Registry. Those FSSPs who have embraced implementation are encouraged to complete a declaration form and declare as implementing bodies. By doing so, agencies, organizations, and other FSSPs are awarded a certificate and become integral members of the growing cohort of OSAC implementers.

OSAC’s initial Registry implementation outreach focused on “traditional” FSSPs: the 423 forensic laboratories that the Bureau of Justice Statistics recognizes as publicly funded facilities. Over the last eighteen months, however, OSAC’s outreach efforts have expanded to “non-traditional” FSSPs, engaging these stakeholders in the implementation of standards on the OSAC Registry. The practitioners who work outside of these publicly funded labs have been estimated to number in the tens of thousands and come from a variety of backgrounds, ranging from small municipal or county laboratories to individual practitioners who contract to law enforcement. In an ideal forensic system, they would all be working in accordance with the same best practices and standards to ensure consistent results are achieved and delivered.

In the fall of 2022, OSAC added a specific position to assist with the implementation outreach effort and expand it to non-traditional FSSPs, such as those in the digital evidence discipline. I was awarded the opportunity to be part of the team, joining Mark Stolorow (the past OSAC Director) as a “standards ambassador” to engage with forensic science stakeholders. By the time I began this journey with OSAC, the organization had identified and acknowledged 100 FSSPs that had implemented (either fully or partially) 129 standards on the OSAC Registry. Up to that point, many FSSPs had been solicited through surveys sent out to the forensic science community over the previous two years. Since I came on board in the fall of 2022, another 52 FSSPs have added their names to the list of implementers, and the OSAC Registry has grown to nearly 180 standards. Although there are currently only five digital evidence standards on the Registry, there are over 100 pre-existing or “in development” standards, guidelines, and other documents, many of which will find their way to it. All told, there are standards and guidelines at every stage of the standards development process, including another 60+ published standards that are eligible for the OSAC Registry, 170+ standards under development at a standards development organization (SDO), and 165+ standards being drafted within OSAC. Suffice it to say, a great deal of progress has been made since OSAC’s inception in 2014!

Speaking of 2014: in February of this year, OSAC will celebrate its tenth birthday! I am proud to say that I’ve been involved with the organization since before its establishment, dating back to the first joint meeting of the initial forensic association representatives in the winter of 2014 at the National Institute of Standards and Technology (NIST). It was at that meeting that the seeds of this enterprise were sown. Since then, the growth of OSAC from seedling to what it has become today has borne important fruit, and the organization continues to bring scientifically sound standards to the forensic science community and its stakeholders. Starting as the IAI’s representative to OSAC’s Forensic Science Standards Board (FSSB) in 2014, being elected chair of the FSSB in the fall of 2017, becoming (and continuing to serve as) a member of the Facial & Iris Identification Subcommittee, and now serving as a Standards Ambassador for OSAC, I’ve been on the front lines of this enterprise and am excited for the future of the standards implementation effort.

OSAC has initiated a number of outreach efforts to engage with FSSPs and to evaluate the impact of standards implementation. The OSAC Registry Implementation Surveys, distributed in the summers of 2021 and 2022, provided summaries for each discipline with standards available on the OSAC Registry. Since then, OSAC has established an “open enrollment” period during which FSSPs are encouraged to update or initiate their implementation activities using the OSAC Registry – Standards Implementation Declaration Form. To streamline and simplify the declaration process, we are currently establishing an electronic platform that will give current and potential new implementing FSSPs the opportunity to easily update or complete a declaration form online. Additionally, over the past year, the FSSB formed an Implementer Cohort Task Group, made up of a diverse group of professional organizations, authoritative bodies (e.g., accrediting and certification bodies), practitioners, OSAC leadership, and OSAC Program Office staff, to help support future implementation initiatives.

Looking ahead, OSAC will expand its engagements to include more communications with legal, academic, and corporate stakeholders. In the meantime, as the 2024 fiscal year is well underway, more volunteers have come on board to support the OSAC enterprise. Many members whose terms recently expired have brought their volunteer services and passion for forensics to other sections and subcommittees, where they will continue to support OSAC as affiliates. In my service as Standards Implementation Ambassador for OSAC, I intend to reach out to as many stakeholders in the digital evidence community as possible, since the thousands of FSSPs in this discipline have as much invested in the process as anyone. I encourage digital evidence practitioners and FSSPs to follow the progress of OSAC and consider getting more involved in the organization to help shape the future of forensic science standardization. After all, improving the forensic sciences through the development of sound standards and guidelines is critical to the successful performance of thousands of forensic science service providers, especially those providing forensic support in the fast-growing cyber world.

Steve Johnson is a retired law enforcement supervisor with a background in latent print examination, crime scene investigation, forensic art, and facial identification. He is a Senior Advisor for Ideal Innovations and is currently contracted to the National Institute of Standards and Technology (NIST) Organization of Scientific Area Committees for Forensic Science (OSAC). Mr. Johnson is a Past President and Board Chair of the International Association for Identification (IAI) and was the IAI representative to the OSAC Forensic Science Standards Board (FSSB), serving as Chair from 2017 to 2020. OSAC’s mission is to strengthen “the nation’s use of forensic science by facilitating the development and promoting the use of high-quality, technically sound standards”.

Digital Forensics Standards In Q1 2021

The items in our roundup this spring build on many of the updates from our January roundup, including new drafts available for public comment and additional work on standardization projects in the United Kingdom and European Union.

Additionally, a development in the South African digital forensics industry highlights the complexities of standardization, including some of the industry and political forces at work in ensuring the highest quality digital forensic evidence.

SWGDE drafts available for public comment

In advance of its June meeting, the Scientific Working Group on Digital Evidence (SWGDE) posted six draft documents for public review and comment:

  • Best Practices for Drone Forensics v1.0
  • Best Practices for Forensic Audio v2.4
  • Best Practices for Vehicle Infotainment and Telematics Systems v3.0
  • Establishing a Quality Management System for a Digital and Multimedia Organization under ISO-IEC 17025 or 17020 v1.0
  • Technical Overview for Reverse Projection Photogrammetry v1.0
  • Best Practices for Teleworking and Digital Forensics v1.0

Instructions for submitting comments are on the first page of each draft document. All feedback received prior to SWGDE’s next meeting will be reviewed by the appropriate subcommittee at that meeting.

In addition, at January’s meeting, SWGDE voted to release the following Approved documents:

  • 2021-01-14 SWGDE Guidelines for Video Evidence Canvassing and Collection v1.0
  • 2021-01-14 SWGDE Overview Artificial Intelligence Trends in Video Analysis v1.0 *

                (*formerly titled SWGDE Informational Overview: Computer Vision)

Project LOCARD joins EU criminal network analysis project

In March, Project LOCARD announced that it had joined with sister project ROXANNE, a consortium of law enforcement agencies, industry and academia, towards “promoting common collaboration frameworks… and developing more advanced tools” in addressing cybercrime.

As a reminder, LOCARD’s goal is to develop a chain-of-custody framework that relies on blockchain technology to secure digital evidence; ROXANNE’s is to develop tools that accelerate law enforcement investigative processes, including an artificial intelligence-driven interactive platform combining network analysis with advanced text, speech, and language technologies to identify large-scale criminal organizations.

One aspect of this: research and tools to detect deviant online behaviour via natural language processing (NLP), which researcher Constantinos Patsakis said in a presentation delivered “very good results when handling large chunks of data.”
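To make the hash-linking idea behind a blockchain-backed chain of custody concrete, here is a minimal Python sketch. It is purely illustrative (not LOCARD’s actual design): each custody record commits to the hash of the previous one, so editing any earlier entry invalidates every later link.

```python
import hashlib
import json


def record_hash(record: dict) -> str:
    """Deterministic SHA-256 over the record's canonical JSON form."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()


def append_entry(chain: list, handler: str, action: str, evidence_id: str) -> None:
    """Append a custody entry that commits to the previous entry's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    entry = {"handler": handler, "action": action,
             "evidence_id": evidence_id, "prev_hash": prev}
    entry["hash"] = record_hash({k: v for k, v in entry.items() if k != "hash"})
    chain.append(entry)


def verify_chain(chain: list) -> bool:
    """Re-derive every hash and link; any edit to an earlier entry fails here."""
    prev = "0" * 64
    for entry in chain:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if entry["prev_hash"] != prev or entry["hash"] != record_hash(body):
            return False
        prev = entry["hash"]
    return True


chain = []
append_entry(chain, "officer_a", "seized", "HDD-0042")
append_entry(chain, "examiner_b", "imaged", "HDD-0042")
assert verify_chain(chain)
chain[0]["handler"] = "someone_else"   # tampering with an earlier record...
assert not verify_chain(chain)         # ...breaks verification
```

A production system adds signatures, timestamps, and distributed replication on top of this linking; the sketch shows only why retroactive edits become evident.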

Forensic Capability Network announces new project developments

The United Kingdom’s Forensic Capability Network (FCN) reported a new development from the Transforming Forensics programme with regard to its CSE Automate Project. According to the FCN’s March 2021 newsletter, the project encompasses three different technology options:

  • An Amazon Web Services (AWS) orchestration platform to automate data ingestion, processing, and analysis so that end users can review the results and report from a single system.
  • Robotic Process Automation (RPA) to handle repetitive tasks in the CSE workflow, saving time and effort for human users.
  • API/CLI linking multiple digital forensics examination tools together, via Magnet Automate, to produce an automated workflow for both computer and mobile device examination.
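The API/CLI linking approach can be sketched generically. The tool names and flags below are hypothetical placeholders, not real products; the point is only the pattern of chaining acquisition, processing, and reporting steps so that one failure halts the workflow.

```python
import subprocess
from pathlib import Path

# Hypothetical tool names and flags, for illustration only; a real deployment
# would substitute the vendor CLIs actually being linked together.
PIPELINE = [
    ["acquire-tool", "--source", "{device}", "--out", "{workdir}/image.e01"],
    ["process-tool", "--image", "{workdir}/image.e01", "--out", "{workdir}/artifacts"],
    ["report-tool", "--artifacts", "{workdir}/artifacts", "--out", "{workdir}/report.html"],
]


def build_commands(device: str, workdir: str) -> list:
    """Fill in the per-case device and working-directory paths for each step."""
    return [[arg.format(device=device, workdir=workdir) for arg in step]
            for step in PIPELINE]


def run_pipeline(device: str, workdir: str) -> None:
    """Run each step in order; check=True aborts on the first failing tool."""
    Path(workdir).mkdir(parents=True, exist_ok=True)
    for cmd in build_commands(device, workdir):
        subprocess.run(cmd, check=True)
```

Each step's output path feeds the next step's input, which is what lets an orchestrator present a single automated workflow to the end user.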

According to service development manager Adam Korol, each of the three is being trialled within separate police forces.

“We aim to create a variety of examination workflows with different applications dependent on the case specifics,” Korol said. “These are likely to include methods of live examination, rapid examination, standard examination and extended examination of forensic images.”

The CSE Automate project overall seeks to deliver an entire child sexual exploitation (CSE) casework system solution. Besides the three automated technological solutions, it covers training and competency frameworks; policy and guidance input (in part to improve CSE investigators’ mental health); validation and verification of the automated service; and measured benefits and evaluation mapping across the project’s lifecycle.

“Placement activity runs until October 2021 and work to finalise the service elements described above is scheduled to complete in April 2022,” said Korol. “At that time we plan for the service to be hosted on the FCN Exchange platform.”

Another project involves amending FCN’s draft guidance for forces on handling legacy digital forensics data. With the draft submitted to the National Police Chiefs Council (NPCC) Quality Board in March, the final publication is expected later in the year.

Finally, the FCN Xchange platform is now live and undergoing testing. Developed by the FCN in conjunction with the NPCC Transforming Forensics programme, the cloud-based Xchange is designed to “provide nationally consistent, standardised processes,” thereby improving both turnaround times and evidence quality.

Its browser-based interface facilitates connections between FCN’s members in England and Wales so that they can share data and services and access new digital forensics tools.

Although digital forensics data isn’t yet being shared on the platform — a digital fingerprint capability is its first technical release — FCN’s electronic quality management system is scheduled to move to the platform by November this year.

Debate in South Africa highlights standardization’s complexities

Finally, a recent attempt by South Africa Chapter 91 of the Association of Certified Fraud Examiners (ACFE) to mandate Professional Standards for Digital Forensic Practitioners in South Africa demonstrates the tension between the need for standardization, and who is in the best position to oversee it.

Added to its other standards for fraud investigation professionals — including healthcare fraud, document and polygraph examiners, and others — the new document “adopts and underwrites the International Organisation of Standardisation’s (ISO/IEC) 27037 [Security Techniques − Guidelines for identification, collection, acquisition and preservation of digital evidence] and 27043 [Standard on Information Technology − Security techniques − Incident investigation principles and processes] for digital forensics in South Africa.”

The framework for investigation methodology and reporting includes fairly standard digital forensic best practices, such as the need to follow established law and standard operating procedures (SOPs), maintain chain of custody, make forensic copies of evidence, ensure an examination is repeatable and reproducible, etc.
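The forensic-copy requirement is conventionally enforced by hashing: the copy is acceptable only if its digest matches the original's, and re-running the comparison later demonstrates repeatability. A minimal sketch (SHA-256 is chosen here for illustration; a given SOP may mandate specific algorithms):

```python
import hashlib


def sha256_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large images need not fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_copy(original: str, copy: str) -> bool:
    """A forensic copy is acceptable only if its hash matches the original's."""
    return sha256_file(original) == sha256_file(copy)
```

In practice the original's hash is recorded at acquisition time, so later verifications compare against the documented value rather than re-reading the source media.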

But Jason Jordaan, founder and managing director of DFIR Labs, has expressed some concerns about the move. In a statement delivered to the chapter, he wrote that existing international digital forensic standards covering accepted scientific methodologies (including ISO/SANS 27037, ISO/SANS 27043, ISO/SANS 27041, ISO/SANS 27042, and ISO/SANS 27050) should suffice, given that there is “at this stage no legal imperative to change anything in this regard.”

Jordaan further cautioned that part of a draft Cybercrimes Bill, once signed, would codify the adoption of official SOPs for digital forensics practice throughout South Africa. This, he wrote, would be a more appropriate means of standardizing the industry.

Drawing a distinction between the science of digital forensics and the investigation of fraud, Jordaan welcomed ACFE-SA’s suggestions for digital forensic practitioners. However, he argued that including the concepts from various standards doesn’t, in itself, constitute a standard. “My concern with this is that it acts to simplify the entire digital forensics process, and the scientific and legal methodologies that must be employed,” he wrote.

Other concerns included ACFE-SA’s attempt to define functional positions — and their qualifications — differently from the way “relevant scientific literature” defines them, as well as overemphasis on provider capabilities versus what he called “the key aspects of capacity and infrastructure as set out in appropriate ISO standards.”

Forensic Focus is interested in covering more stories about the implementation of new technology and standards in different countries and regions across the globe. If you know of an initiative in your region that you think we should cover, please email christa@forensicfocus.com with more information!

Digital Forensics Standards Update: Calls For Training And Public Comment

As 2020 drew to a close, demand for digital forensics and investigations had perhaps never been higher. The COVID-19 pandemic continued to accelerate many forms of digital crime, particularly crimes against children and various types of fraud.

At the same time, the technology used to investigate and analyze these crimes continues to evolve. With the potential for profound impact on people’s livelihoods, lives, and liberty, these tools and the processes they facilitate are still the subject of efforts to standardize them, ideally to improve the entire industry.

This quarter we look at updates from Project LOCARD, FORMOBILE, the National Institute of Standards & Technology (NIST), the Scientific Working Group on Digital Evidence (SWGDE), and the Forensic Capability Network (FCN).

Upcoming Project LOCARD webinar

A February 23 webinar will discuss current issues and challenges with digital forensic evidence handling, in particular chain of custody. “A new common approach to manage cross-border digital evidence” will cover how European law enforcement agencies transfer digital evidence between different European countries — currently an inefficient and laborious process.

The webinar will also offer a platform demonstration of Project LOCARD, which is currently in its development phase. Its data workflows, storage and analyses are in the process of being enhanced, tested, and validated in realistic environments. Interconnectivity and the creation of an information exchange standard will be areas of focus in 2021, with the project targeted for a 2022 completion date. Read more about the project’s public deliverables here.

FORMOBILE: Call for training participants

At the end of September, Europe’s FORMOBILE project began to seek participants for its novel training pilot program. First responders, analysts, experts, investigators, prosecutors, judges, and managers are all invited to participate, with an eye toward evaluating both content and efficiency.

This is an opportunity not just for personal professional development, but also for participants to learn more about the standard and tools created in FORMOBILE and to have a direct impact on the delivery of mobile forensics base knowledge to law enforcement, relevant non-profit organisations, and academia.

For more information, including answers to frequently asked questions and additional information about available courses, visit FORMOBILE’s site here.

In October, FORMOBILE also took part in the European Commission’s Directorate-General for Migration and Home Affairs (DG-Home) Community of Users Workshop on forensics. About half of the workshop’s 200 attendees joined the break-out session for digital forensics, which covered 11 projects in total.

Topics included challenges and solutions in research, involvement and deployment in operations, and synergies, transfers, and adoption of standards in terms of exploitation. In its blog, FORMOBILE reflected on the need for continued collaboration with the other projects and expressed plans to communicate new cooperative efforts.

SWGDE seeks public comment on 4 drafts

The Scientific Working Group on Digital Evidence (SWGDE) has posted draft documents for public review and comment: 

  • SWGDE Best Practices for Forensic Audio v2.3
  • SWGDE Best Practices for Teleworking and Digital Forensics v1.0
  • SWGDE Guidelines for Video Evidence Canvassing and Collection v1.0
  • SWGDE Informational Overview: Computer Vision v2.0

SWGDE’s policy is to post draft documents for a minimum of 60 days for public comment. Comments are accepted via email per the instructions on the first page of each draft document.  All feedback received prior to the group’s next meeting in June 2021 will be reviewed by the appropriate subcommittee at that meeting.

(Want to know more about how SWGDE works? Read our recent article!)

At the conclusion of its September meeting, SWGDE additionally voted to release numerous Approved documents. They are available for download on the Current Documents page of the website. Among them are several documents of particular interest to the digital forensics community:

  • Best Practices for Archiving Digital and Multimedia Evidence_v1.0
  • Best Practices for Digital Evidence Acquisition from Cloud Service Providers_v1.0
  • Best Practices for Examining Magnetic Card Readers_v3.1
  • Best Practices for Mobile Device Evidence Collection & Preservation Handling and Acquisition_v1.2
  • Best Practices for Mobile Device Forensic Analysis_v1.0
  • Core Competencies for Embedded Device Forensics_v1.0
  • Practical Considerations for Submission and Presentation of Multimedia Evidence in Court_v1.0
  • Technical Notes on Internet of Things Devices_v1.0
  • Test Method for Bluetooth® Module Extraction and Analysis_v1.1
  • Test Method for Skimmer Forensics – Analog Devices_v1.0
  • Test Method for Skimmer Forensics – Digital Devices_v1.0

For those working in audio/video forensics:

  • Best Practice for Frame Timing Analysis of H.264 Video Stored in ISO Base Media File Formats_v1.0
  • Best Practices for Enhancement of Digital Audio_v1.2
  • Considerations for the Use of Time-Based Analysis of Digital Video for Court_v1.0
  • Core Technical Concepts for Time-Based Analysis of Digital Video Files_v1.0
  • Fundamentals of H.264 Coded Video for Examiners_v1.0
  • Video and Audio Redaction Guidelines_v2.0

SWGDE encourages stakeholder feedback, and suggestions for modifications to any document are welcome. Please use the “Submit Comments” link beside the listed document to provide feedback.

Forensics @ NIST: Digital & Identification Evidence segment

In November, NIST held its annual two-day forensics symposium, which drew 1750+ registrants from around the world. The virtual Digital & Identification Evidence segment covered updates to its six major programs. Of note:

  • Barbara Guttman shared a brief update on the NIST Digital Forensics Black Box Study, which she described for Forensic Focus in July. Registration to participate closed on October 31, but NIST is in the process of collecting data. If you registered but haven’t returned your data, please send it before the November 30 deadline!
  • Having noticed some discrepancies in the way some tools reported deleted or modified SQLite data, the NIST team is now expanding its existing mobile forensic tool testing specification. To ask about beta testing the prototype, email NIST.
  • The Computer Forensics Reference Dataset (CFReDs) has a new portal in beta. It will make technology, functionality, and scenario-based datasets easier to share and modify (subject to administrator approval). Datasets will be easier to search, too, with a new taxonomy tree, tagging, and a search bar to support large amounts of data. Currently it contains 160 entries. The CFReDS team seeks feedback, so check out the beta portal here.

This ongoing work is part of NIST’s goal to provide trustworthy, useful, and timely information that helps forensic examiners deal with both the volume and variety of material, including its constant rate of change. Additional presentations included:

  • Improvements to the Computer Forensic Tool Testing (CFTT) federated testing. As of v5.1, coming next year, a self-contained Windows 10 application will make mobile federated testing easier, and log files will be stored to USB or desktop.
  • String search testing via the Computer Forensic Tool Testing (CFTT) Project. Project leader Jim Lyle described testing that’s relevant to a lab’s work as well as what the user expects the tool to do. He also covered common challenges like ligatures, diacritics, formatted text search, and stressed that unexpected results are an opportunity to learn.
  • The National Software Reference Library (NSRL). Project leader Doug White covered how to customize the NSRL hashes / reference data sets — updated quarterly — to fit investigative needs by using metadata to identify software and its versions, and/or to create data subsets of notable software classes that individual investigators commonly see.
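Such customization usually amounts to hash-set filtering: files whose hashes appear in the reference set are known software and can be excluded from review. A toy sketch, where the KNOWN mapping is an illustrative stand-in for the real NSRL reference data:

```python
import hashlib

# Illustrative stand-in for an NSRL-style reference set: hash -> (product, version).
# Real reference data sets are large files distributed quarterly by NIST.
KNOWN = {
    # SHA-256 of the bytes b"foo", standing in for a known application file
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae": ("ExampleApp", "1.0"),
}


def sha256_bytes(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


def triage(files: dict) -> list:
    """Return only the files whose hashes are NOT in the known-software set,
    so examiners can skip operating system and application files."""
    return [name for name, data in files.items()
            if sha256_bytes(data) not in KNOWN]
```

Building subsets works the same way in reverse: filter the reference set on its metadata (product, version, software class) before comparing, so only the classes of software relevant to the investigation are matched.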

Virtual Hansken Community Day highlights DFaaS

In September’s digital forensics research roundup, we described Hansken, the Netherlands Forensic Institute’s “digital forensics as a service” (DFaaS) platform. In December, the NFI held its first Hansken Community Day. A series of webinars explained the project vision, collaborative aspect, forensic knowledge, extraction plug-ins, and training manuals.

This event saw more than 100 participants, representing 23 organizations from eight countries (the United States, Belgium, Australia, Spain, Germany, Norway, the UK, and the Netherlands), come together online to exchange their knowledge and experience, whether current or prospective Hansken users.

A Hansken Discord channel is open to encourage continued information sharing, and the next virtual Hansken Community Day is scheduled for March 24-25.

Forensic Capability Network / Transforming Forensics

Designed and developed under the Transforming Forensics programme of the United Kingdom’s National Police Chiefs’ Council (NPCC), the Forensic Capability Network (FCN) launched in summer 2020. It’s designed to provide operational support to front-line law enforcement in need of forensic evidence.

The FCN recently reported that since launching new Streamlined Forensic Reporting guidelines in July 2020, the documentation — “the first ever time SFR documentation was publicly available to download and hosted in one place” — has been accessed 4,800 times.

That metric is key considering SFR’s purpose: since 2012, it has enabled investigators and scientists to enter forensic evidence into the criminal justice system in such a way as to focus on “key conclusions that are simple for juries to understand, and which speed up cases by allowing the defence to quickly accept or challenge evidence.”

The documentation consists of both guidance and templates. FCN additionally reported that since SFR’s July launch, “technical readiness work… is taking place to assist forces in embedding the new forms into their case management systems.” That’s in advance of a new release planned for January, which will include updated digital guidance and reformatted forms.

Another FCN project, CSE Automate, “is developing opportunities to automate and enable remote viewing to speed up and simplify CSE workflow with a view to providing this as a service through the FCN…. based on the principle of doing things once for the benefit of many.”

That’s the result of a finding that child exploitation cases demand about 60 percent of overall digital forensics unit capacity, with an outcome of longer turnaround times for all digital forensics cases. Having invited Expressions of Interest (EOI) from forces that want to help design the automated CSE workflow, the FCN is currently working through all EOIs before implementing the next phase in the project.

On a different topic, FCN reported its development of “an agreed, consistent and national approach to handling legacy [digital forensics] data, producing operational guidance to interpret existing policy and legislation.”

That’s in response to an observation in the NPCC Digital Forensic Science Strategy (DFSS), which highlighted difficulties in determining digital forensics data retention requirements, resulting in a patchwork of “largely manual” processes. An ongoing consultation with policing and other strategic stakeholders is expected to result in new guidance sometime in 2021.

Forensic Focus is interested in covering more stories about the implementation of new technology and standards in different countries and regions across the globe. If you know of an initiative in your region that you think we should cover, please email christa@forensicfocus.com with more information!

Community In Collaboration: The Scientific Working Group On Digital Evidence

In September, Forensic Focus was invited to sit in on a meeting of the Scientific Working Group on Digital Evidence (SWGDE). The third and final meeting of 2020 was held virtually — just as the previous meeting, in June, had been — and brought together participants from all over the world.

What are the Scientific Working Groups? The Facial Identification Scientific Working Group (FISWG) sums it up this way: “Since the early 1990s, American and International forensic science laboratories and practitioners have collaborated in Scientific Working Groups (SWGs) to improve discipline practices and build consensus standards.”

The SWGs work independently of, though symbiotically with, the National Institute of Standards and Technology (NIST)’s Organization of Scientific Area Committees (OSAC) for Forensic Science, which is focused on “facilitating the development and promoting the use of high-quality, technically sound standards” to “address a lack of discipline-specific forensic science standards”. SWG documents are often relied on as references in standards development by OSAC and by standards development organizations (SDOs).

Since 2006, SWGDE has published nearly 100 documents covering a wide range of topics: quality assurance, data acquisition, and analysis for various types of digital media are all covered, along with two position papers — one on teleworking, and the other on the use of hash algorithms — and archived documents. At any given time, some drafts may also be open for public comment.

September’s meeting, said SWGDE Forensics Committee Chair Steve Watson, represented an “unusual place” for the organization: a completely clean slate. Normally, he explained, documents are carried over from meeting to meeting.

That’s why this article isn’t a “recap” like those we’ve done for other events. While it did feature a Forensic Committee special topic presentation — Nilay Mistry of the University of Gujarat, India presented on volatile memory forensics — its main focus is a highly collaborative process of document brainstorming, scoping, and drafting.

Looking ahead to 2021, readers who seek ways to give back to the community might consider bringing their expertise to SWGDE — as guests, members, or commenters on public drafts. Membership requires attending two consecutive meetings, which typically occur in January, April or May, and September. 2021’s first meeting is planned for January 11-14.

Document drafting: A collaborative process

Traditionally, SWGDE meetings are held in person. The reasoning is to ensure meetings are productive and free of distractions. Previous meetings have been held in cities around the United States; future ones will be held at the U.S. Secret Service’s National Computer Forensics Institute (NCFI) in Hoover (Alabama).

Productivity turned out to be more challenging in 2020’s all-virtual meetings, with not just colleagues but also family members in the mix. Additionally, the virtual format meant most people could only attend for about half a day — not the usual full day.

Still, participation was robust — as it is every meeting, said SWGDE member James Howe, a detective with the Columbus (Ohio) Police Department who’s been involved with the organization for about seven years. As with many events, SWGDE meetings have their “regulars” as well as their brand-new guests. 

To ensure an appropriate mix of perspectives, membership demographics are set by the group’s bylaws. The blend includes local, state, and federal law enforcement, as well as private-sector members and academic researchers. Watson said these are people who tend to hold senior positions in their organizations.

Attorneys and vendors also sometimes participate, especially, added Howe, if the topic is of particular interest to them. Katherine Hansen, a deputy district attorney and digital evidence specialist with Colorado’s Denver District Attorney’s Office, joined SWGDE in 2019. 

“One of the things that I can contribute as an attorney is knowing the requirements for getting evidence introduced in court,” said Hansen. “When they’re putting together the best practices for how to collect or process evidence, I can look at that and think, okay, we’re going to have to establish reliability and authenticity. Applying my legal skills helps maximize the chance that the evidence will be admissible and persuasive at trial.” 

On the meeting’s first day, members and guests gather to discuss items left outstanding from previous meetings, as well as to brainstorm topics for the current session. Once a consensus is reached on which three topics to focus on, participants split off into smaller groups.

Each group scopes and outlines its topic, using Google Docs to track participants’ comments. This ensures not just a visual record of what’s happening, but also a fully collaborative experience no matter where participants are in the world.

Howe said the collaboration process has changed over time and is more efficient for it. “When I first started, we would painstakingly go over every single line of every single paper as an entire group,” he explained. “Now, we break off into smaller working groups based on interest and expertise on a particular subject.”

One key element to a working collaboration: open-mindedness. “We may argue and fight over words and small nuances, [but] no one ever walks away mad or with hurt feelings,” Howe said. “It is a very thick-skinned group.”

Collaborations that remain unfinished are held over until the following meeting(s). “Depending on the level of paper we are doing, drafting the document usually takes two to three meetings if it is something new or a heavy lift from a previous document,” Howe explained.

Because documents are shared within Google Docs, it’s possible for members to continue to work on them in between meetings. The challenge, though, is continuity: busy people aren’t always available to review changes. That’s why, Howe says, most of the work is done at the dedicated meetings.

Once topics are ready for review, they’re presented to the larger group. Part of ensuring a document’s success is its “champion,” a person chosen to present completed drafts to the broader group, answer questions, and keep the team on track.

Then, the committee votes whether to release it for public comment. “Once it has passed that measure, it will be placed on our website for a minimum of 60 days to allow time for the public to weigh in,” Howe added.

Public comments are addressed and incorporated at the following meeting. At that point, the document is ready for public release.

Choosing — and scoping — the topics

Topics tend to be chosen based on their relevance: what members bring to the table, both from their own experience and on others’ behalf. Topics don’t have to be totally new; the committees often consider whether to update older or less detailed documents, or deprecate outdated ones.

Of the total number of brainstormed ideas, Watson estimates that about 30 percent end up becoming a document. In September 2020, for example, the topics that came up for review included:

  • An expansion of the existing vehicle forensics document with a broader set of data
  • An older document on peer-to-peer (P2P) software forensics
  • Geofencing
  • Drone forensics
  • Cryptocurrency
  • Digital evidence and investigative techniques, including open source intelligence (OSINT) collection
  • Website / social media collection
  • Virtual reality headsets (a topic one member reflected wasn’t being seen yet, but would be wise to “get ahead of”)
  • Industrial Control Systems (ICS) forensics
  • Mis- and false information

Both group feedback and group size limits mean that not every topic can be tackled in one meeting, so the topic list is narrowed down. Once a document is started, it’s followed through to the end, but the time this takes can vary. Watson said those documents that are limited by a dearth of industry expertise — or have the opinions of “too many cooks” — can take longer.

These aren’t necessarily setbacks, though. The group is driven by a need to deliver the right guidance to a broad range of practitioners. That can be affected by new research, case law, or legislation, any of which may be ongoing while a document is in progress or under review.

“The documents that we tend to write are things that have a longer shelf life to them,” said Watson. “We’re not sharing the latest tips and tricks that someone has identified, or just found a new way to break into a particular phone. It really is those overarching, foundational guidance documents.”

Topic in focus: geofencing

September’s topics — geofencing, drone forensics, and online collections — were just technical enough that Forensic Focus opted to “embed” in just one of the breakout groups — the geofence meeting — rather than dilute our attention across all three.

Geofencing is currently under review in U.S. courts, so the goal was to craft a document that could dispel myths and misunderstandings by explaining how Google collects geolocation data, as well as how the three-step search warrant process could prevent overbroad data collection.

This group consisted of Watson; Howe; Jim Cook, a cell site analysis expert; and Joseph Remy, an assistant prosecutor with the Burlington County (New Jersey) Prosecutor’s Office. All scoped the document on Day 1. They were joined on Day 2 by Hansen — the deputy DA from Denver — who brought two colleagues experienced in geofencing technology.

It was an example of a dynamic collaboration that sometimes includes last-minute guests. That can happen, said Howe, when participants realize the draft would benefit from additional perspective and know who can provide it. “[Having the right personnel in the room] has greatly increased the quality of the documents,” he said.

Watson said those perspectives are critical to informing practitioners who are often “in the weeds” of technology — and to getting documents in front of others who need them. “How do we educate the attorneys and the judges on a technology that we can barely keep up with the pace ourselves, because of how fast things are changing?” he questioned. “Frankly, [attorneys and judges are] the ones that are making the decisions that affect us all. So [they] should be a part of this process while we’re driving science forward.”

For example, the geofencing group identified one challenge early on: geofence request processes are provider-specific (and often focus on Google), but a “best practices” document would need to offer a generalized process that could remain useful enough to cover legal bases, without naming specific providers.

Hansen worked with her associates to bring to bear their collective experiences revising Denver’s geofence search warrant template, replacing some of the language with more precise verbiage.

“What should the parameters be? What are we looking for? How many target identifiers?” she explained. “The more target identifiers you have, the more likely you’re going to see a consistent device and that’s going to help your investigation.”

(Disclosure: Forensic Focus was asked for, and provided, our editorial expertise on grammar and style, but was not in a position to contribute technical guidance.)

Getting involved in SWGDE

Participants’ experience matters less than a willingness to come and work. Howe’s experience is a good example. In 2013, just a year after starting in digital forensics at his agency, he was online, looking for training opportunities, when he happened across the SWGDE website.

“I was still just pushing buttons at this point and praying the software would spit out data,” Howe said. Impressed by the SWGDE documents he read online, Howe decided to attend a meeting as a guest, even though he didn’t know anyone there. At that time, the forensic committee was working on the chip-off extraction of data from a credit card skimmer.

Both Hansen and Howe expressed feeling “floored” and “blown away” by the caliber of participants. “While the meetings are not considered ‘training,’ I have learned so much while attending all of the meetings and talking with the other members,” Howe said. 

Watson concurs. “You get to a place in your career that a lot of the sessions at the conferences are not as helpful as they are when you’re more junior in your career,” he explained. “What I found in SWGDE was this group of very highly skilled people in a variety of organizations, coming together, wrestling over these really hard topics. It challenged me in a way that many of the other professional development conferences just don’t anymore.”

Hansen agreed. Her contribution is to combine her 25 years as a prosecutor — including writing for appellate courts — with her digital forensics knowledge in reviewing the different SWGDE papers. “Even [when] I can’t provide any additional substantive comment, I can still read [them]. I’m then able to overlay those substantive comments with my legal knowledge to help improve cases.”

Another form of personal benefit: what Hansen calls “this pooling of resources and knowledge… these connections that allow you to reach out to someone and say, ‘Hey, I have this problem. I spent my available thoughts on this and it hasn’t been fruitful. Do you have any other suggestions?’”

“I have made connections that stretch across the globe, and friendships that I still carry on to this day,” said Howe, who added that he overcame both a case of impostor syndrome and a fear of flying to attend his second meeting, where he was voted in as a member. “In this group, your answer is always one or two calls away if you have a question or any issues. There are members I stay in contact with at least once a week on various things.”

Those connections are critical to digital forensics practice. “I would not be where I am now if I had decided not to get on that plane,” Howe said. “We touch on so many different subjects and disciplines, it can really light an interest in those.”

That’s partly the result of including professionals with deep specialist experience, such as video forensics or, in Watson’s case, extensive research on severely damaged devices.

More than personal benefit, though, is the chance to contribute to something that had, in Watson’s words, “a much broader impact than some of the other efforts that I would spend my time in.”

Howe has noticed SWGDE documents used as references in training classes, books, and agency policies; Watson says agencies around the world use the documents for guidance in developing their own individual processes and procedures.

That has implications for trials, said Hansen, returning to her observation about the need to build the best possible case. Best practices for collection, ensuring reliability, and authentication are all part of admitting evidence and making it persuasive, and those requirements can, in turn, be built into the documents themselves.

The geofence paper is a good example because of its potential to help judges understand how the process works in reality — how probable cause to obtain the geofence warrant might look very different from probable cause to obtain subscriber account information, and the steps it takes in between to move from one to the next. “It’s important that attorneys down the line can get the evidence admitted into court and have confidence that the evidence will stand up,” Hansen explained.

Howe’s advice to prospective members: “It is most definitely not a vacation from work if you take it seriously. I think the thing that always surprises me the most is the level of work that is put into a document during the week we are together in person. It is quite literally a brain drain,” he said.

But even those who are unable to participate at that level can get involved. “The public comment process is vital to the success of the document,” Howe added. “We love when people comment on the documents or ask us for clarification on something… if the comments are relevant and correct, we will discuss them as a group and implement them in the paper if needed.”

Changes To Forensic Laboratory Accreditation Requirements – ISO/IEC 17025

by Tim Alcock

ISO/IEC 17025:2017 – General requirements for the competence of testing and calibration laboratories is the principal international standard for the accreditation of laboratories performing testing (including sampling) and/or calibration. Originating from ISO/IEC Guide 25, the standard has been through several iterations culminating in the latest version released in November 2017.  

ISO/IEC 17025:2017 enables laboratories to demonstrate competent operation, validity and confidence in results.

This article provides a brief background to the application of ISO/IEC 17025 together with an overview of the new standard, highlighting the major changes.

Laboratory Accreditation in Forensics

Accreditation is the independent evaluation of conformity assessment bodies (in this case Forensic Laboratories working to ISO/IEC 17025 and crime scene investigation units working to a ‘sister’ standard ISO/IEC 17020).  Accreditation bodies are normally themselves peer assessed through an international system administered by the International Laboratory Accreditation Cooperation (ILAC).  Accreditation bodies are established in many countries with the primary purpose of ensuring that conformity assessment bodies are subject to oversight by an authoritative body.

Laboratory accreditation differs from ‘Quality Management Certification/Registration’ to standards such as ISO 9001 in that the assessment teams employ subject matter experts who directly evaluate the practical aspects and technical operation of the laboratory which increases the scientific rigor of the assessment.  Having said this, being accredited cannot guarantee that mistakes will not happen and there are well documented cases of errors and failures even in accredited facilities. It can however provide confidence that the laboratory operates an effective management system with rigorous requirements for ensuring competency, technical operation and reporting of results and be used as the basis for continuing improvement of the laboratory’s management systems.

Accreditation is generally voluntary, but in the UK, whilst not mandated in law, the Forensic Science Regulator has required that forensic units be accredited to relevant standards (i.e. ISO/IEC 17025, 17020 or ISO 15189 for clinical laboratories). Some states in the USA also require their laboratories to hold accreditation. As the standards mentioned above were not specifically written for forensics applications, supplemental documents have also been published (such as ILAC G-19) and, in the UK, the FSR Codes of Practice and Conduct and associated documents, as well as specific accreditation body requirements.

ISO/IEC 17025 Changes

The new standard, introduced in November 2017, contains the following major changes:

    • Re-structuring of clause numbers to be more ‘process-based’
    • Closer interaction with ISO 9001
    • Introduction of the need to perform an impartiality risk assessment
    • Assessment of risks/opportunities to the operation of the laboratory
    • Enhanced requirements relating to complaints and confidentiality
    • ‘Risk-based’ Internal audits
    • Management review – agenda additions
    • Enhanced requirements relating to reporting
    • Statements of conformity and ‘Decision Limits’
    • Data/ Information management.

These changes are discussed below.

Structure

The new standard has been completely re-formatted in terms of clause numbering as outlined in the schematic below.  This provides more logical sequencing of laboratory activities from review of customer requirements to issue of reports, together with associated support and resource provision, management aspects and audit, corrective action and improvement activities.

Integration with ISO 9001

ISO/IEC 17025:2017 has been developed to align with ISO 9001 (the most commonly applied Quality Management Systems standard).  There are common elements in both standards and, if a company is ISO 9001 certified, it will address these common elements already.  Without ISO 9001 certification, these additional requirements will need to be included (defined in Section 8 of the standard). Unfortunately, there are differences between these ‘common’ areas in ISO/IEC 17025 and ISO 9001, so laboratories who also operate ISO 9001 systems will need to review the detail of these aspects to ensure compliance.

‘Overlap’ between ISO/IEC 17025 and ISO 9001.

Management aspects

The standard requires that organisations have some form of documented management system which includes policies, objectives and procedures; however, it no longer specifies that the laboratory has a ‘quality manual’ as such.  Planned reviews are required to ensure the system’s continuing suitability and effectiveness. Clause 5, entitled “Structural requirements,” covers legal identity, management structure and organisation of laboratory and related support activities, including the scope of services offered.  Specific position titles relating to “technical and quality management” have been removed, although responsibilities for these functions are still required to be defined.

Enhanced requirements requiring assessment of risks to impartiality together with controls for mitigation have been introduced. Additional requirements relating to confidentiality are also added.

Process requirements

One improved aspect of the new standard is the logical structuring of laboratory process activities.  Core business activities are set out, commencing with the determination and review of customer requirements whether stated in a ‘Service Level Agreement’ or a direct request from an Investigating Officer, through method selection/development, sampling (if applicable), handling of the exhibits, records, uncertainty, quality control and reporting. Content of these clauses remains substantially the same, but has been simplified in wording. However, there are a number of detail changes (for example aspects relating to agreement to subcontract, customer requested deviations, additional requirements relating to sampling and content of final reports etc) which need to be considered when upgrading systems.

One significant introduction is the term “decision rule”, defined as a ‘rule that describes how measurement uncertainty is accounted for when stating conformity with a specified requirement’.  Laboratories who report “conformity with a specified requirement”, eg, stating that a product complies with a specification or otherwise, need to agree with the customer how measurement uncertainty is taken into account when reporting conformity and make this clear on reports.  This is critical where reported results are near specification limits and uncertainty could affect the pass/fail decision. This would be the case where reporting pass or fail criteria with legal limits, such as blood/alcohol limits or the length of a sawn-off shotgun.
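To make the idea concrete, a simple guard-banded decision rule can be sketched in a few lines of code. This is an illustrative sketch only: the blood-alcohol limit and expanded uncertainty values below are hypothetical, chosen to show the logic, and are not taken from ISO/IEC 17025 or any regulation.

```python
# Illustrative sketch of a "decision rule" in the ISO/IEC 17025:2017 sense:
# accounting for expanded measurement uncertainty when stating conformity
# with an upper legal limit. All numbers here are hypothetical.

def decide_conformity(measured: float, limit: float, expanded_uncertainty: float) -> str:
    """Classify a result against an upper limit using the measured value
    plus/minus its expanded uncertainty (coverage factor already applied)."""
    if measured - expanded_uncertainty > limit:
        return "fail"          # entire uncertainty interval lies above the limit
    if measured + expanded_uncertainty <= limit:
        return "pass"          # entire uncertainty interval lies at or below the limit
    return "inconclusive"      # interval straddles the limit; report with uncertainty

# Hypothetical example: an 80 mg/100 mL limit, expanded uncertainty of 4 mg/100 mL.
print(decide_conformity(87.0, 80.0, 4.0))  # fail
print(decide_conformity(74.0, 80.0, 4.0))  # pass
print(decide_conformity(82.0, 80.0, 4.0))  # inconclusive
```

The “inconclusive” branch is exactly the case the standard targets: a result near the limit where uncertainty could change the pass/fail decision, so the laboratory and customer must agree in advance how it is reported.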

Support activities

“Resource requirements” are addressed in the form of personnel training, competence, facilities/environment, equipment and consumables, metrological traceability, purchasing and subcontracting (the latter two clauses being renamed “control of externally provided services”). Again, additional requirements have been added relating to the content of procedures and review of adequacy of facilities and calibration intervals.  A new clause has been added enhancing requirements relating to information and data management. As with all quality management system standards, controls need to be applied to documents and records to ensure that personnel are up-to-date with test methods, protocols, procedures, legal requirements etc., and for the maintenance of records, whether these are case files, equipment, personnel, recorded data or anything else.

Quality management processes

Finally, “quality management” issues are addressed, namely, performance of internal audits, control of non-conforming work, an enhanced section on complaints, as well as soliciting customer feedback. The corrective action section deals with actions resulting from non-conformities, identification of root cause and action to eliminate cause. In line with ISO 9001, the “Preventive Action” clause has been renamed “Risks and Opportunities” – a major change, requiring the laboratory to identify risks that could affect results and objectives, and to identify and implement improvements on an ongoing basis.

Transition to the new standard

A three-year transition period is allowed for accredited laboratories, with accreditations to the old standard ceasing to be valid from 30 November 2020. UKAS in the UK has published transition guidelines on its website (ukas.com).

Summary

The changes represent a great improvement in terms of structure and clarity, with the possible exception of the clause “Control of external products and services” which, in the author’s opinion, is not as clearly set out as the “Subcontracting” and “Purchasing” clauses of the previous version. Key new aspects, such as the identification of risks and opportunities, will provide additional challenges for laboratories and assessors alike.

About The Author

Tim Alcock, CQP FCQI MASQ is an IRCA Registered Lead Auditor and Managing Director of Qualimetric Ltd. He has 30 years’ experience in the application of quality management systems, specialising in Laboratory and Inspection Body accreditation.   

2010 report of digital forensic standards, processes and accuracy measurement

Joshua Isaac James, Pavel Gladyshev

{Joshua.James, Pavel.Gladyshev}@UCD.ie

Centre for Cybercrime Investigation
University College Dublin
Belfield, Dublin 4
Ireland

1. Introduction

From December 7th 2010 to December 12th 2010 a survey on Digital Investigation Process and Accuracy was conducted in an attempt to determine the current state of digital investigations, the process of examination (examination phases), and how those examinations are being verified as accurate. An online survey was created in English using SurveyMonkey.com (2010), and consisted of 10 questions. Two groups were solicited: a control group from the University College Dublin (UCD) Forensic Computing and Cybercrime Investigation (FCCCI) MSc Programme (2010), and members of the Forensic Focus (FF) (2010) online community. The control group consisted of known digital forensic investigators, of which four replies were received. The second group consisted of anonymous replies from the Forensic Focus online community. Forensic Focus is a publicly accessible online forum and information site on the topic of computer forensics that primarily uses the English language. 28 replies were received from this community, making 32 replies in total. The average responses from the control group were consistent with the average responses from the Forensic Focus community. For the analysis in this paper, all responses will be considered together. The collected survey data can be found in appendix A.

2. Survey Analysis

To determine if trends were sector or region specific, the following questions were asked:

Question 1 was to identify the associated work sector of the respondents.

· Which of the following best describes your organization?

78.1% of respondents claimed to be Law Enforcement, 12.5% claimed to be with a corporate entity, and 9.4% claimed to be contractors. No other sectors were specified, as seen in figure 1.

Fig. 1. Distribution of respondents by work sector

Question 2 was to identify the region where the respondents were located.

· What general region best describes your organization’s location?

68.8% of respondents claimed to be from Europe, 21.9% claimed to be from North or South America, 6.3% claimed to be from Asia and South Pacific (ASP), and 3.1% claimed to be from the Middle East and North Africa (MENA) (fig. 2). This distribution is comparable to the FF ‘members map’, with slightly less representation from North and South America. Also, FCCCI has slightly more members from Europe than from other regions, which could account for some of the over-representation of Europe. Given that the survey was in English and was posted to a limited set of English-speaking sources, there is an inherent bias towards English-speaking regions. For this reason, any conclusions should be generalized as more relevant to regions where English is the preferred working language, e.g. Europe, North America and Australia, rather than as a truly global view.

Fig. 2. Distribution of respondents by region

The next questions were created to determine an average caseload, and how well departments are keeping up with the workload.

Question 3 was to approximate the number of investigations per month.

· Approximately how many digital investigations does your department conduct per month?

43.8% of respondents claimed that 21 or more cases were being conducted per month, 34.4% claimed between 1-10 cases per month, and 21.9% claimed 11-20 cases per month (fig. 3). Each respondent claimed his or her department investigated at least one case per month. The range of answer choices for this question was also found to be too narrow, resulting in a loss of specificity above 21 cases.

Fig. 3. Approximate number of digital investigations per month conducted per department

Question 4 was to approximate whether each case investigated involved a suspect device such as a computer or cell phone.

· Approximately how many digital investigations per month involve examining a suspect device (computer, cell phone, etc)?

37.5% claimed that 21 or more cases per month involved a suspect device; another 37.5% claimed only 1-10 cases involved a suspect device; 25% claimed 11-20 cases per month (fig. 4). When compared to question 3, fewer cases involve an analyzed suspect device than the total number of cases conducted, but suspect devices are still analyzed the majority of the time. This question does not consider the number of devices per case.

Fig. 4. Approximate cases per month involving a suspect device

Question 5 was to determine the current length of case backlog the respondents were experiencing.

· How long is the current backlog of cases for your organization?

37.5% responded with 1 month to 6 months, 34.4% responded 0 to 1 month, 18.8% responded 6 months to 1 year, 6.3% responded with 2 years to 3 years, and 3.1% responded with 3 years or more (fig. 5). From these results, 71.9% of respondents have a case backlog between 0 and 6 months. The overall (very approximate) mean is ~9 months. These reported times are similar to those reported in 2004 (EURIM-ippr 2004), and are as much as 1.25 years shorter than those recently reported in the US and UK (InfoSecurity 2009; Gogolin 2010; Raasch 2010). While there were responses over 2 years, there were no responses between 1 year and 2 years.

Fig. 5. Approximate distribution of case backlog times
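The “very approximate” mean above can be reproduced from the bucketed responses. Because the survey only reports ranges, a point value must be assumed for each bucket; weighting each bucket’s nominal upper bound by its response share (an assumption on our part; the original survey does not state its method, and the cap for “3 years or more” is arbitrary) gives a figure of roughly 9 months:

```python
# Sketch: approximating the mean backlog from bucketed survey responses.
# The per-bucket month values are assumed upper bounds, not survey data.
buckets = [
    (0.344, 1),   # "0 to 1 month"        -> 1 month
    (0.375, 6),   # "1 month to 6 months" -> 6 months
    (0.188, 12),  # "6 months to 1 year"  -> 12 months
    (0.063, 36),  # "2 years to 3 years"  -> 36 months
    (0.031, 48),  # "3 years or more"     -> 48 months (nominal cap)
]

mean_months = sum(share * months for share, months in buckets)
print(round(mean_months, 1))  # ~8.6 months, i.e. roughly 9
```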

The following questions were to determine common standards and guidelines, as well as analysis techniques used.

Question 6 was to determine what standards or guidelines are commonly used in the digital forensic process. The respondent could have chosen multiple answers.

· What are the main standards or guidelines your organization uses to ensure quality forensic processes?

58.1% claimed their standard operating procedure (SOP) was developed in-house, 35.5% claimed to use standards from the National Institute of Standards and Technology (NIST), 29% claimed to use standards from the Association of Chief Police Officers (ACPO), 19.4% claimed to use standards from the International Organization for Standardization (ISO), 9.7% claimed to use other standards, such as European Cybercrime Training and Education Group (ECTEG) operating procedures, and guidelines from the Nederlands Forensisch Instituut [Netherlands Forensic Institute] (NFI), and 6.5% used standards from the International Association of Computer Investigative Specialists (IACIS). One respondent declined to answer, making the total respondents 31. The data suggests that most departments are developing their own SOP, but are also implementing other standards to supplement it. NIST (US based) and ACPO (UK based) were the most popular after in-house developed SOPs.

Fig. 6. Standards commonly used in digital investigation processes

Question 7 was to determine if any preliminary analysis is done before an in-depth analysis.

· Is a preliminary analysis (preview) of a suspect computer conducted before an in-depth analysis?

40.6% of respondents claimed that a preliminary analysis was done sometimes (49% or less), 28.1% claimed a preliminary analysis was done most of the time (50% or more), 21.9% claimed a preliminary analysis was never done, and 9.4% always conduct a preliminary analysis (fig. 7). The data suggests that choosing to conduct a preliminary analysis very much depends on the case, which is consistent with claims by Casey, Ferraro et al. (2009) that “case management… involves tailoring forensic examination of digital evidence to the type of crime or case under investigation”. In some cases, such as a murder investigation, a preliminary analysis may not be conducted. This could account for why the majority of responses were ‘sometimes’ instead of ‘most of the time’. This question did not consider the types of cases being investigated or departments the respondents were involved with.

Fig. 7. Graph of how often a preliminary analysis is conducted before an in-depth analysis

Question 8 was to determine how often on scene triage or preview techniques are used.

· Does your department use on scene triage or preview techniques?

41.9% of respondents claimed on scene preview or triage was conducted sometimes (49% or less), 29% claimed preview was never conducted, 19.4% claimed preview was done whenever possible, and 9.7% claimed on scene preview was done most of the time (50% or more) (fig. 8). One respondent declined to answer, making the total 31 respondents. The responses are similar to question 7, with most respondents either attempting on scene preview occasionally or not previewing at all.

Fig. 8. Graph of how often on scene preview or triage is conducted

Question 9 was to determine how often live forensic techniques are used.

· Does your organization use live forensic techniques?

37.5% of respondents claimed they use live forensics when possible, 34.4% use live forensics sometimes (49% or less), 18.8% claimed to never use live forensics, and 9.4% claimed to use live forensics most of the time (50% or more) (fig. 9). Compared to question 8, live forensics is used more often than on scene preview or triage, but still not in every possible case.

Fig. 9. Graph of how often live forensics techniques are used

Question 10 was intended to determine the interpretation of ‘accuracy’ in digital examinations. It was expected that each respondent would have a different interpretation of what it means for an examination to be accurate. The word ‘calculated’ was chosen to determine whether their definition of accuracy is objectively measured.

· Is the accuracy of digital examinations calculated? If yes, please briefly specify the technique(s) used.

76.7% of respondents claimed that the accuracy of digital examinations is not calculated, while 23.3% claimed accuracy was calculated (fig. 10). Two respondents declined to answer, making the total 30 respondents. The majority of respondents who replied yes (4 out of 7) claimed that accuracy was verified by peer review, followed by hashing the evidence (2 out of 7); see appendix A for more information. From the data, the accuracy of a digital examination is most commonly interpreted as verification of the findings by review. No respondent mentioned ‘calculation’ in the sense of a formula for objective measurement.

Fig. 10. The percentage of respondents claiming to calculate accuracy of digital examinations

3. General Analysis

· Of the respondents, North and South America reported the largest number of investigations, with 57% of respondents claiming 21 or more investigations per month, compared with 50% of respondents in Asia and South Pacific (ASP) and 41% of respondents in Europe.

· All corporate and contractor respondents reported backlogs of 0 to 1 month regardless of region and caseload.

· European Law Enforcement (LE) reported the longest backlogs, with 18% claiming backlogs of over 1 year. The next longest was ASP LE, with 50% of respondents claiming backlogs of 6 months to 1 year.

· 100% of ASP LE claimed backlogs of 1 month to 1 year; 100% of North/South American LE and 100% of MENA LE claimed backlogs of 1 month to 6 months; and 88% of European LE claimed backlogs of 1 month or more.

· 100% of LE respondents who claimed case backlogs greater than 2 years selected only NIST as their guidelines. Conversely, 100% of corporate respondents who selected only NIST claimed case backlogs of 0 to 1 month.

· 80% of LE respondents whose SOP was developed in-house with no additional standards reported case backlogs of 1 month to 6 months.

· LE with longer backlogs were more likely to conduct a preliminary analysis before an in-depth analysis.

· 71% of corporate and contractor respondents conducted a preliminary analysis more than half the time, and they were the only respondents to report always (42%) conducting a preliminary analysis.

· The length of the case backlog was a better indicator of whether a preliminary analysis would be done than the number of cases per month.

· Unlike preliminary analysis, respondents with longer backlogs were not more likely to use on scene preview or triage.

· 83% of respondents who used on scene preview or triage whenever possible claimed a backlog of 0 to 1 month, compared to only 11% of respondents who never used on scene preview.

· The use of live forensic techniques does not appear to correlate to the number of cases per month or case backlog.
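Observations like these come from pairing each respondent's answers across questions. A minimal sketch of that cross-tabulation, using a hypothetical subset of (backlog, on-scene preview) pairs rather than the full appendix data:

```python
from collections import Counter

# Illustrative subset of (backlog, on-scene preview) answer pairs,
# not the full survey data from the appendix.
responses = [
    ("0 to 1 month", "Whenever possible"),
    ("0 to 1 month", "Whenever possible"),
    ("1 month to 6 months", "Never"),
    ("6 months to 1 year", "Sometimes"),
]

# Tally each (backlog, preview) combination.
crosstab = Counter(responses)

# Share of 'Whenever possible' respondents reporting the shortest backlog.
whenever = [backlog for backlog, preview in responses if preview == "Whenever possible"]
share_short = whenever.count("0 to 1 month") / len(whenever)
```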

4. Conclusions

The data provided gives insight into the current state of digital investigations. Organizations are using many different standards, and most are developing their own SOPs. Whether or not this works well for an individual department, it indicates a considerable amount of duplicated effort worldwide. Concerning case backlogs, overall backlog times appear not to be as long as previously reported. The use of on scene preview or triage appears to have a positive effect on case backlog, but only 18% of those who use on scene preview or triage – and 16% who use preliminary analysis – claimed to attempt to calculate the accuracy of investigations (via peer review, hashing, etc.; see question 10). Some improvements can still be made to current digital investigation methods; however, further research is needed on the types of cases being investigated and the resources available. Comparing responses from corporate and law enforcement respondents, and considering their traditional differences in budget and case types, those differences appear to be factors in both backlog and the use of preliminary analysis.

Acknowledgments

Thank you to the UCD Forensic Computing and Cybercrime Investigation group as well as the Forensic Focus community for your participation.

References

Casey, E., M. Ferraro, et al. (2009). “Investigation Delayed Is Justice Denied: Proposals for Expediting Forensic Examinations of Digital Evidence.” Journal of Forensic Sciences 54(6): 1353-1364.

EURIM-ippr. (2004). “EURIM – IPPR E-Crime Study: Partnership Policing for the Information Society.” Third Discussion Paper. Retrieved 23 Nov., 2010, from http://www.eurim.org/consult/e-crime/may_04/ECS_DP3_Skills_040505_web.htm.

ForensicFocus. (2010). “Forensic Focus: Computer Forensic News, Information and Community.” Retrieved 12 Dec., 2010, from http://www.forensicfocus.com.

Gogolin, G. (2010). “The Digital Crime Tsunami.” Digital Investigation 7(1-2): 3-8.

InfoSecurity. (2009, 08 July). “Digital forensics in a smarter and quicker way?” Info Security. Retrieved 25 Sept., 2010, from http://www.infosecurity-magazine.com/view/2473/digital-forensics-in-a-smarter-and-quicker-way.

Raasch, J. (2010, 12 July). “Child porn prosecutions delayed by backlog of cases.” Retrieved 25 Sept., 2010, from http://www.easterniowanewsnow.com/2010/07/12/child-porn-prosecutions-delayed-by-backlog-of-cases/.

SurveyMonkey. (2010). “SurveyMonkey.” Retrieved 12 Dec., 2010, from http://surveymonkey.com.

UCDCCI. (2010). “Forensic Computing and Cybercrime Investigation (FCCCI) MSc Programme.” Retrieved 12 Dec., 2010, from http://cci.ucd.ie/content/online-forensic-computing-and-cybercrime-investigation-msc-programme.

Appendix A

Each row below is one respondent. Questions 1–5: 1. Which of the following best describes your organization? 2. What general region best describes your organization's location? 3. Approximately how many digital investigations does your department conduct per month? 4. Approximately how many digital investigations per month involve examining a suspect device (computer, cell phone, etc.)? 5. How long is the current backlog of cases for your organization?

Organization (Q1) | Region (Q2) | Investigations/month (Q3) | Device exams/month (Q4) | Backlog (Q5)
Contractor | Europe | 21 or more | 21 or more | 0 to 1 month
Law Enforcement | Europe | 21 or more | 21 or more | 1 month to 6 months
Law Enforcement | Europe | 21 or more | 21 or more | 6 months to 1 year
Law Enforcement | North/South America | 21 or more | 21 or more | 1 month to 6 months
Law Enforcement | Europe | 11-20 | 11-20 | 1 month to 6 months
Contractor | North/South America | 1-10 | 1-10 | 0 to 1 month
Law Enforcement | Europe | 21 or more | 21 or more | 6 months to 1 year
Corporate | North/South America | 11-20 | 11-20 | 0 to 1 month
Law Enforcement | Europe | 21 or more | 21 or more | 3 years or more
Law Enforcement | Europe | 1-10 | 1-10 | 2 years to 3 years
Law Enforcement | North/South America | 21 or more | 21 or more | 1 month to 6 months
Law Enforcement | Europe | 1-10 | 1-10 | 1 month to 6 months
Law Enforcement | Europe | 1-10 | 1-10 | 1 month to 6 months
Law Enforcement | Asia and South Pacific | 21 or more | 21 or more | 6 months to 1 year
Law Enforcement | Europe | 1-10 | 1-10 | 1 month to 6 months
Law Enforcement | Middle East and North Africa | 1-10 | 1-10 | 1 month to 6 months
Law Enforcement | Asia and South Pacific | 11-20 | 11-20 | 1 month to 6 months
Law Enforcement | Europe | 11-20 | 11-20 | 6 months to 1 year
Law Enforcement | Europe | 21 or more | 21 or more | 0 to 1 month
Law Enforcement | Europe | 1-10 | 1-10 | 0 to 1 month
Corporate | North/South America | 21 or more | 21 or more | 0 to 1 month
Corporate | North/South America | 1-10 | 1-10 | 0 to 1 month
Corporate | Europe | 11-20 | 11-20 | 0 to 1 month
Law Enforcement | Europe | 21 or more | 1-10 | 6 months to 1 year
Law Enforcement | Europe | 1-10 | 1-10 | 0 to 1 month
Law Enforcement | Europe | 21 or more | 21 or more | 6 months to 1 year
Law Enforcement | Europe | 11-20 | 11-20 | 1 month to 6 months
Law Enforcement | Europe | 1-10 | 11-20 | 2 years to 3 years
Law Enforcement | Europe | 21 or more | 21 or more | 0 to 1 month
Law Enforcement | Europe | 1-10 | 1-10 | 1 month to 6 months
Contractor | Europe | 11-20 | 1-10 | 0 to 1 month
Law Enforcement | North/South America | 21 or more | 11-20 | 1 month to 6 months

 

Questions 6–10: 6. What are the main standards or guidelines your organization uses to ensure quality forensic processes? 7. Is a preliminary analysis (preview) of a suspect computer conducted before an in-depth analysis? 8. Does your department use on scene triage or preview techniques? 9. Does your organization use live forensic techniques? 10. Is the accuracy of digital examinations calculated? If yes, please briefly specify the technique(s) used.

(For Q7–Q9: Sometimes = 49% or less; Most of the time = 50% or more.)

Standards (Q6) | Preliminary analysis (Q7) | On scene triage (Q8) | Live forensics (Q9) | Accuracy calculated (Q10)
ISO | Sometimes | Sometimes | When possible | Yes – by peer review (technical), then by QA process (non-technical)
ACPO | Sometimes | Sometimes | Sometimes | No
ISO | Most of the time | Most of the time | Sometimes | No
NIST, In-house | Sometimes | Sometimes | Sometimes | No
In-house | Sometimes | Never | When possible | No
In-house | Always | Sometimes | Never | No
NIST, ISO, In-house | Most of the time | Sometimes | Sometimes | No
In-house | Always | Whenever possible | Most of the time | No
NIST | Sometimes | Sometimes | When possible | No
NIST | Most of the time | Sometimes | When possible | No
In-house | Never | Never | Sometimes | No
In-house | Never | Never | Sometimes | No
NIST, ISO | Never | Never | Never | Yes – two technical reviews by other experts in the domain
In-house + additional standards | Sometimes | Sometimes | Sometimes | Yes – work is vetted by a supervisor before results are released to investigators
IACIS, In-house | Sometimes | Never | When possible | No
N/A | Sometimes | Sometimes | Sometimes | N/A
IACIS, ACPO | Sometimes | Sometimes | When possible | Yes – images hashed after acquisition
ACPO, In-house | Most of the time | Most of the time | Never | Yes – “don’t get the question”
ACPO | Never | N/A | Sometimes | No
ISO, In-house | Never | Sometimes | Most of the time | No
NIST | Always | Whenever possible | When possible | No
NIST | Most of the time | Sometimes | Never | Yes – MD5 hash values calculated of evidence and image
NIST, ACPO, In-house | Sometimes | Whenever possible | When possible | No
ACPO, In-house | Sometimes | Never | Sometimes | No
NFI, In-house | Most of the time | Whenever possible | When possible | N/A
In-house | Sometimes | Sometimes | When possible | No
In-house | Most of the time | Whenever possible | When possible | No
NIST | Most of the time | Most of the time | When possible | No
ACPO, ECTEG SOP | Never | Never | Never | No
NIST, ISO | Never | Never | Never | Yes – two technical verifications of all the analysis
ACPO, In-house | Most of the time | Whenever possible | Most of the time | No
NIST, ACPO, In-house | Sometimes | Never | Sometimes | No