2010 report of digital forensic standards, processes and accuracy measurement
Centre for Cybercrime Investigation
University College Dublin
Belfield, Dublin 4
From December 7th 2010 to December 12th 2010, a survey on Digital Investigation Process and Accuracy was conducted in an attempt to determine the current state of digital investigations, the process of examination (examination phases), and how those examinations are being verified as accurate. An online survey was created in English using SurveyMonkey.com (2010) and consisted of 10 questions. Two groups were solicited: a control group from the University College Dublin (UCD) Forensic Computing and Cybercrime Investigation (FCCCI) MSc Programme (2010), and members of the Forensic Focus (FF) (2010) online community. The control group consisted of known digital forensic investigators, from whom four replies were received. The second group consisted of anonymous replies from the Forensic Focus online community. Forensic Focus is a publicly accessible online forum and information site on the topic of computer forensics that primarily uses the English language. 28 replies were received from this community, making 32 replies in total. The average responses from the control group were consistent with the average responses from the Forensic Focus community, so for the analysis in this paper all responses are considered together. The collected survey data can be found in appendix A.
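The pooling of the two groups and the tallying of categorical answers can be sketched as follows. This is a minimal illustration only: the per-group split of answer categories shown here is an assumption for demonstration (only the 4 + 28 = 32 group sizes and the overall percentages are taken from the survey), not the actual per-group survey data.

```python
from collections import Counter

# Hypothetical per-group answers for one categorical question.
# Group sizes (4 control, 28 Forensic Focus) match the survey;
# the split of categories within each group is assumed.
control = ["Law Enforcement"] * 3 + ["Corporate"]
forensic_focus = (["Law Enforcement"] * 22 + ["Corporate"] * 3
                  + ["Contractor"] * 3)

# Pool both groups, as done in the analysis, then tally.
pooled = control + forensic_focus
counts = Counter(pooled)
total = len(pooled)

# Percentages rounded to one decimal place, as reported.
percentages = {k: round(100 * v / total, 1) for k, v in counts.items()}
print(total)        # 32
print(percentages)  # e.g. Law Enforcement: 78.1, Corporate: 12.5, Contractor: 9.4
```

With these assumed counts the pooled percentages reproduce the figures reported for question 1 (78.1%, 12.5%, 9.4%), which follow from 25, 4, and 3 respondents out of 32.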
2. Survey Analysis
To determine if trends were sector or region specific, the following questions were asked:
Question 1 was to identify the associated work sector of the respondents.
· Which of the following best describes your organization?
78.1% of respondents claimed to be Law Enforcement, 12.5% claimed to be with a corporate entity, and 9.4% claimed to be contractors. No other sectors were specified, as seen in figure 1.
Fig. 1. Distribution of respondents by work sector
Question 2 was to identify the region where the respondents were located.
· What general region best describes your organization’s location?
68.8% of respondents claimed to be from Europe, 21.9% claimed to be from North or South America, 6.3% claimed to be from Asia and South Pacific (ASP), and 3.1% claimed to be from the Middle East and North Africa (MENA) (fig. 2). This distribution is comparable to the FF ‘members map’, with slightly lower representation from North and South America. Also, FCCCI has slightly more members from Europe than from other regions, which could account for some of the over-representation of Europe. Given that the survey was in English and was posted to a limited number of English-speaking sources, there is an inherent bias towards English-speaking regions. For this reason, any conclusions should be generalized as more relevant to regions where English is the preferred working language, e.g. Europe, North America and Australia, rather than as a truly global view.
Fig. 2. Distribution of respondents by region
The next questions were created to determine an average caseload, and how well departments are keeping up with the workload.
Question 3 was to approximate the number of investigations per month.
· Approximately how many digital investigations does your department conduct per month?
43.8% of respondents claimed that 21 or more cases were being conducted per month, 34.4% claimed between 1-10 cases per month, and 21.9% claimed 11-20 cases per month (fig. 3). Each respondent claimed his or her department investigated at least one case per month. The upper bound of the answer ranges for this question was also found to be too low, resulting in a loss of specificity above 21 cases.
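Since the reported percentages are rounded to one decimal place, the underlying respondent counts can be recovered by rounding back against the total of 32. A small sketch of that arithmetic check (the band labels mirror the question's answer ranges):

```python
# Recover approximate respondent counts from the reported
# percentages for question 3, given 32 respondents in total.
total = 32
reported = {"21 or more": 43.8, "1-10": 34.4, "11-20": 21.9}

# round() compensates for the one-decimal rounding in the
# published figures.
counts = {band: round(total * pct / 100) for band, pct in reported.items()}
print(counts)                 # {'21 or more': 14, '1-10': 11, '11-20': 7}
print(sum(counts.values()))   # 32
```

The recovered counts (14, 11, and 7) sum back to the 32 total replies, which is a quick consistency check on the reported percentages.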
Fig. 3. Approximate number of digital investigations per month conducted per department
Question 4 was to approximate whether each case investigated involved a suspect device such as a computer or cell phone.
· Approximately how many digital investigations per month involve examining a suspect device (computer, cell phone, etc)?
37.5% claimed that 21 or more cases per month involved a suspect device; another 37.5% claimed only 1-10 cases involved a suspect device; 25% claimed 11-20 cases per month (fig. 4). Compared with question 3, fewer cases involve the analysis of a suspect device than the total number of cases conducted, but suspect devices are still analyzed in the majority of cases. This question does not consider the number of devices per case.
Fig. 4. Approximate cases per month involving a suspect device