Digital Forensics Capability Review

8 Posts
3 Users
0 Likes
279 Views
(@fcarlysle)
Posts: 4
New Member
Topic starter
 

I am looking for some help from the Forensic Focus community as part of a project for the Forensic Science Special Interest Group. In 2013 the SIG produced a capability review of digital forensics and now I am looking to update it as I am sure that the needs and technology will have moved on over the past two years.

I was hoping to get some feedback on the existing capability review to identify those areas that need to be updated. The capability review can be viewed at the following link: http://www.researchgate.net/publication/269332581_Digital_Forensics__Capability_Review

Please bear in mind that I am not a technical specialist so please use layman's terms in any comments at this stage. Happy to answer any further questions on the review and SIG activities.

Thanks for your help

 
Posted : 29/07/2015 6:53 pm
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

The capability review can be accessed at the following link https://connect.innovateuk.org/documents/3144739/3824737/FoSci+SIG+Digital+Forensics+2013+Final/6065d12a-97b0-494a-a633-7a115e003e31

Please define "can be accessed"; all I get on that link is a redirect to a login page.

You mean the document that is listed here:
https://connect.innovateuk.org/web/forensics

but that, just like your link, redirects to the login page.

It should be the same one that can actually be accessed here:
http://www.researchgate.net/publication/269332581_Digital_Forensics__Capability_Review
and downloaded from:
http://www.researchgate.net/profile/Angus_Marshall/publication/269332581_Digital_Forensics__Capability_Review/links/5486ddfb0cf268d28f0509f5.pdf?inViewer=true&disableCoverPage=true&origin=publication_detail

jaclaz

 
Posted : 29/07/2015 10:19 pm
(@fcarlysle)
Posts: 4
New Member
Topic starter
 

Thanks for pointing that out - it seems to be an issue with the privacy settings that I had thought had been resolved a while back. I have now updated the link using the information you provided.

 
Posted : 30/07/2015 1:12 pm
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

Thanks for pointing that out - it seems to be an issue with the privacy settings that I had thought had been resolved a while back. I have now updated the link using the information you provided.

Good. 🙂

But now, what is the plan?

I mean, that document was made from the results of a poll/questionnaire, intermingled with the summing up of a workshop or some other conference/meeting.

Having had a quick glance at it, it seems to me rather "generic" (no offence intended; I do understand how the scope is/was very wide) and "timeless", in the sense that I don't think anything has been done in the two years since its publication to solve any of the main problems/issues found relevant, which thus remain exactly as they were listed. Maybe today a new poll would change the relative positions in the perceived priority list, but I doubt much more than that.

jaclaz

 
Posted : 30/07/2015 2:35 pm
(@fcarlysle)
Posts: 4
New Member
Topic starter
 

What is next is an excellent question!

The plan is to try to produce a newer version as a resource for the community that has sprung up around the SIG, as we want to ensure that we are not handing out out-of-date information. But if it turns out that the challenges/priorities are still the same, then that saves me a bunch of work.

Longer term, it would be great if we could use the review to drive some discussion and, hopefully, some outputs towards overcoming the issues discussed in the report. As a group, the SIG is good at the stimulating-discussion part, but at the moment we are still working on getting some outputs.

You mention that the review is rather "generic" (no offence taken; I would probably agree with you, due to the scope of the topic) - would you see a benefit in trying to look at some more specific areas in more detail, or would that just be creating more things that people read but didn't act upon?

 
Posted : 30/07/2015 2:55 pm
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

The plan is to try to produce a newer version as a resource for the community that has sprung up around the SIG, as we want to ensure that we are not handing out out-of-date information. But if it turns out that the challenges/priorities are still the same, then that saves me a bunch of work.

Well, if there is consensus that there is a need to update the document, that would mean that the document itself is to be considered invalid; this would create a nice Catch-22 😯
Page 24

It was also noted again that participants chose not to differentiate between challenges arising in the immediate future (1-2 years) and those that may arise in the medium-term (2-5 years). This could indicate that whatever poses a problem now will likely remain a problem for the foreseeable future.

Longer term, it would be great if we could use the review to drive some discussion and, hopefully, some outputs towards overcoming the issues discussed in the report. As a group, the SIG is good at the stimulating-discussion part, but at the moment we are still working on getting some outputs.

There is very little to discuss; the overall findings are not only logical but, IMHO, a good representation of the status quo.

You mention that the review is rather "generic" (no offence taken; I would probably agree with you, due to the scope of the topic) - would you see a benefit in trying to look at some more specific areas in more detail, or would that just be creating more things that people read but didn't act upon?

The difficult part when making this kind of "official" document, one that "touches" several different spheres (Law/public servants, private practitioners, commercial software houses, education/universities) and gathers their opinions (each one obviously focused on their own *needs* or *experience* or *wishes* or *expectations*, and most probably highly contrasting), is to sum them up with assigned priorities (as the list is obviously very wide, probably wider than reality) and then do (WHO will?) something about the most relevant items.

That is, if the final scope is practical as opposed to theoretical/research/whatever.

Screening/filtering the results is difficult, and most probably it cannot really be done (at least not in a "politically correct" way).

I mean - as an example - reading the answers to

Question 8: What other innovations, relating to technology, services or any other issues affecting digital forensics do you think would be useful?

My instinct would be to find and meet the people who answered it with

• Pure digital forensics research (as opposed to applied).
• Quantum computing.
• Nano-technology.

to be able to (metaphorically) kick them in the a*s, hard. 😉

But apart from this extreme example, it seems to me that the contents of the document are accurate and more or less reflect the personal (often frustrating) experiences that every member of Forensic Focus had, has, or is having in his/her narrower specific field or role. Re-focusing on each of the listed items will only produce a number of more detailed papers, all containing more or less the same two key points: whining and blaming.

Whining:
things are not as good as they could be; someone should do something about it

Blaming:
the Government
more funds should be given for this (or that)

the Lawmakers
we need a new Law allowing us to do this (or that)

the industry
we need a new standard for this (or that), and they change things too fast

the academics
they are too theoretical; they teach students outdated info

etc.

jaclaz

 
Posted : 30/07/2015 3:57 pm
(@athulin)
Posts: 1156
Noble Member
 

I was hoping to get some feedback on the existing capability review to identify those areas that need to be updated.

Just some ideas that occurred to me as I was reading …

* Tool testing and validation. So many problems in the document are 'solved' by 'new and better tools', that the question of acceptance testing of those tools (at the very least, in a number of key areas) seems inescapable.

At the very least, those key areas need to be identified. File systems is almost certainly one of them. Document containers and emails are likely to be two others.

* Result validation is closely related. 'We know these results were valid for Windows XP – are they still valid for Windows 10?' Does anyone test? Really? – i.e. have the tests been published? (I imagine forensic pathology may have it easier – there's no version 10 coming out anytime soon in that particular field.)

One thing I believe is needed (though not necessarily wanted) is test data for the purpose of such testing.
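To make the test-data idea concrete, here is a minimal sketch of what acceptance testing against a reference dataset might look like (the function names and the manifest format are invented for this illustration, not taken from any existing tool): a known-good reference set is hashed once to produce a manifest, and a tool's extraction output is then checked against that manifest.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large evidence files need not fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(reference_dir: Path) -> dict:
    """Record the known-good digest of every file in the reference test set.
    This is done once, when the reference data is created."""
    return {p.name: sha256_of(p)
            for p in sorted(reference_dir.iterdir()) if p.is_file()}

def validate_extraction(extracted_dir: Path, manifest: dict) -> list:
    """Compare a tool's extracted files against the manifest.
    Returns a list of discrepancies (missing files or digest mismatches);
    an empty list means the tool reproduced the reference set exactly."""
    problems = []
    for name, expected in manifest.items():
        candidate = extracted_dir / name
        if not candidate.is_file():
            problems.append(f"missing: {name}")
        elif sha256_of(candidate) != expected:
            problems.append(f"digest mismatch: {name}")
    return problems
```

The same pattern could be re-run against each new OS or tool version: if `validate_extraction` starts reporting discrepancies on Windows 10 data that were absent on Windows XP data, that is exactly the kind of published, repeatable evidence the post above asks for.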

* Education and training is very closely related. If someone who has only seen client NTFS file systems is confronted with a Windows 2012 R2 deduplicated file system, the interpretation is not sure to match reality. This is not 'forensic education', mind you – this is basic IT platform education. (And this is one area where certification may be applicable as well as useful: IT platform certification.)

* The question of pure research (mentioned in Q7 responses) is related, though I would not make any hard and fast division between pure and applied research in this case: the same principles need to apply in both. Unfortunately, the requirements this places on work in the field are not always understood.

In other questions, education is listed as one way to solve some of the identified problems. To the extent that such problems are fundamentally a lack of knowledge – which I think is the case more often than is generally admitted – additional research and/or dissemination of the results from such research may be a way to provide a basis for such education.

One of the basic questions here would be: 'what _don't_ we know well enough?'

Q7 seems very closely related to the 'acquisition or recovery, examination, processing and analysis' mentioned in the various definitions. I suspect that – to some extent – this reflects the agenda of the group.

But what then? Analysis is not an end in itself. The results must be presented, understood, disputed, and generally get a chewing-over; conclusions must be presented to some degree of confidence and accuracy. To some extent that must be done within the field of 'digital forensics', but a part of it is done outside it. What do the people in *those* fields think needs to be done as to forensic science capability? Are the current results the field is producing satisfactory or not? I suspect that is an area where improvements may be wanted.

The 'Strengthening Forensic Science in the United States: A Path Forward' (2009) document suggests that there is much to be done in the US. I suspect that may be equally true for the UK. Recommendation 4 of that document (roughly speaking, remove forensics laboratories from the control of law enforcement or prosecutors' offices) had me surprised. That's one example of the kind of 'improvements' that outsiders to the field may consider important. (Though I don't know what the status is in the UK; it may not be relevant at all.)

 
Posted : 02/08/2015 11:49 am
(@fcarlysle)
Posts: 4
New Member
Topic starter
 

Thanks for your thoughts - please keep them coming!

 
Posted : 03/08/2015 1:06 pm