ACPO Principles Revised


dan0841
Senior Member
 

Re: ACPO Principles Revised

Post Posted: Nov 06, 19 20:04

- Rich2005


I don't believe it's un-doable or useless at all.

I think you're missing the point as the intention isn't to get a tool that's validated as being perfect. That cannot possibly happen.

The point was that, instead of hundreds of labs running extremely limited tests, likely designed not to fail, with the sole aim of passing ISO 17025, wasting a huge amount of collective time and money, some co-ordinated testing of major tools, as they're released, would be far more efficient. You could get the central body to run the kind of limited test that everyone would be doing to achieve ISO 17025 and report the results for everyone, rather than duplicating the work 100-fold (or more).


You absolutely hit the nail on the head here. Some of the current limited testing is a token effort designed to pass 17025 and is farcical, even from accredited organisations.

It validates very little and is barely worth the paper it's written on: mass duplication, much of it devised to fudge a 'pass' of a tool/method.
 
  

tootypeg
Senior Member
 

Re: ACPO Principles Revised

Post Posted: Nov 06, 19 20:53

In terms of testing, can someone give me an example of what should be done? I accept there are likely, as you all state, 'tests designed to pass'.

How, for example, do we test a mainstream tool (X-Ways, FTK, EnCase, etc.)? Is this what you mean?
 
  

trewmte
Senior Member
 

Re: ACPO Principles Revised

Post Posted: Nov 07, 19 08:53

- dan0841
- Rich2005


I don't believe it's un-doable or useless at all.

I think you're missing the point as the intention isn't to get a tool that's validated as being perfect. That cannot possibly happen.

The point was that, instead of hundreds of labs running extremely limited tests, likely designed not to fail, with the sole aim of passing ISO 17025, wasting a huge amount of collective time and money, some co-ordinated testing of major tools, as they're released, would be far more efficient. You could get the central body to run the kind of limited test that everyone would be doing to achieve ISO 17025 and report the results for everyone, rather than duplicating the work 100-fold (or more).


You absolutely hit the nail on the head here. Some of the current limited testing is a token effort designed to pass 17025 and is farcical, even from accredited organisations.

It validates very little and is barely worth the paper it's written on: mass duplication, much of it devised to fudge a 'pass' of a tool/method.



Worth a (re)read:

An analysis of digital forensic examinations: Mobile devices versus hard disk drives utilising ACPO & NIST guidelines
Digital Investigation 8 (2011) 135 - 140 [doi:10.1016/j.diin.2011.03.002]


- 4.1. NIST tool evaluation
The guidelines requirements set out clear expectations of what a tool claims to do and what it can actually do after the vigorous testing phases.


- 6. Conclusion
Both NIST and ACPO guidelines need to be updated quite frequently as mobile devices are constantly evolving and their features becoming more ubiquitous. The forensic regulator for the UK, Mr Andrew Rennison, appointed in 2008, and his committee are being tasked with reviewing the principles of ACPO. It was stated by a senior police officer at CFET, 2009 that he is aware that the current ACPO principles require “modernising” to cope with the rapid changes in technology.



MTEB UK SEMINARS 2016 II v03- QA Lab Accreditation.pdf
www.dropbox.com/s/kun2...tation.pdf
_________________
Institute for Digital Forensics (IDF) - www.linkedin.com/groups/2436720
Mobile Telephone Examination Board (MTEB) - www.linkedin.com/groups/141739
Universal Network Investigations - www.linkedin.com/groups/13536130
Mobile Telephone Evidence & Forensics trewmte.blogspot.com 


Last edited by trewmte on Nov 07, 19 14:23; edited 1 time in total
 
  

jaclaz
Senior Member
 

Re: ACPO Principles Revised

Post Posted: Nov 07, 19 11:16

- tootypeg
In terms of testing, can someone give me an example of what should be done? I accept there are likely, as you all state, 'tests designed to pass'.

How, for example, do we test a mainstream tool (X-Ways, FTK, EnCase, etc.)? Is this what you mean?


Look no further than the (already linked to) NIST/DHS tests on Encase and FTK (limited to their Registry search/parsing functions, i.e. a very small subset of what they can do).

www.forensicfocus.com/...6/#6600956

Look at the various (many) entries in the "test dataset" and at the (huge) number of features tested; it is a LOT of work to:
1) create all kinds of "common" and "uncommon" items in the test dataset
2) check all the features
3) compare results with the actual (known) contents of the data sample at hand

I believe that *any* laboratory will take a couple of machines/real cases, check a small subset of the features (i.e. the most common features on a very common test dataset, which will most probably pass fine [1]) and call it a day.
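The three steps above (build a known dataset, run the features, compare against ground truth) can be sketched as a trivial harness. Purely illustrative - it assumes both the dataset manifest and the tool's export are CSV files sharing a hash column, which is my invention rather than any particular tool's actual format:

```python
import csv

def compare_to_ground_truth(manifest_path, tool_report_path, key="md5"):
    """Compare a tool's findings against a known test-dataset manifest.

    Both files are CSVs with a column named `key` (hypothetical layout).
    Returns (missed, unexpected): items the tool failed to report, and
    items it reported that are not in the ground truth.
    """
    def load(path):
        with open(path, newline="") as fh:
            return {row[key] for row in csv.DictReader(fh)}

    expected = load(manifest_path)
    reported = load(tool_report_path)
    return sorted(expected - reported), sorted(reported - expected)
```

The point of the sketch is only that the expensive part is building and maintaining the manifest, not running the comparison - which is exactly why doing it once centrally beats doing it in every lab.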

N.B.: if you check the "defects" found, none of them are actually particularly "serious", but - only as a single example for Encase - the:

External Device
◦ Partial external device related data was reported. [Windows: ALL]
- The tool identified all USB storage devices, but it did not report several device
related metadata such as ‘Last Connected Date’.

might (or might not) make a difference in a real case.

jaclaz


[1] It is not like FTK and EnCase are untested tools (they have been around for years, and - speaking of the Windows Registry - it has been basically the same since NT 3.1); of course they are tested and work just fine on - say - 99.99% of the registries you can find.

Imagine the corresponding issues in mobile forensics where each and every manufacturer and/or each and every phone model changes something continuously ...
_________________
- In theory there is no difference between theory and practice, but in practice there is. - 
 
  

athulin
Senior Member
 

Re: ACPO Principles Revised

Post Posted: Nov 08, 19 09:13

- tootypeg
In terms of testing, can someone give me an example of what should be done as I accept there is likely as you all state 'tests designed to pass'


Some kind of basic functionality tests. File metadata, particularly that which influences forensic questions. Timestamps, ownership, access rights, and perhaps metadata that affect other things that are of particular forensic interest. Similar metadata from file archives, from backup files, restore points, what have you.

Does the tool extract them correctly? (EnCase managed to get exFAT timestamps confused pretty early on, claiming that one timestamp was another, and vice versa - something I felt should have been caught if even minimal quality assurance had been present.) Are they interpreted correctly, or at least unambiguously? (You don't want to find that a tool reports a few hundred different timestamps, from a binary perspective, as the same timestamp when they are reported to the user.)
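The exFAT example is easy to turn into a concrete test, because the packed 32-bit DOS/FAT-style timestamp layout is documented in the file system specifications: a validation suite can carry known binary values alongside their expected decodings. A minimal sketch (the function name is mine; the bit layout is per the FAT/exFAT specs):

```python
from datetime import datetime

def decode_fat_timestamp(value: int) -> datetime:
    """Decode a packed 32-bit FAT/exFAT timestamp:
    bits 31-25 = years since 1980, 24-21 = month, 20-16 = day,
    bits 15-11 = hour, 10-5 = minute, 4-0 = seconds / 2."""
    return datetime(
        year=1980 + ((value >> 25) & 0x7F),
        month=(value >> 21) & 0x0F,
        day=(value >> 16) & 0x1F,
        hour=(value >> 11) & 0x1F,
        minute=(value >> 5) & 0x3F,
        second=(value & 0x1F) * 2,
    )
```

A test fixture then simply asserts that a known raw value decodes to the known wall-clock time - the kind of check that would have caught the swapped-timestamp defect above.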

Basic interpretation of additional data: e.g. is ownership information (typically some kind of binary data) converted to correct, readable, unambiguous user information? What about access rights? What about correct handling of any version changes? (A bit like tools that know about /etc/passwd but can't cope with /etc/shadow. I don't know of any, but any change in OS implementation affecting these areas must also be treated correctly. Adding an /etc/shadow file to an operating system that doesn't use it only confuses the forensic analyst, along with any tool he/she may rely on.)
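The ownership point can be illustrated with the simplest possible case: resolving numeric UIDs through the /etc/passwd recovered from the evidence image (never the examiner's own machine). A rough sketch, with the function name my own invention:

```python
def load_passwd_map(passwd_bytes: bytes) -> dict:
    """Build a uid -> username map from an /etc/passwd file recovered
    from the evidence image. Standard passwd layout: colon-separated
    fields, with the username in field 0 and the numeric UID in field 2."""
    users = {}
    for line in passwd_bytes.decode("utf-8", "replace").splitlines():
        if not line or line.startswith("#"):
            continue
        fields = line.split(":")
        if len(fields) >= 3 and fields[2].isdigit():
            users[int(fields[2])] = fields[0]
    return users
```

The test for a tool is then whether the binary ownership data on each inode is mapped through this table correctly, and whether the tool behaves sensibly when the table is absent or inconsistent.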

The basic principle, I think, is to identify the information that must be handled correctly and reliably if it is to be of any forensic value. And then test that.

Add to that: correct identification of failure situations. This is difficult to explain, but ... there are ISO 9660 volumes that are correctly formatted according to the standard, but which most forensic tools will not recognize as ISO 9660 volumes, typically saying that this is not a correctly formatted file system or, in bad cases, simply crashing. If the tool incorrectly tells you that it is not an ISO 9660 file system, further processing of an anomalous (but correct) file system may be stopped short. If the tool had said 'this looks like an ISO 9660 file system, but I can't make sense of ... whatever', an FA would be in a much better position to make a correct decision.

(Anyone asking 'really? how often does that really happen?' only proves my point. For reasons that should be obvious.)
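The ISO 9660 point can be made concrete: a probe (whether in a tool or in a test harness) can at least distinguish "no signature at all" from "signature present but descriptor anomalous", rather than returning a bare yes/no. A rough sketch - the function name and result wording are mine; the only facts assumed are the on-disk layout from the standard (volume descriptors start at byte offset 32768, with the descriptor type in byte 0 and the 'CD001' identifier in bytes 1-5):

```python
def probe_iso9660(path):
    """Graded ISO 9660 probe: report what was actually seen, so an
    anomalous-but-valid image isn't silently rejected outright."""
    with open(path, "rb") as fh:
        fh.seek(32768)          # volume descriptors begin at sector 16
        vd = fh.read(2048)      # one 2048-byte descriptor
    if len(vd) < 2048:
        return "too short to contain an ISO 9660 volume descriptor"
    if vd[1:6] != b"CD001":
        return "no ISO 9660 signature at sector 16"
    if vd[0] == 1:
        return "ISO 9660 primary volume descriptor found"
    return "ISO 9660 signature found, but first descriptor has type %d" % vd[0]
```

The last two branches are exactly the "this looks like ISO 9660, but ..." answer argued for above: enough information for the analyst to decide whether to dig further by hand.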

RAID reconstitution tools probably belong here as well: do they work in situations where they shouldn't?
Human error in general is a closely related area of testing.

A related form of testing is meta-testing: what defects have been reported and corrected (or remain uncorrected) over the past year or so? The manufacturer will simply have to provide that information. Can any conclusions be drawn about systematic errors in tool development, in areas of functionality, in quality assurance? Those conclusions could easily tell us where additional testing may be required.

(Case in point: the Danish problem of cell-phone forensics: www.forensicfocus.com/.../t=18014/)  
 
