Tool Vendors and Testing (an open study)


Senior Member

Tool Vendors and Testing (an open study)

Posted: Dec 18, 2019 21:01

Bit of a curious one: I assume vendors operate on here.

I wonder if there would be the possibility of engaging with any who might be willing to take part in a tool-testing study that tests tools against each other (on comparable functions etc.) to evaluate performance. This is very vague at the moment, but let's assume:

Vendors A, B & C are tested against functions D, E & F (which all tools perform), and the results are reported jointly.

Would any be interested in collaborating on these activities? And if not, what barriers might there be?  

Senior Member

Re: Tool Vendors and Testing (an open study)

Posted: Dec 19, 2019 00:29

If it were a reputable agency running the tests, I am sure some would be happy to take part.

Except for the really basic functions, I think an apples-to-apples comparison would be difficult.

Example of a basic function:
Image a hard drive to produce an E01 disk image.
But even this could be problematic if the levels of compression were different, if there was a verification step or extra hashing performed by one of the tools, or if there were differences in the handling of disk errors.
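One way around the container-level differences (compression settings, extra hashing) is to compare the decoded data rather than the image files themselves. A minimal sketch, assuming each tool's E01 can be exported back to a raw stream (the file names here are hypothetical):

```python
import hashlib

def stream_hash(path, algo="sha256", chunk=1 << 20):
    """Hash a raw (uncompressed) image stream in fixed-size chunks."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

# Two E01s written with different compression levels can still be
# compared by hashing the decoded data they contain, e.g.:
#   stream_hash("tool_a_export.raw") == stream_hash("tool_b_export.raw")
```

This only establishes that both tools captured the same bytes; it says nothing about speed, error handling, or audit-trail quality, which would need separate criteria.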

Example of a high-level function:
Index the files on the hard drive, search for files containing 50 different words and phrases, then export the results to CSV.
This is really hard to keep consistent. What file types are indexed? Is OCR performed? Are you testing in a low-RAM environment, or on hardware with a large number of CPU cores? Is string extraction done on binary files? Is the indexing recursive (a PDF in a Zip in a Zip in a PST)? What disk image format was used? What is the mix of file types? Is unallocated space on the drive indexed? Etc.
In short, there are dozens of variables and permutations.
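The combinatorial problem above is easy to make concrete. A quick sketch, using a handful of hypothetical test dimensions drawn from the variables listed (the axes and values are illustrative, not a proposed test plan):

```python
from itertools import product

# Illustrative axes for an indexing/search comparison; each axis is
# one of the variables mentioned above.
dimensions = {
    "ocr": [True, False],
    "ram": ["4GB", "32GB"],
    "cores": [2, 16],
    "binary_string_extraction": [True, False],
    "recursion_depth": [0, 1, 3],
    "image_format": ["E01", "raw", "AFF4"],
    "index_unallocated": [True, False],
}

# Every combination is a distinct test configuration.
configs = [dict(zip(dimensions, values))
           for values in product(*dimensions.values())]
print(len(configs))  # 2*2*2*2*3*3*2 = 288 configurations
```

Even seven axes with two or three values each yields hundreds of runs per tool, which is why any joint study would need to agree on a small, fixed subset of configurations up front.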

Senior Member

Re: Tool Vendors and Testing (an open study)

Posted: Dec 19, 2019 10:08

Prof. Buchanan and team from Napier University published a paper on:

Evaluating Digital Forensic Tools (DFTs)

6 Conclusion
This paper has outlined evaluation and validation methodologies, where some of these are too complex to be used by digital forensics investigators, such as Carrier's abstraction layers model [19], and others do not cover all aspects of the tools [32]. For all of them, none has been implemented in such a way that enables automation of the validation process. This means that testing may need to be performed manually. This is obviously an issue as it takes away a significant amount of time from investigators.

Beckett's [3] methodology can be used to define the requirements for validating digital forensics functions. This is a good methodology which covers all aspects in the definition of the validation process. However, it does not cover the actual implementation of the validation process, so another methodology is needed. A good candidate is Wilsdon's methodology [13], based on black-box testing.
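In the spirit of the black-box approach the quote mentions, a validation run can be reduced to: run each tool against a reference image with known contents, then score what it reports against the ground truth. A minimal sketch (the file names and the idea of a flat artefact set are simplifying assumptions; real tools would be driven via their own CLIs or APIs):

```python
def score(ground_truth: set, reported: set) -> dict:
    """Compare a tool's reported artefacts against the known
    contents of a reference image."""
    found = ground_truth & reported
    return {
        "recall": len(found) / len(ground_truth),
        "false_positives": len(reported - ground_truth),
        "missed": sorted(ground_truth - reported),
    }

# Hypothetical reference image contents and one tool's output:
truth = {"doc1.pdf", "mail.pst", "photo.jpg"}
tool_output = {"doc1.pdf", "photo.jpg", "temp.tmp"}
print(score(truth, tool_output))
```

Because the scoring only looks at inputs and outputs, it needs no knowledge of a tool's internals, which is exactly what makes a black-box style feasible for closed-source commercial tools.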

Institute for Digital Forensics (IDF) - www.linkedin.com/groups/2436720
Mobile Telephone Examination Board (MTEB) - www.linkedin.com/groups/141739
Universal Network Investigations - www.linkedin.com/groups/13536130
Mobile Telephone Evidence & Forensics trewmte.blogspot.com 
