
Standardisation in forensics

(@tootypeg)
Posts: 173
Estimable Member
Topic starter
 

Hi guys,

I have posted this as a result of indi's post not long ago and the responses of Trewmte and Jhup. I am a research academic in digital forensics, based in a UK university, and one of my areas of research is standardization of processes and result output in DF. I am not asking for research ideas; I am just curious about everyone's thoughts, considering it was also raised in a recent interview with Eoghan Casey on here.

I mean ideally, what do we actually want to achieve by standardization in the field?

I suspect we will always have different tools which will each maintain some aspect of a niche output, however small, in order to maintain a need for the product itself. Are we looking for some form of additional process that sanitizes all data formats in order to produce one global form – sort of like how the phone industry wants all handsets to adopt one connection type? Will this ever be realistic? It would require mass cooperation between organizations in this field, with arguably some form of governing body to regulate and oversee the process.

Even if we could achieve this, what advantage would be achieved? Higher standards of investigation? Facilitation of the transfer of knowledge between parties? Would this realistically happen, and would separate organizations be willing to share anyway? I'm sure it would provide some advantages in speeding up the interpretation of data, but will the advantages outweigh the time and costs that would need to be spent developing and implementing a specific standardized process?

Just thought I would throw a few thoughts out there, given that it's early morning 😈

I think this is a fascinating area for debate and would be interested to hear everyone's motives for achieving standardization in the field.

 
Posted : 19/05/2015 1:04 pm
(@athulin)
Posts: 1156
Noble Member
 

I mean ideally, what do we actually want to achieve by standardization in the field?

Who's 'we'? It needn't be the DF practitioners who want standardization, it could very well be their customers who aren't satisfied with the results they get.

For example, ISO 9001 is a standard, but it doesn't mean two organizations will produce the same implementation of it. They do, however, have to create a quality system, and that's something by which they can be compared – not by themselves, but by their customers.

Terminology is a probable field of standardization. Indeed, the lack of a homogeneous terminology could very well be an indication of the lack of a proper base to work on.

Competence is another. One thing I would very much like to see is a professional certification in the IT field – unrelated to the DF field. For example, a DFA who investigates a case based on Windows should be able to show a certain level of Windows competence, perhaps similar to that of a sysadm of the platform in question, not just an 'I took a Windows XP forensics course five years ago'. If Office is involved, add some useful Office certification. And so on.

Similar levels of competence in DF are also needed – far too many tools as well as analysts get confused when more than one time zone is involved. (I do myself …)

Even if we could achieve this, what advantage would be achieved? Higher standards of investigation?

Higher standards of competence in IT, both generally and specifically. Lower risk of variant terminology, and of the consequent confusion.

Facilitation of the transfer of knowledge between parties?

A large part of that lands in the IT field. It should not be pulled into digital forensics; that field has its own problems.

… would this realistically happen, and would separate organizations be willing to share anyway?

That's another level of standardization. If DF is to be accepted as a field based on a scientific foundation, there must be some degree of traceability and reproducibility. However, that could be done in other ways – for example by NIST-like product validation of well-defined fields.

Me, I'd like to see all kinds of synthetic test data so that tools can be evaluated. Say, test data for testing mail extraction and manipulation from mail databases. Or for handling different types of ZIP or RAR or TAR archives …
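To make that concrete, here is a minimal sketch (in Python; the archive name, file names and contents are invented for illustration) of how synthetic ZIP test data with a known manifest might be generated, and a tool's extraction output checked against it:

```python
# Sketch: build a ZIP archive with known contents plus an MD5 manifest,
# then check an extraction tool's output directory against that ground truth.
import hashlib
import json
import zipfile
from pathlib import Path

def build_test_archive(archive_path: str) -> dict:
    """Create a ZIP with known contents; return a name -> MD5 manifest."""
    samples = {                       # invented sample files
        "plain.txt": b"known ASCII content",
        "unicode_\u00e9.txt": "accented filename".encode("utf-8"),
        "empty.bin": b"",
    }
    manifest = {}
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in samples.items():
            zf.writestr(name, data)
            manifest[name] = hashlib.md5(data).hexdigest()
    return manifest

def verify_extraction(extracted_dir: str, manifest: dict) -> list:
    """List (name, problem) pairs for files a tool missed or altered."""
    problems = []
    for name, expected_md5 in manifest.items():
        path = Path(extracted_dir) / name
        if not path.is_file():
            problems.append((name, "missing"))
        elif hashlib.md5(path.read_bytes()).hexdigest() != expected_md5:
            problems.append((name, "content mismatch"))
    return problems

if __name__ == "__main__":
    manifest = build_test_archive("test_data.zip")
    Path("manifest.json").write_text(json.dumps(manifest, indent=2))
    # After running the tool under test: print(verify_extraction("out", manifest))
```

The same pattern – generate with a known manifest, extract with the tool under test, diff against the manifest – would extend to RAR, TAR or mail databases.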

 
Posted : 19/05/2015 10:55 pm
Adam10541
(@adam10541)
Posts: 550
Honorable Member
 

I'll be the devil's advocate here and say it will never happen; there are simply too many variables in any given data set.

You can have a 'test' PST file, for example, and run all the tests you want to show what can and should be recovered, but the second you try to apply that to the real world it falls down. Different configurations and settings on servers/PCs etc. will have an effect on how tools run, not to mention the literally hundreds of different email clients out there.

There are certainly aspects that can and have been standardised, e.g. MD5 hash matching, E01 image formats, and sector wiping for sanitization, to name a few.
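As a trivial illustration of the hash-matching part, re-hashing an acquired image and comparing it with the hash recorded at acquisition is about the one check everyone already agrees on – a minimal Python sketch, where the image name and recorded hash are placeholders:

```python
# Sketch: re-hash an image in chunks and compare against the MD5 recorded
# at acquisition time. "evidence.dd" and the recorded value are placeholders.
import hashlib

def md5_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

recorded = "9e107d9d372bb6826bd81d3542a419d6"   # placeholder acquisition hash
if md5_of_file("evidence.dd") != recorded:
    print("image no longer matches its acquisition hash")
```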

Perhaps it would be possible to have a 'minimum standard' that all tools should meet; e.g., staying with the PST example, there would be an accepted minimum set of information any forensic tool should be able to extract whilst maintaining metadata etc.
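A hypothetical sketch of what checking a tool's PST export against such a minimum standard could look like – the required fields below are invented for illustration, not an agreed standard:

```python
# Sketch: every per-message record a tool exports from a PST must carry at
# least these fields. The field names are invented, not a real standard.
MINIMUM_FIELDS = {"sender", "recipients", "subject", "sent_time", "message_id"}

def failing_records(records: list) -> list:
    """Return indices of exported records missing or blanking a required field."""
    return [
        i for i, rec in enumerate(records)
        if not MINIMUM_FIELDS <= rec.keys()
        or any(rec[f] in (None, "") for f in MINIMUM_FIELDS)
    ]

# Example: records as parsed from some tool's CSV/JSON export
sample = [{"sender": "a@b.c", "recipients": "d@e.f", "subject": "hi",
           "sent_time": "2015-05-20T09:00:00Z", "message_id": "<1@b.c>"}]
assert failing_records(sample) == []
```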

But who's the arbiter to decide what those levels should be? And, taking that argument further, how do you quantify the many 'non-forensic' tools that forensic practitioners use on a regular basis?

I've noticed that those with a purely academic background tend to favour open-source Linux-based tools, generally because universities don't have massive budgets, so that's what people are taught to use. These tools are more than capable of performing most of the tasks the enterprise-level suites can; however, their very nature of being open source means there is no way to standardise what they are doing or how they are doing it. Further to that, there are the many bespoke tools that people develop. I've had help on this very forum from a couple of those clever people who can write their own software; how do we control that?

Standardising training, and forcing DF practitioners to actually undergo some tool-independent training, would be something I'd like to see.

 
Posted : 20/05/2015 5:45 am
(@mscotgrove)
Posts: 938
Prominent Member
 

The problem with standards is that the world changes very quickly. When I started in the world of IT, the standard for word processing was WordStar, CP/M (8-bit) and 8" floppy disks. At the same time a 20MB hard drive (no typing error) cost about £800, and 256KB of RAM was a lot.

10 years ago tablets only really existed on Star Trek (iPad 1 was released in 2010).

I think the only type of standard will be good working practices.

 
Posted : 20/05/2015 1:53 pm
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

10 years ago tablets only really existed on Star Trek (iPad 1 was released in 2010).

Well, I had a Compaq Concerto in 1993:
http://en.wikipedia.org/wiki/Compaq_Concerto
http://criggie.org.nz/laptop/concerto/mvc-269f.jpg

(only to show how the MS Surface may be new technology but surely not a new idea).

There is nothing new under the sun.

I think that – like the de facto interchange standards we have for documents or spreadsheets – the only aspects that can be improved are:

  1. terminology
  2. output format

I mean, a (say) file has a number of standard attributes, and each of these attributes should have the "same" standardized name. Right now each and every tool may output anything among a .txt, a .csv, a .tsv, a .xls or an SQLite database, containing (for the same "object", let's say a file entry in the $MFT of an NTFS volume) more or less the same "attributes", rigorously called slightly differently.
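A sketch of what such a "Rosetta Stone" could look like in practice – a mapping from each tool's own column names onto one shared vocabulary; the tool names and columns below are invented placeholders, not real tool output:

```python
# Sketch: normalise each tool's CSV column names for the same $MFT
# attributes onto one shared schema. Tool/column names are invented.
import csv

ROSETTA = {
    "toolA": {"Filename": "file_name", "Created": "created_utc", "MFT#": "mft_record"},
    "toolB": {"name": "file_name", "crtime": "created_utc", "record_no": "mft_record"},
}

def normalise(csv_path: str, tool: str) -> list:
    """Read one tool's export and rename its columns to the shared names."""
    mapping = ROSETTA[tool]
    with open(csv_path, newline="", encoding="utf-8") as f:
        return [
            {mapping.get(col, col): value for col, value in row.items()}
            for row in csv.DictReader(f)
        ]
```

Once both tools' exports are pushed through the same mapping, the outputs become directly comparable row by row.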

Completely Off Topic 😯 but to get an idea of what I am talking about, check this (old) thingy here:
http://jaclaz.altervista.org/Projects/USB/USBstick.html
Around halfway down I inserted a "Rosetta Stone" to cross-reference how exactly the same things were called slightly differently in 4 programs that essentially had the same scope and parsed exactly the same RAW data.

jaclaz

 
Posted : 20/05/2015 3:20 pm
(@trewmte)
Posts: 1877
Noble Member
 

All good feedback above.

The reason why I mentioned metrology is to see whether it is possible to have a minimum standard. In other words, start small and work in areas where agreement is high amongst those working in digital forensics.

Even before writing test scripts or anything else, start with, say, the humble physical leads/cables and terminating plugs. They interface with the test tool and the target device. What forensic requirements should there be for these cables/leads/plugs (e.g. VGA, DVI, HDMI, Ethernet, etc.)? How many people keep a traceable record of what is being used to acquire evidence in the test lab? (A sketch of such a record follows the questions below.)

ISO 9001 has been mentioned, and this standard provides a useful guide on record keeping. In most cases users take for granted that the cable/lead/plug is OK and just swap it out if it is deemed not to be working. Some simple questions:

1) Is there a cable/lead tester on the market?
2) What results can be obtained?
3) How are the output results determined?
4) How do the results compare with manufacturers' guidelines for MTTF and MTBF?
5) Can the results being scrutinised be improved?
6) Can a minimum standard be achieved?
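To make the record-keeping point concrete, a hypothetical sketch of one traceable cable-test log entry – all field names and values are invented, not taken from any standard:

```python
# Sketch: one ISO 9001-style traceable record per cable/lead test, appended
# to a running log. All field names and values are invented placeholders.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class CableTestRecord:
    asset_id: str      # lab inventory tag on the cable/lead
    cable_type: str    # e.g. "HDMI", "Ethernet"
    tester: str        # cable tester used, if any
    result: str        # "pass" / "fail"
    tested_by: str
    tested_at: str

record = CableTestRecord(
    asset_id="CBL-0042", cable_type="Ethernet",
    tester="generic continuity tester", result="pass",
    tested_by="examiner initials",
    tested_at=datetime.now(timezone.utc).isoformat(),
)

with open("cable_test_log.jsonl", "a", encoding="utf-8") as log:
    log.write(json.dumps(asdict(record)) + "\n")
```

Each acquisition could then cite exactly which leads were used and when they last passed a test.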

Mundane and tedious testing is never welcomed, but long before digital forensics raised its head these tests were going on. My own earlier experiences were in telecomms manufacturing. We worked to factory type-approval guidelines BABT340 and ISO 9001. Record keeping and testing of tools were fundamental and mandatory to retain quality. Devices were subjected to standards such as BS 6301, BS 6305, BS 6317, BS 6789, etc. I still believe that BABT340 and the other standards and guidelines for the manufacture and supply of telecomms and datacomms products placed on the market are far more aligned to digital forensics, and provided industry-specific stepping-stone guidance towards minimum standards, because all manufacturers were being channelled through the same process.

Just because some of the examples given by the above standards have been replaced with EU or other standards doesn't mean we cannot learn from those industry-specific experiences and adopt a similar system.

 
Posted : 22/05/2015 10:52 am