
The need for "under the hood" knowledge

39 Posts
8 Users
0 Likes
2,575 Views
keydet89
(@keydet89)
Posts: 3568
Famed Member
Topic starter
 

All,

I found this article very interesting
http://www.unixreview.com/documents/s=9943/ur0512i/ur0512i.html

Specifically:
"Does this mean that a point-and-click GUI application is the answer? No — you still need to be able to explain what is behind the GUI. This comes along with some extra training outside the tools themselves, such as with forensic books or classes."

What this tells me is that GUI tools like EnCase are force multipliers…one can more easily make a process repeatable; i.e., open the case, fill in these fields, run these e-scripts.

However, regardless of the tools used, one needs to be able to explain what happened when they pressed that button. I've always found that the more I understand something, the better I am able to explain it.

Harlan

 
Posted : 09/01/2006 5:54 pm
(@fatrabbit)
Posts: 132
Estimable Member
 

I've not read the article yet, but I totally agree with your point about some of the better GUI tools being force multipliers. A comprehensive, well-written tool can only make you more efficient in your work once you've learnt how to use it, and knowing what's going on under the hood from a conceptual perspective can only help you 'drive' it better, as well as explain to others what's going on.

 
Posted : 09/01/2006 6:42 pm
keydet89
(@keydet89)
Posts: 3568
Famed Member
Topic starter
 

Agreed. While I do see the GUI tools as "force multipliers", I also believe that at some point, there needs to be greater education to better use the tools.

Right now, it appears to me that there's little going on to develop better tools. For example, LEOs generally seem to be looking for a tool that meets their needs, rather than contacting someone to write the tool they need. Now, I know that this isn't always the case…the creators of EnCase and ProDiscover get feedback all the time, and there are contractors building tools to meet specific needs.

However, you need to go beyond the looking glass. GUI tools do a great job of telling you that there are alternate data streams available within the file system. However, they tell the analyst very little about them…for example, what is that ADS named "zoneID" and is it important that it's bigger than 28 bytes? Or, is it important that a Word document has an ADS named "\05SummaryInformation" attached to it?
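
To make that concrete, here's a minimal sketch (Windows-only Python, using the Win32 FindFirstStreamW API via ctypes; the file path is hypothetical, and the 28-byte threshold is just the rule of thumb mentioned above) that enumerates a file's streams and flags an oversized zone-identifier-style stream:

    import ctypes
    import ctypes.wintypes as wt

    # WIN32_FIND_STREAM_DATA: stream size plus a name of up to MAX_PATH+36 chars.
    class WIN32_FIND_STREAM_DATA(ctypes.Structure):
        _fields_ = [("StreamSize", ctypes.c_longlong),
                    ("cStreamName", ctypes.c_wchar * 296)]

    k32 = ctypes.WinDLL("kernel32", use_last_error=True)
    k32.FindFirstStreamW.restype = wt.HANDLE
    k32.FindFirstStreamW.argtypes = [wt.LPCWSTR, ctypes.c_int, ctypes.c_void_p, wt.DWORD]
    k32.FindNextStreamW.argtypes = [wt.HANDLE, ctypes.c_void_p]
    k32.FindClose.argtypes = [wt.HANDLE]

    INVALID_HANDLE_VALUE = ctypes.c_void_p(-1).value

    def list_streams(path):
        """Yield (name, size) for each NTFS stream on path, ADSs included."""
        data = WIN32_FIND_STREAM_DATA()
        h = k32.FindFirstStreamW(path, 0, ctypes.byref(data), 0)  # 0 = FindStreamInfoStandard
        if h in (None, INVALID_HANDLE_VALUE):
            return
        try:
            while True:
                yield data.cStreamName, data.StreamSize
                if not k32.FindNextStreamW(h, ctypes.byref(data)):
                    break
        finally:
            k32.FindClose(h)

    for name, size in list_streams(r"C:\cases\suspect.doc"):
        if name != "::$DATA":  # skip the default (unnamed) data stream
            note = "  <-- larger than the usual 28 bytes" if size > 28 else ""
            print(name, size, "bytes", note)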

It's things like this that can be important. Is it a matter of concern with your typical CP case? Perhaps not. However, it could be…more specifically, it could be more important as cybercrime becomes more sophisticated.

Harlan

 
Posted : 09/01/2006 7:55 pm
(@fatrabbit)
Posts: 132
Estimable Member
 

Interesting and valid points; where do I start?

> Agreed. While I do see the GUI tools as "force multipliers", I also believe that at some point, there needs to be greater education to better use the tools.

Do you mean, for instance, that there is a certain deficiency in how people use the tool itself, or are you saying that there needs to be a better understanding of those 'under the hood' concepts? I agree completely with the latter argument: better knowledge of what's going on behind the GUI can only lead to better analysts, and therefore better analysis, and help defeat those arguments against so-called point-and-click investigators. A side effect of this greater understanding and awareness should also be an impetus for the community to strive for, and demand, those much-needed better tools.

> GUI tools do a great job of telling you that there are alternate data streams available within the file system. However, they tell the analyst very little about them…

I find this an interesting subject. A software tool, however complicated it may be, can only ever have finite functionality or utility; otherwise it risks becoming an automated, comprehensive, end-to-end procedure. There has to be a line somewhere between what the software offers and tells the analyst, and the responsibility the analyst must take upon themselves to find those extra things out. And part of this, as you say, has to be education, especially in the face of the ever-increasing sophistication of cybercrime.

 
Posted : 09/01/2006 8:40 pm
arashiryu
(@arashiryu)
Posts: 122
Estimable Member
 

I agree.

For example, I have been using WinHex for some time. But as I read "File System Forensic Analysis" by Brian Carrier, I have realized that there is so much information right in front of you in WinHex, and yet you don't see it.

Recently, at the Techno Forensics 2005 conference, Chris Taylor gave a presentation on NTFS and FAT forensics using WinHex. He clearly demonstrated how valuable it is to master the fundamentals of the file system you are examining. I saw clearly how an automated/tailored tool would have missed or ignored certain artifacts of a file.

 
Posted : 09/01/2006 10:54 pm
(@armresl)
Posts: 1011
Noble Member
 

What would be missed or ignored by an automated tool?

 
Posted : 10/01/2006 3:16 am
(@fatrabbit)
Posts: 132
Estimable Member
 

Wouldn't that depend on what the programmer forgot to code for?

 
Posted : 10/01/2006 3:35 am
keydet89
(@keydet89)
Posts: 3568
Famed Member
Topic starter
 

> What would be missed or ignored by an automated tool?

Quite a bit, actually, though it depends on what the programmer forgot to code for. For example, most utilities that get metadata from Word documents use the MS API to do so. However, if you understand where the data is kept, you can pull it out yourself, plus other information, such as whether the document was written or revised on a Mac.
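
For instance, here's a minimal sketch that reads the properties straight out of the OLE compound file rather than going through the MS API. It assumes the third-party olefile module (pip install olefile); the document path is hypothetical, and the codepage test is just one heuristic for Mac origin, not necessarily the indicator Harlan has in mind:

    import olefile

    # A few property IDs defined for the \x05SummaryInformation property set.
    PIDS = {2: "title", 3: "subject", 4: "author", 8: "last saved by",
            9: "revision", 18: "application"}

    ole = olefile.OleFileIO(r"C:\cases\suspect.doc")
    if ole.exists("\x05SummaryInformation"):
        props = ole.getproperties("\x05SummaryInformation")
        for pid, label in sorted(PIDS.items()):
            if pid in props:
                print(label, ":", props[pid])
        # Property 1 is the codepage; 10000 is Mac Roman, a hint that the
        # document was written or revised on a Mac.
        if props.get(1) == 10000:
            print("codepage 10000 (Mac Roman) -- possible Mac origin")
    ole.close()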

Also, automated tools are great at pointing out ADSs, and they tell you the size of the ADS, but they don't throw up a flag if the ADS named "zoneID" is greater than 28 bytes.

Harlan

 
Posted : 10/01/2006 7:26 am
arashiryu
(@arashiryu)
Posts: 122
Estimable Member
 

An example from the lab was recovering a corrupted file (not a deleted one). From what I recall seeing, corrupt files can be put back together without knowing the name, attributes, or even the file size, just by following the chain. According to my notes, you start with the first addressable cluster and follow it to completion, keeping a tally of each cluster crossed and saving the data extracted with some kind of annotation.
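
In rough terms, that manual walk might look like the following minimal sketch (assuming an intact FAT16 boot sector and FAT; the image path and starting cluster are hypothetical inputs):

    import struct

    def read_chain(image_path, first_cluster):
        """Follow a FAT16 cluster chain and return the concatenated data."""
        with open(image_path, "rb") as f:
            boot = f.read(512)
            # Standard FAT16 BIOS Parameter Block fields.
            bytes_per_sector, = struct.unpack_from("<H", boot, 11)
            sectors_per_cluster = boot[13]
            reserved_sectors, = struct.unpack_from("<H", boot, 14)
            num_fats = boot[16]
            root_entries, = struct.unpack_from("<H", boot, 17)
            fat_sectors, = struct.unpack_from("<H", boot, 22)

            cluster_size = sectors_per_cluster * bytes_per_sector
            fat_offset = reserved_sectors * bytes_per_sector
            root_offset = fat_offset + num_fats * fat_sectors * bytes_per_sector
            data_offset = root_offset + root_entries * 32  # 32-byte dir entries

            f.seek(fat_offset)
            fat = f.read(fat_sectors * bytes_per_sector)

            data = bytearray()
            cluster = first_cluster
            # 0xFFF7 marks a bad cluster; 0xFFF8 and above mark end of chain.
            while 2 <= cluster < 0xFFF7:
                print("cluster", cluster)  # the running tally of clusters crossed
                f.seek(data_offset + (cluster - 2) * cluster_size)  # data area starts at cluster 2
                data += f.read(cluster_size)
                cluster, = struct.unpack_from("<H", fat, cluster * 2)
            return bytes(data)
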
Upon returning from the conference, I tried to reproduce the results in my lab on a hard drive with a FAT file system that holds some files I have always wanted to recover, which are definitely corrupted. I had already tried using FTK to see if I could recover the files, but no such luck. FTK did recover the deleted files, but not the corrupted ones that I was sure were present.

Armed with this limited knowledge, I opened the disk in WinHex. I was quickly humbled. I could not locate a corrupted file in WinHex, for lack of the depth of file system knowledge this exercise requires. Got some light reading to do. File System Forensic Analysis by Brian Carrier sounds like a good start.

Although I did not grasp the complete presentation, I do see the value of the 'under the hood' knowledge to be gained. IMO, 'under the hood' knowledge is absolutely necessary if I am to explain to a client how a corrupt file of evidentiary value was recovered manually when it was missed by a point-and-click GUI tool.

Please pardon the lengthy post.

PS: I am trying to gain permission to share the presentation with the forum. It is a good reference: a 121-page PDF.

 
Posted : 10/01/2006 9:48 am
(@gmarshall139)
Posts: 378
Reputable Member
 

I think Harlan hit the nail on the head with the term "force multiplier". It's really a fallacy to say that an automated tool is flawed because it allows an unqualified examiner to get some results. The flaw is with the examiner.

 
Posted : 10/01/2006 6:30 pm