JAD IEF vs Belkasoft, opinions?

jaclaz
(@jaclaz)
Illustrious Member
Joined: 18 years ago
Posts: 5133
 

"yep images"

Let's add this to the list of acronyms
http://acronyms.thefreedictionary.com/YEP

Stands for?

And yes, I am particularly picky 😯, but I really hate the use of unreferenced acronyms; just as soon as I learned about IIoC, you come out with a new one?
http://www.forensicfocus.com/Forums/viewtopic/t=9693/

Back to topic: the problem (as I see it) with all these kinds of tools/utilities is that, given the same disk image, running a comparison test between two competing tools, which often offer the user several different "options" or "approaches", is not that easy, and while you can say something like

  • tool "a" found 15 items with "option x"
  • tool "a" found 16 items with "option y"
  • tool "b" found 14 items with "option w"
  • tool "b" found 17 items with "option z"

you don't know how many "items" are actually there.
Sure, a third tool "c" may (automatically or manually) find 31 "items", but then one would need to work out which of those 31 were the 15 found by tool "a" with "option x", whether all 15 of them belonged to the 31, etc., etc.
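
To make that comparison concrete, the carved outputs can at least be hashed and intersected, so "which of the 15 are among the 31" becomes a set operation. A minimal sketch (the output directory names and the choice of SHA-256 are my own assumptions, not tied to any tool named here):

```python
import hashlib
from pathlib import Path

def carved_hashes(output_dir):
    """Return SHA-256 hashes of every file a tool carved into output_dir."""
    return {
        hashlib.sha256(f.read_bytes()).hexdigest()
        for f in Path(output_dir).rglob("*")
        if f.is_file()
    }

# Hypothetical output folders, one per tool/option run.
runs = {
    "a/option_x": carved_hashes("out_a_option_x"),
    "c/manual":   carved_hashes("out_c_manual"),
}

union = set().union(*runs.values())
print(f"distinct items across all runs: {len(union)}")
for name, items in runs.items():
    overlap = len(items & runs["c/manual"])
    print(f"{name}: {len(items)} found, {overlap} also found by c/manual")
```

The obvious limitation: this only matches items that two tools carved byte-identically; truncated or padded carves of the same picture will hash differently, which is itself part of the comparison problem.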

Additionally, there is no guarantee whatsoever that the disk image contains a total of 31 "items"; there may be 47 of them and you manually found only 31 (unless you actually "planted" those 31 items from scratch on a wiped disk, but in that case the image would not be "representative" of a "real life disk image").

I presume that all authors/programmers of these tools do this kind of test, which of course represents "good practice" and a "valid testbed", but such tests are unlikely to be exhaustive.
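
For the "planted from scratch" case, at least, the measurement becomes exact: with a known list of planted item hashes, recall and false positives per tool/option are simple set arithmetic. A minimal sketch with made-up hash values:

```python
# Hypothetical hashes of the items planted on a wiped disk (ground truth)
planted = {"hash_a", "hash_b", "hash_c", "hash_d"}
# Hashes of what one tool/option combination carved
found = {"hash_a", "hash_c", "hash_z"}

recall = len(found & planted) / len(planted)
missed = planted - found
false_positives = found - planted
print(f"recall: {recall:.0%}, missed: {len(missed)}, "
      f"false positives: {len(false_positives)}")
```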

jaclaz


   
PaulSanderson
(@paulsanderson)
Honorable Member
Joined: 19 years ago
Posts: 651
 

This is an interesting topic for me: last week I had a small evidence E01, and I was quite amazed at the results using different applications.
It reinforced how important it is, as a practitioner, to understand what is actually happening. Because the case was IIoC, images were obviously of paramount importance.

Now, for years the best carving tool I have personally come across is BLADE (my own opinion), though it's not like IEF. (I do not personally know Craig, BTW.)

The first two results I got:

Results

1. FTK = 0
2. IEF = 0

I then MANUALLY looked at the data, and soon realised that yep images should be there.

3. EnCase = 14
4. C4P = 16
5. BLADE = 32

The point I'm trying to make here is: don't just click away with applications to FIND THE EVIDENCE; do not depend upon applications … Look, work it out, try other methods. If an application gives results that in your mind are not correct, then question this yourself.
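
For what it's worth, the kind of manual check Paul describes can be roughed out with a plain signature scan. A sketch under stated assumptions: it reads a raw/dd image (an E01 would first need something like libewf to expose the raw stream), and a bare FF D8 FF count will over-count (random data, embedded thumbnails), so it is a sanity check on "should images be there at all", not a carver:

```python
# Count JPEG start-of-image (SOI) signatures in a raw disk image.
SIG = b"\xff\xd8\xff"
CHUNK = 1 << 20  # read 1 MiB at a time

count = 0
tail = b""
with open("evidence.dd", "rb") as img:  # hypothetical raw image name
    while True:
        block = img.read(CHUNK)
        if not block:
            break
        data = tail + block
        pos = data.find(SIG)
        while pos != -1:
            count += 1
            pos = data.find(SIG, pos + 1)
        tail = data[-(len(SIG) - 1):]  # carry overlap across chunk boundary

print(f"JPEG SOI signatures found: {count}")
```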

Your point is good, but it is also beneficial to understand what a particular tool does. I am only going to comment on C4P, as I have done more work with that than the others. One of the options in C4P for JPGs is to extract embedded thumbnails; this is turned off by default. Taking your stats at face value, the discrepancy between Blade and C4P may just be that Blade is simply carving any JPGs that it finds (nothing wrong with this approach) whereas C4P (have a look at the EnScript source) *could* be excluding the thumbnails.
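
That thumbnail hypothesis is easy to test against the carved output itself. A hedged sketch (my own check, not C4P's or Blade's actual logic): an embedded EXIF thumbnail shows up as a second SOI marker inside the outer JPEG, so counting nested SOIs per carved file indicates whether one tool's extra hits are just thumbnails the other handled differently:

```python
import sys

SOI = b"\xff\xd8\xff"

def nested_soi_count(path):
    """Count SOI markers after the first one in a carved JPEG.
    A count >= 1 usually means an embedded (e.g. EXIF) thumbnail."""
    with open(path, "rb") as f:
        data = f.read()
    count, pos = 0, data.find(SOI, 1)  # skip the outer SOI at offset 0
    while pos != -1:
        count += 1
        pos = data.find(SOI, pos + 1)
    return count

for name in sys.argv[1:]:
    print(name, nested_soi_count(name))
```

Run over both tools' output folders, this would show quickly whether a gap like Blade's 32 vs C4P's 16 is mostly thumbnails.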


   