anyone knew more about TD2 e01 Compression Methodology?

9 Posts
6 Users
0 Likes
757 Views
(@chemical349)
Posts: 24
Eminent Member
Topic starter
 

I want to know more about the E01 compression methodology of the Tableau TD2 forensic duplicator.

How is the real-time compression done in the Tableau?

I have no doubt that the TD2 will not remove a single bit of evidence from my suspect disk, but should it seem suspicious if the registry shows an XP machine had been in use for over 2 years on a 750 GB disk, yet after acquisition with a TD2 the compressed E01 file is only 30 GB?

A hex view shows most of the disk is '0'. Is this the only reason the Tableau could compress over 90% of the disk, from 750 GB down to 30 GB?

I can think of 3 possibilities and would like to seek more opinions:

1. The Tableau removed or distorted some of the evidence.
2. The suspect used sterilization software to wipe out most of the data.
3. The compression methodology that Tableau uses is so sophisticated that it is normal for it to compress the evidence files down to 10% of the suspect disk.

Anyone who has been using a Tableau with compression, can you shed some light?

Highly appreciated.

 
Posted : 18/08/2013 11:17 am
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

NOT what you asked for 😯 , but some "generic" considerations.

I don't know which "kind" of systems you are used to imaging, but a "normal" "business PC" rarely has more than 15 or 20 Gb of space actually used.

I mean

  • XP at the most 2 Gb, let's make it 3 with a SP and all KB updates
  • a largish 3 Gb pagefile
  • an office suite 1 Gb
  • a "vertical" program of some kind, even a very large one, say Autocad or Photoshop, another 1 Gb
  • 12 Gb of word, excel and autocad (or whatever) files, including the never cleaned Outlook folders (BTW no one can actually produce this amount of data in 2 years)

make a nice round 20 Gb.

Consider also (check the production date on the hard disk) that the original machine (smaller) hard disk may have failed (losing all data in it) and that the one you imaged may be a more recent replacement.

If you prefer, take a "filled" "home" PC and exclude from it

  • downloaded Linux Distro .iso's
  • downloaded previews .iso's of MS Operating Systems
  • downloaded p0rn (movies and pics)
  • downloaded (pirated) movies
  • downloaded/ripped music
  • the attempts to use (with appalling results) Microsoft Movie Maker to put together a movie of the last vacation and of (you choose) marriage/son's school play/daughter's ballet dance
  • any senselessly downloaded crappy programs (never used if not to realize they are not what the user wants)

and what remains? 😉

jaclaz

 
Posted : 18/08/2013 12:37 pm
(@chemical349)
Posts: 24
Eminent Member
Topic starter
 

Thanks for the reply jaclaz,
your information is very helpful.

We used EnCase 6.0 and the registry to verify that the machine has been using this same disk (unless the user used a cloning tool to clone their old disk onto this new one); the logs show the earliest login dates back to 2011, and all the logins show it is a PC the suspect used on a regular basis.

But then the remaining question is: will a hard disk be all '0' bits unless it has been sterilized? (I have never tried to duplicate a brand new hard disk before, so I don't know.) My forensics teacher told me a hard disk must be sterilized, otherwise it will not be all '0's at any time.

A brand new HDD from the manufacturer will not be sterilized. Is that true? Or can't it be known for sure?
I also want to know more about the compression methodology of the Tableau TD2.

 
Posted : 18/08/2013 2:44 pm
jaclaz
(@jaclaz)
Posts: 5133
Illustrious Member
 

My forensics teacher told me a hard disk must be sterilized, otherwise it will not be all '0's at any time.

A brand new HDD from the manufacturer will not be sterilized. Is that true? Or can't it be known for sure?

Normally a "brand new" hard disk will actually be "all zeroes" or "nearly all zeroes" (there may be some sectors written by some manufacturer tool/quality control process, but it's rare).

The exception may be (actually will be) any of those external USB or firewire disks (the ones that you buy as an external case with a hard disk already in there), as they will normally be partitioned to a single partition (and formatted, often as FAT32 or exFAT) and contain a number of (mostly completely unneeded) "backup tools"/utilities, etc.

If you start writing on a hard disk that is all zeroes, and you do not delete data you have written, and you write to only a fraction of it, a large part of it will remain 00's.
You can still analyze the contents and check for allocated files vs. unallocated areas; if you find really NO unallocated areas containing file fragments (or anyway non-zeroes) it would be logical to suspect that some wiping software has been run.
BUT, merely as an example, if the user had a crash "yesterday", replaced the hard disk with a brand new one and restored the system from a (file-based) backup, that's more or less what you would find (all or nearly all unallocated areas being 00's).

The "standard procedure" is (for a brand new hard disk used to image/clone another hard disk) to wipe it, but that doesn't mean that it is not already "all zeroes" or that the very few non-zero sectors won't be replaced anyway by the partitioning and/or formatting, see
http://www.forensicfocus.com/Forums/viewtopic/p=6559978/#6559978
(and following discussion)

jaclaz

 
Posted : 18/08/2013 5:06 pm
(@chemical349)
Posts: 24
Eminent Member
Topic starter
 

One hundred million thanks for your explanation.
Now I am crystal clear. 🙂

 
Posted : 18/08/2013 7:12 pm
(@hommy0)
Posts: 98
Trusted Member
 

If we return to the original question about the compression applied via the TD2:
applying compression when creating an E01 does not affect the validity of the material that you are acquiring.

When an E01 is produced of a source device, the TD2 can calculate an MD5 hash value of that source media (i.e. the HDD from a suspect computer). This MD5 describes the contents of the source in an uncompressed state, and is then embedded into the E01 itself.

When you load the E01 into an EnCase case, this will automatically begin a verification process - including a recalculation of the MD5 hash value representing the contents of the uncompressed source device.

If you wish to verify this for yourself, might I suggest you produce an E01 of a piece of media: acquire it first uncompressed, then repeat the process and apply compression. In both instances (with and without compression) the MD5 verification hash value will be identical.
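The same property can be sketched in a few lines of Python (a toy illustration using stdlib `zlib`, not the actual E01 format): the acquisition hash is computed over the uncompressed stream, so compressing the stored image and decompressing it at verification time yields the identical MD5.

```python
import hashlib
import zlib

# Simulated source media: mostly zeroes with a little real data,
# a scaled-down stand-in for the mostly-empty 750 GB disk above.
source = b"\x00" * 1_000_000 + b"suspect data" + b"\x00" * 1_000_000

# Acquisition hash is taken over the *uncompressed* source stream.
acquisition_md5 = hashlib.md5(source).hexdigest()

# Store the image compressed (as an E01 with compression would).
compressed_image = zlib.compress(source)

# Verification: decompress and re-hash. The hash matches regardless
# of whether compression was applied to the stored copy.
restored = zlib.decompress(compressed_image)
verification_md5 = hashlib.md5(restored).hexdigest()

assert acquisition_md5 == verification_md5
print(len(source), "->", len(compressed_image))
```

The stored copy is a small fraction of the source size, yet the verification hash is unchanged.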

If you are looking for a good reference regarding the construction of the E01 evidence file, compression, and the verification mechanisms that are in place, you could read the "EnCE Study Guide" written by Steve Bunting.

 
Posted : 21/08/2013 6:13 pm
EricZimmerman
(@ericzimmerman)
Posts: 222
Estimable Member
 

The downside to compression in most tools is that they try to compress everything rather than just compressible data. When that happens, the image isn't stored as efficiently as it could be.
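A quick stdlib demonstration of this point (illustrative only): deflate shrinks zero-filled data dramatically, but applied to incompressible data (e.g. already-compressed media files, simulated here with random bytes) it actually produces slightly *more* output than it was given.

```python
import os
import zlib

compressible = b"\x00" * 65536       # e.g. unwritten disk areas
incompressible = os.urandom(65536)   # e.g. already-compressed files

out_zero = zlib.compress(compressible)
out_rand = zlib.compress(incompressible)

# Zeroes shrink to almost nothing; random data grows slightly
# (deflate falls back to stored blocks plus framing overhead),
# which is why compressing everything indiscriminately wastes effort.
print(len(out_zero), len(out_rand))
```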

 
Posted : 22/08/2013 6:46 am
(@joachimm)
Posts: 181
Estimable Member
 

I want to know more about the E01 compression methodology of the Tableau TD2 forensic duplicator.

As far as the E01 format goes, check
https://googledrive.com/host/0B3fBvzttpiiSMTdoaVExWWNsRjg/

The compression method is just zlib/deflate, one of the more commonly used compression methods.
http://en.wikipedia.org/wiki/DEFLATE

Also see
http://www.forensicswiki.org/wiki/Encase_image_file_format
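To get a feel for why a mostly-zero 750 GB disk plausibly shrinks to 30 GB, here is a quick check with Python's standard `zlib` module (which implements deflate) on a zero-filled buffer:

```python
import zlib

# 16 MiB of zeroes, standing in for the unwritten areas of the disk.
zero_chunk = b"\x00" * (16 * 1024 * 1024)
compressed = zlib.compress(zero_chunk, level=1)

# Deflate reduces long zero runs to a tiny fraction of their size,
# so >90% overall shrinkage on a mostly-empty disk is unremarkable.
ratio = len(compressed) / len(zero_chunk)
print(f"{len(zero_chunk)} -> {len(compressed)} bytes ({ratio:.4%})")
```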

I can think of 3 possibilities and would like to seek more opinions.

Don't rule out hardware or other issues during the acquisition phase; double check. As indicated by others, the E01 is just a container format that can store one or more integrity hashes of the source disk taken during acquisition.

 
Posted : 22/08/2013 9:47 am
(@chad131)
Posts: 63
Trusted Member
 

I don't have a TD2 to test with, but the TD1 uses empty-block compression when imaging to E01. Basically, when it reads in a chunk of data, it evaluates it to see if it is all zeroes. If so, it compresses that chunk/block; if not, it writes it out as is (no compression). Note that the entire chunk/block read must be all zeroes or no compression is used.

You can test this yourself by wiping a disk to all zeroes and imaging, then wipe the disk with another bit pattern (0xFF for example) and image again. The zero wiped disk will compress to almost nothing and the 0xFF disk will have no compression at all.
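The behavior described above can be sketched in Python (the chunk size and return convention here are illustrative assumptions, not the TD1/TD2's actual internals):

```python
import zlib

CHUNK_SIZE = 32 * 1024  # hypothetical block size for illustration

def store_chunk(chunk: bytes) -> tuple[bool, bytes]:
    """Empty-block compression: compress a chunk only if it is
    entirely zeroes; otherwise write it out verbatim."""
    if chunk.count(0) == len(chunk):   # every byte must be 0x00
        return True, zlib.compress(chunk)
    return False, chunk                # mixed content: stored as-is

zero_block = b"\x00" * CHUNK_SIZE
ff_block = b"\xff" * CHUNK_SIZE        # 0xFF-wiped disk from the test above

was_compressed, z_out = store_chunk(zero_block)
was_raw_stored, f_out = store_chunk(ff_block)
# The zero block shrinks dramatically; the 0xFF block is stored
# unchanged, matching the test result described in the post.
print(was_compressed, len(z_out), was_raw_stored, len(f_out))
```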

 
Posted : 23/08/2013 6:53 pm