I was wondering if anyone could provide some guidance or advice on tools (Windows or Linux) for creating digital forensic data sets for research and testing. Basically, I am trying to create a raw image with a variety of file types and file structures (contiguous, fragmented etc.) for testing file carving tools.
Some examples of similar data sets are:
1) DFRWS 2006 Challenge http//
2) DFRWS 2007 Challenge http//
3) Digital Forensic Tools Testing Images (Basic Data Carving Test #1) http//
I have made a couple of very simple data sets using a hex editor, copying and pasting the contents of files into a new raw image file (originally created using dd and /dev/urandom). Although this method works, editing the image in hex is very time-consuming. I am also worried that the structure of an image created this way will not be representative of "real-world" data.
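For illustration, that copy/paste step can be scripted rather than done by hand in a hex editor. A minimal Python sketch of the same idea, where the image name, file names, and offsets are made-up placeholders rather than anything from the data sets above:

```python
import os

IMAGE = "test.raw"               # hypothetical output image name
IMAGE_SIZE = 64 * 1024 * 1024    # 64 MiB

# Fill the image with pseudo-random bytes, like dd if=/dev/urandom.
with open(IMAGE, "wb") as img:
    img.write(os.urandom(IMAGE_SIZE))

# (source file, byte offset in the image) -- offsets picked by hand so the
# sector locations (ground truth) are known for documenting the data set.
layout = [
    ("sample.jpg", 512 * 100),    # starts at sector 100
    ("sample.pdf", 512 * 5000),   # starts at sector 5000
]

with open(IMAGE, "r+b") as img:
    for path, offset in layout:
        data = open(path, "rb").read()
        img.seek(offset)
        img.write(data)
        print(f"{path}: sectors {offset // 512}.."
              f"{(offset + len(data) - 1) // 512}")
```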
I have also tried creating a FAT partition on a USB device, adding files, imaging the device, and then deleting the File Allocation Table so that only file carving recovery methods can be used (due to the lack of file system metadata). This method also works, but I am unable to control where the file system stores the files, which makes it very difficult to construct test scenarios for file carving tools, such as fragmented files.
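As an aside, the FAT-wiping step can be done against the image file rather than the live device. A minimal Python sketch, assuming a FAT12/16 volume that starts at sector 0 of the image (a partitioned device would need the partition offset added, and FAT32 keeps its FAT size at a different BPB offset, 36, so this is illustrative only):

```python
import struct

IMAGE = "usb.dd"  # hypothetical image of the FAT-formatted device

with open(IMAGE, "r+b") as img:
    boot = img.read(512)
    bytes_per_sector = struct.unpack_from("<H", boot, 11)[0]
    reserved_sectors = struct.unpack_from("<H", boot, 14)[0]
    num_fats         = boot[16]
    sectors_per_fat  = struct.unpack_from("<H", boot, 22)[0]  # FAT12/16 field

    # Overwrite every FAT copy with zeros, leaving file content in place.
    img.seek(reserved_sectors * bytes_per_sector)
    img.write(b"\x00" * (num_fats * sectors_per_fat * bytes_per_sector))
```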
Additionally, all of the data sets listed above provide a sector location value for the files contained within the data set (see http//
Is there a tool similar to a hex editor, but for file systems/partitions, that would work better? Something that provides a disk layout view (like FTK Imager/EnCase) but allows editing and mapping of the internal structure.
Any help would be appreciated!
–TLaurenson
This won't totally solve your problem, but here are some tools to make it quicker.
Use OSFMount (a free app) to make a RAM drive; after dropping all your files into the RAM drive, save it to a disk image from the 'Drive Actions' menu. It should be faster than using a USB device.
We also wrote a tool to fragment files (also free). You can pick the number, size, and distribution of the fragments. See:
http//
It isn't as good as total manual control over each fragment, but it is far faster than doing everything by hand, and it also ensures you don't end up with a corrupted (impossible) file system.
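As a rough sketch of the general idea (not the tool itself): split a file into pieces and scatter them at cluster-aligned offsets in a raw image, recording where each piece went. Note this writes into a bare raw image with no file system, unlike the tool, which fragments within a real one; all names and sizes below are assumptions, and the image must already exist (e.g. created as in the earlier sketch):

```python
import random

CLUSTER = 4096           # assumed cluster size
IMAGE_CLUSTERS = 16384   # 16384 * 4 KiB = 64 MiB image

def fragment_into_image(path, image_path, num_fragments):
    data = open(path, "rb").read()
    step = -(-len(data) // num_fragments)             # ceiling division
    pieces = [data[i:i + step] for i in range(0, len(data), step)]

    free = set(range(IMAGE_CLUSTERS))                 # clusters not yet used
    mapping = []
    with open(image_path, "r+b") as img:
        for piece in pieces:
            need = -(-len(piece) // CLUSTER)          # clusters this piece needs
            # Pick a random run of contiguous free clusters for the piece.
            candidates = [c for c in free
                          if all(c + k in free for k in range(need))]
            start = random.choice(candidates)
            free.difference_update(range(start, start + need))
            img.seek(start * CLUSTER)
            img.write(piece)
            mapping.append((start, need))
    return mapping  # ground truth: (first cluster, cluster count) per fragment

# Example: record the fragment map for the data set's documentation.
for start, count in fragment_into_image("sample.jpg", "test.raw", 4):
    print(f"fragment at cluster {start}, {count} cluster(s)")
```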
It is also worth noting that the operating system manages the file system in clusters, not sectors, so some types of sector-level fragmentation wouldn't naturally occur (clusters are generally larger than sectors).
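To make that concrete, with the common values of 512-byte sectors and eight sectors per cluster (both are volume-dependent assumptions):

```python
BYTES_PER_SECTOR = 512     # typical; check the boot sector of the volume
SECTORS_PER_CLUSTER = 8    # typical 4 KiB clusters; volume-dependent

def sector_to_cluster(sector, data_start=0):
    # Map an absolute sector number to the cluster that contains it
    # (data_start = first sector of the file system's data area).
    return (sector - data_start) // SECTORS_PER_CLUSTER

# Allocation happens on cluster boundaries, so a fragment can only begin
# every SECTORS_PER_CLUSTER sectors:
print(sector_to_cluster(1600))   # cluster 200, a possible fragment start
print(sector_to_cluster(1603))   # still cluster 200: a fragment starting
                                 # mid-cluster wouldn't occur naturally
```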
In addition to the previous suggestion, you can use a small tool (I would call it a "side effect" or "collateral damage") included in MyDefrag to "artificially" fragment files (and know where the fragments are):
http//
http//
or the other tool mentioned there, getfileextents (which only finds the extents):
http//
http//
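On Linux, the same "find the extents" step can be done with filefrag from e2fsprogs. A small sketch that shells out to it and parses the extent list (the tabular format of filefrag -v assumed here can vary between versions):

```python
import re
import subprocess

def extents(path):
    out = subprocess.run(["filefrag", "-v", path],
                         capture_output=True, text=True, check=True).stdout
    result = []
    for line in out.splitlines():
        # Data rows look like: "  0:   0..  255:  34816..  35071:  256:"
        m = re.match(r"\s*\d+:\s*\d+\.\.\s*\d+:"
                     r"\s*(\d+)\.\.\s*(\d+):\s*(\d+):", line)
        if m:
            first, last, length = map(int, m.groups())
            result.append((first, last, length))  # physical blocks
    return result

for first, last, length in extents("sample.jpg"):
    print(f"blocks {first}..{last} ({length} blocks)")
```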
You may also find some of the tools here useful:
http//
jaclaz
The following tool will do what you want for data set creation:
http//