Thoughts on forensic software development




by Dominik Weber


Working late on a Thursday night in an otherwise pretty empty building, I pause for a moment while the debugger sits at a breakpoint, thinking about the big difference between doing it right and just making it work. Often, this subtle difference is not easily seen by users.

Computer forensics has been a fascinating field to me ever since I started working as a developer on one of the world's leading forensic products in 2001. Forensic-grade software is unlike most other kinds of software. Having worked on embedded crypto software, video games, real-time animation and motion capture makes me very aware of this disparity. In addition to the usual issues of delivering complex applications, there are several unique challenges to contend with. For example, I have to be aware of forensic methodologies such as data acquisition, and of the internals of file systems, disk formats and operating systems. I also have to assume that any data can be corrupt at any point, and therefore may not behave like properly formatted data.
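To illustrate that mindset, here is a minimal Python sketch of defensive parsing against a purely hypothetical record layout; the magic value, field names and offsets are made up for the example and do not come from any real format:

    import struct

    def parse_record(buf: bytes):
        """Defensively parse a hypothetical fixed-layout record.

        Illustrative layout: 4-byte magic b'RCRD', a little-endian uint16
        giving the length of a name field, then the name bytes. Every
        assumption about the input is checked before it is used, because
        evidence data may be corrupt at any point.
        """
        if len(buf) < 6:                       # too short to hold even the header
            return None
        if buf[0:4] != b'RCRD':                # unexpected magic: do not guess
            return None
        (name_len,) = struct.unpack_from('<H', buf, 4)
        if name_len > len(buf) - 6:            # length field points past the buffer
            return None
        try:
            return buf[6:6 + name_len].decode('utf-8')
        except UnicodeDecodeError:             # corrupt text is rejected, not trusted
            return None

Nothing in the buffer is trusted until it has been checked: the header size, the magic, the length field and the text encoding can each be wrong independently, and each failure is handled rather than assumed away.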

Robustness is not the only issue. Memory usage, processing speed, data quantity and data quality are also important. Writing code that fulfills all of these needs takes research, and that research sometimes leads to highly interesting forensic finds like the ObjectIDs on NTFS file systems (I will write about this in an upcoming article). The research and the intricacies of the implementation also need to be documented. Aside from documentation, I work with many other departments: Quality Assurance, Technical Services and fellow application developers. Sometimes I am debugging a crash, updating our bug tracking system, writing a sample script, creating a regression test or making a presentation. Oh, and yes, I do work on the code as well: implementing new features, reviewing, refactoring and occasionally improving some old code!
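As a small illustration of what such a regression test might look like, the following pins down how the hypothetical parser sketched above rejects corrupt input; the module name and the function are assumptions carried over from that example, not part of any shipping product:

    import unittest

    from record_parser import parse_record  # the sketch above, assumed saved as record_parser.py

    class ParseRecordRegressionTest(unittest.TestCase):
        """Pins down how the hypothetical parser treats corrupt input."""

        def test_well_formed_record(self):
            buf = b'RCRD' + (5).to_bytes(2, 'little') + b'hello'
            self.assertEqual(parse_record(buf), 'hello')

        def test_truncated_buffer_is_rejected(self):
            self.assertIsNone(parse_record(b'RC'))

        def test_length_past_end_is_rejected(self):
            buf = b'RCRD' + (999).to_bytes(2, 'little') + b'hi'
            self.assertIsNone(parse_record(buf))

    if __name__ == '__main__':
        unittest.main()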

Furthermore, time permitting, I try to read several forensic message boards. I appreciate the work that forensic examiners do, so I like to answer questions in areas where I consider myself knowledgeable and where I have something useful to offer.

Since I joined the forensic field over eight years ago, I have seen it grow and change significantly. Live forensics, memory analysis, eDiscovery and multi-terabyte hard disks have grown in prominence, along with new operating systems and file systems with seamless encryption. These developments bring to light new facets of forensics. One of the biggest impacts, I think, is reflected in Moore's law. The data explosion driven by increased computing capability is still ongoing, and the biggest problem is how to handle all of the information gathered. The software and its algorithms need to scale well past their original limits. A terabyte of data is becoming the new default for a single computer: users now have a large OS, large caches, Volume Shadow Copies, MP3s, downloaded videos, pictures and games. All of this adds up, and for a thorough low-level forensic analysis, every byte of that terabyte needs to be inspected.

Many investigators no longer have the resources to do this. Automatic classification, indexing and analysis will become essential. At the same time, the number of cases and the overall workload have gone way up. As a developer, one of my goals is to provide software tools that enable this processing.

I see automation and scripting as the central tools, picked on a case-by-case basis from a larger script library. These will use indexing, hash analysis and file signature analysis to reduce the amount of data that has to be processed, stored and ultimately reviewed. And a lot of those tools are still waiting to be developed, improved, tested and debugged!
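As a rough sketch of that idea, the following Python triages files against a hypothetical known-file hash set and a few well-known file signatures; the hash set, the skip/review policy and the function name are illustrative assumptions, not a description of any particular product:

    import hashlib
    from pathlib import Path

    # Both tables are illustrative: a tiny "known file" hash set (real
    # workflows use large reference sets such as the NSRL) and a few
    # well-known magic numbers.
    KNOWN_HASHES = {"d41d8cd98f00b204e9800998ecf8427e"}   # MD5 of an empty file
    SIGNATURES = {
        b"\x89PNG\r\n\x1a\n": "png",
        b"\xff\xd8\xff": "jpeg",
        b"PK\x03\x04": "zip",
    }

    def triage(path: Path):
        """Return (verdict, file_type): skip known files, classify the rest
        by signature so that later stages see far less data."""
        data = path.read_bytes()
        if hashlib.md5(data).hexdigest() in KNOWN_HASHES:
            return "skip", None                 # known file: exclude from review
        for magic, ftype in SIGNATURES.items():
            if data.startswith(magic):
                return "review", ftype
        return "review", "unknown"

The point of a filter like this is that every file excluded by a hash match, or routed by its signature to the right specialized processing, is data that never has to be indexed, stored or put in front of an examiner.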

As I return to debugging, I think to myself: one thing is certain, developing forensic-grade software is not easy...




--

Dominik Weber is a Senior Software Architect for Guidance Software, Inc. He has a Master's degree in Computer Science from the University of Karlsruhe, Germany, and worked for video game companies (Activision) and on computer animation / motion-capture projects (Jay Jay the Jet Plane) before joining Guidance Software in 2001. He can be reached at [email protected]


Guidance Software is recognized worldwide as the industry leader in digital investigative solutions. Its EnCase and Enterprise platforms provide the foundation for government, corporate and law enforcement organizations to conduct thorough, network-enabled, and court-validated computer investigations. Worldwide there are more than 30,000 licensed users and thousands attend its renowned training programs annually.