Hello,
I am trying to put together the timeframe for the real push into memory forensics. I know that HBGary started in the 2003/2004 timeframe and Volatility around 2006/2007.
Also, was there a driving force that made people think about what might be located in memory and move away from traditional dead box forensics? Such as disk encryption, active network connections / process list information, etc.?
Just curious what those in the industry think?
Thanks,
Mark,
The first Memory dump tool I became familiar with was George M. Garner Jr's Forensic Acquisition Utility.
Take a look at his sites
http//
http//
and the challenge here
http//
You may want to look at some of the crash dump analysis work that has been done. This may have been a forerunner to the RAM Dump we use today on scene.
Regards,
Chris Currier
In the UK, Nick Furneaux and Jim Gordon gave a joint presentation on Live Forensics to F3 (First Forensic Forum) back in 2006. I believe that was an important moment as far as changing mindsets was concerned.
Thanks Jamie,
Certainly in the UK I think our presentation to F3 (although naturally I stand to be corrected) was one of the first mainstream presentations on Live Forensics/RAM collection, and it did indeed generate a lot of discussion. Nick and I demo'd using Helix and, at the time, just used what are now very basic commands, such as running strings on a dump file to create a custom dictionary for password cracking, and data carving using foremost.
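(For anyone trying to reconstruct that sort of demo, the workflow can be sketched in a few shell commands. The file names and the toy "dump" below are purely illustrative, not from the original presentation:)

```shell
# Toy stand-in for a real memory image (illustrative only).
printf 'password123\000secret\000hunter2\000password123\000' > memdump.raw

# Extract printable ASCII strings (min length 6) and de-duplicate
# them into a custom dictionary for password cracking.
strings -n 6 memdump.raw | sort -u > custom_wordlist.txt

# UTF-16LE strings often hide Windows credentials too; append any found.
strings -n 6 -e l memdump.raw | sort -u >> custom_wordlist.txt

# Data carving with foremost follows the same one-liner pattern:
#   foremost -i memdump.raw -o carved_output/

cat custom_wordlist.txt
```

Feed the resulting wordlist to whatever password cracker you favour; the point is just that even these crude passes over a dump recover material a dead-box image never would.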
I know that Dan Haagman of 7Safe was someone who inspired me in this area. He gave a presentation on 'what we are missing if we pull the plug', or something similarly titled, which predated the presentation that Nick and I gave. Dan was someone who championed network forensics in the UK and the use of WFT.
As Chris mentions, the first I became aware of Live Forensics (in particular physical memory acquisition and analysis) was the Digital Forensic Research Workshop (DFRWS) challenge of 2005.
Two tools were developed during the challenge: the MemParser analysis tool by Chris Betz and the KntList analysis tool by George M. Garner Jr.
In respect of volatile data collection, I'm not certain how long Monty McDougal's Windows Forensic Toolchest has been around, but I understand he demo'd it at SANS during 2005. The Incident Response Collection Report by John McLeod is probably from about the same era. In one of his earlier books Harlan revealed his own tool for the collection of volatile data, the First Responder Utility.
I would thoroughly recommend Harlan Carvey's Windows Forensic Analysis book, which has three chapters devoted to the collection and analysis of volatile data, together with memory analysis.
Hope that helps a little with the timeline.
Jim
For me it was the work of Harlan Carvey that made me aware of the issue as a whole and that it was an important one to pursue. Later it was the work of people like Rob Lee and Nick Furneaux who made me really start thinking about the possibilities from a non-incident response digital forensics standpoint.
I think it was a few things:
Malware and similar threats became more mature (for lack of a better word), almost a living organism in memory, with process threads, NIC traffic, shell exploits, etc.
And everything being online all the time.
Really, after XP had been out for a while (using the MS barometer, because that is the bulk of most investigations) and was running more stably on the NT core without reboots (breathe wrong on 95, 98, or ME and they'd fall over), on an always-"on" Internet connection, that area of data became as important as the static file system information.
I guess I just look at it as a natural progression that did have an inflection point at some time a few years ago.
I am trying to put together the timeframe for the real push into memory forensics. I know that HBGary started in the 2003/2004 timeframe and Volatility around 2006/2007.
The need for information from memory has been around a lot longer than that…I pushed for it while you and I worked together. Before there was a way to collect and analyze the full contents of memory, many were running batch files and scripts to collect active processes, etc.
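(For context, a minimal sketch of that style of collection, written here as a POSIX shell script. At the time on Windows it would have been a batch file calling tools like pslist and netstat; the directory name and tool choices below are illustrative, not from any particular toolkit:)

```shell
#!/bin/sh
# Snapshot volatile state into a case directory before pulling the plug.
# The output directory name is illustrative only.
OUT="live_response_out"
mkdir -p "$OUT"

date     > "$OUT/collection_time.txt"   # when the snapshot was taken
uname -a > "$OUT/system_info.txt"       # OS and kernel details
ps -ef   > "$OUT/process_list.txt"      # running processes
# netstat -an > "$OUT/net_connections.txt"  # active connections, if available

ls "$OUT"
```

Crude, but it captures exactly the process-list and connection information that was otherwise lost the moment the plug was pulled.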
Also, was there a driving force that made people think about what might be located in memory and move away from traditional dead box forensics? Such as disk encryption, active network connections / process list information, etc.?
All of that, and more. Obfuscated/encrypted PE files on disk that were "opened" in memory in order to run them. Also, over time, organizations are learning more and more to ask the right questions as a result of an intrusion or data breach, and network connections need to be correlated with network information and device logs.
Thanks everyone, that is just what I was looking for.
Hi, sorry I hadn't given any input, bit of a family drama this week.
Mwade, it would be very interesting to understand your motivation for pushing RAM forensics. Although the technology and tools are improving (no one has mentioned Volatility yet, I don't think), one of the primary problems I come up against is the officers at the 'coal face', who simply can't get the powers-that-be to include acquisition of RAM in their search and seizure processes.
If you think there may be encryption in use, Firefox running from a USB key, Deep Freeze in an Internet cafe, etc., RAM acquisition becomes very important, and quite simply, kicking the door in at 4 am when the machine is turned off just doesn't cut it.
So often I have officers on my RAM course who say 'this is brilliant, but I will never see a RAM dump'. One of the issues in the UK is that the HiTech chaps rarely get to leave the office, leaving them looking very similar to 1800s pit ponies, blinking when they see sunlight. Conversely, the chaps who do the actual door kicking rarely have the right size hands for plugging in a USB drive and pressing a button, typing with your knuckles not being too effective.
I'm afraid I don't know if you are private or Law Enforcement, but I would be very interested to know why and how you want RAM analysis to form part of your processes.
Anything I can do to help technically please let me know.
(No disrespect meant to the door-kickers, if I tried it my ankle would snap like a twig!)
Cheers
Nick Furneaux
Nick,
In that case, I suggest the examiners offer training to the officers bringing the computers into their labs. Develop and offer a Best Practices digital evidence collection course and a guide with resources and contact numbers, and send out flyers to the agencies in their jurisdiction. Work with the High Tech Crime investigators to reach out to specialized units such as fraud, drug (cell phones!), and Crimes Against Children units. It is a win-win for everyone. Develop that local Cyber/Computer Crime network.
The examiners and officers get to know each other in a different environment. The folks going on the searches get to know the examiner and know how the evidence should be collected and packaged, from emails and websites to cell phones and computers. Hopefully this way the search warrant team has someone trained to collect digital evidence. Have quarterly meetings sponsored by different agencies to put together brief training and information sharing sessions. You may have already suggested this, as I know I am preaching to the choir here.
Regards,
Chris Currier