Performing live response

keydet89
(@keydet89)
Famed Member
Joined: 21 years ago
Posts: 3568
Topic starter  

> I'm unsure, however, if it would produce different results if you capture
> memory associated with a specific process.

You mentioned my book, and the answer to your question is trivial to validate based on my first book. Follow these steps:

1. Open Notepad and type something in. Save the contents to a file, leaving Notepad open.

2. Run pmdump.exe against the process PID, saving the output to a file.

3. Wait x minutes, and repeat, giving the new file a different name.

4. Check the sizes of the files, and then use md5deep to generate hashes of the files. Compare them.
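Step 4 can be sketched as a short script. This is a minimal sketch of the comparison only (it assumes the two pmdump output files already exist; the file names and function names here are hypothetical, and the script stands in for md5deep):

```python
import hashlib
import os

def md5_of(path, chunk_size=1 << 20):
    """Hash a file in chunks so a large process dump never loads fully into RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def compare_dumps(first, second):
    """Compare two process-memory dumps by size and MD5, as in steps 2-4."""
    return {
        "same_size": os.path.getsize(first) == os.path.getsize(second),
        "same_hash": md5_of(first) == md5_of(second),
    }

# Hypothetical usage, after two runs of pmdump.exe against the same PID:
# compare_dumps("notepad_dump1.bin", "notepad_dump2.bin")
```

If `same_size` is true but `same_hash` is false, the process image stayed the same size while its contents changed between captures.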

Your results may vary. I don't know enough details about pool allocations, etc., to know how specific applications deal with them, but it would stand to reason that if a networking application were left unattended for several minutes, the network connections may time out, altering the process's memory contents.

On a similar note, I recently completed code for reassembling executable binaries from RAM dumps, such as those provided in the DFRWS 2005 Memory Challenge. It turns out that the reassembled image file, though identical in size to the original file, does not run correctly. I still have testing to do to narrow down where the issue is, but I believe, based on what Andreas Schuster has posted, that it may be in the initialized code section of the PE file itself.

If this is the case, then it's clear that once the program is run, those values will change…as will the contents of process memory. Still, testing is required to determine what stimulus, if any, is required to elicit a change in memory contents.
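One way to narrow the issue down is to find where the two same-size images first diverge, then map that offset against the PE section headers to see whether it falls in an initialized-data section. A minimal sketch (the function name is hypothetical; it assumes both files are the same size, as described above):

```python
def first_difference(path_a, path_b, chunk_size=1 << 16):
    """Return the byte offset of the first difference between two
    equal-sized image files, or None if they are identical.
    The offset can then be compared against the PE section table
    to identify which section changed."""
    offset = 0
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        while True:
            ca, cb = fa.read(chunk_size), fb.read(chunk_size)
            if ca != cb:
                for i, (x, y) in enumerate(zip(ca, cb)):
                    if x != y:
                        return offset + i
                # Chunks matched over their common prefix; one file ended early.
                return offset + min(len(ca), len(cb))
            if not ca:  # both files exhausted with no difference
                return None
            offset += len(ca)
```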

Harlan


   
(@echo6)
Trusted Member
Joined: 21 years ago
Posts: 87
 

> At one time, computer forensics wasn't used in court at all, because it wasn't understood. Then some folks did some hard work, and got it accepted…and now there are aspects of computer forensics that are generally accepted in court proceedings.

In the UK the courts expect adherence to the Good Practice Guide for Computer-Based Electronic Evidence: http://www.acpo.police.uk/asp/policies/Data/gpg_computer_based_evidence_v3.pdf

Principle 1: No action taken by law enforcement agencies or their agents should change data held on a computer or storage media which may subsequently be relied upon in court.

Principle 2: In exceptional circumstances, where a person finds it necessary to access original data held on a computer or on storage media, that person must be competent to do so and be able to give evidence explaining the relevance and the implications of their actions.

You could argue that Principle 2 allows for live system examinations to take place; however, it is vague, and there is little or no guidance on how such an examination should be conducted. If I understand the thread of your argument, you're basically saying we should go ahead and get current methodologies tried and tested in court, so we can move on.

> Today, technology (larger storage capacity, drive encryption, etc.) is changing the face of computer forensics, and the age of "Nintendo" forensics is gone. More and more, the old methodology is becoming more of a hindrance. As responders, we have to adapt. If we all wait around for the acquisition of RAM as evidence to become commonplace in the courtroom, without doing anything, nothing will get done.

Agreed, and as always the courts are slow to catch up. You could argue that by using some of the available methods there is a risk that the courts will not accept them and important cases may be lost. Prior to it getting to that stage, what can be done to minimise those risks?

> …unallocated space. One has to look at the totality of the evidence.

Good point, well made.

> Consider this a "call to arms". The age of "Nintendo" forensics is in its twilight. Yes, we'll still use the techniques, but they will no longer be the totality of what we do.

Agreed, I look forward to your next book :-)


   
keydet89
(@keydet89)
Famed Member
Joined: 21 years ago
Posts: 3568
Topic starter  

> If I understand the thread of your argument you're basically saying we
> should go ahead…

Exactly! What's the other option? Paralysis.

The forensics community at large can sit there and wait for something to happen, or we can start trying things. Sit down, develop a methodology, and if it gets denied, find out why…and make the necessary corrections.

>…as always the courts are slow to catch up.

Are they? How many courts have seen cases with RAM dumps being analyzed as evidence? If you can show me a significant number of cases going to court with RAM dumps containing evidence and being denied, I'll agree with you. However, if no cases are going forward like this, then are the courts slow? Or are the prosecutors (and in some situations, defence) the ones being slow? Or the cops?

> You could argue that by using some of the available methods there is a
> risk that the courts will not accept them and important cases may
> be lost.

And? The same was true with fingerprint evidence, DNA, and the first cases involving computer forensics. Unfortunately, you're looking at this (and please excuse me, I don't mean to be offensive) like a techie nerd who never comes out of his closet. Of course there's the risk that the evidence won't be accepted, but you won't know until you try. And if the evidence from a RAM dump doesn't get accepted, then you've got the other evidence (remember…"totality"). No prosecutor is going to rest his entire case on a single bit of evidence…it's a single point of failure.

As far as important cases go…well, again, the prosecutor is not going to rest his entire case on just one piece of evidence, particularly an "important" case. However, what would be wrong with having several pieces of evidence that all support each other, one of them being the results of RAM dump analysis? That way, if it gets accepted…we've got our case law (or the beginning of it). If it doesn't…we still have our other evidence AND we know *why* it wasn't accepted.

> Prior to it getting to that stage what can be done to minimise those risks?

The same thing I've been saying all along…process, documentation, methodology. The analyst should obtain the knowledge to know, understand, and weigh the risks. If it's determined that a RAM dump is necessary, the options at that point are limited. Justify the decision and document it. That's what our forefathers in this business did.

Harlan


   
(@jimmyw)
Trusted Member
Joined: 20 years ago
Posts: 64
 

> 1. Open Notepad and type something in. Save the contents to a file, leaving Notepad open.
>
> 2. Run pmdump.exe against the process PID, saving the output to a file.
>
> 3. Wait x minutes, and repeat, giving the new file a different name.
>
> 4. Check the sizes of the files, and then use md5deep to generate hashes of the files. Compare them.

Thanks again, Harlan. I ran a few such tests. Depending on activity between the first and second run of pmdump, and given about a three-minute interval, the file size was exactly the same from #1 to #2, but the files hashed differently. Generally, no activity between tests resulted in same-size files. Normally, if I were to acquire such memory in the field, there would be minimal user activity before I ran my tool.

I noted the same results, regardless of whether I saved the file before running my tests. Also, text that I entered into Notepad did not show up in the dump, but I didn't type very much. I would have thought that some of the text would have appeared. I was also somewhat surprised to see that the memory dumps for a tiny text file were consistently over 25MB in size. It may be interesting to go back and run a comparison of the dumps.
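A comparison of the dumps could start with something as simple as measuring what fraction of byte positions differ between the two captures. A minimal sketch, assuming the two dumps are the same size, as they were in the tests above (the function name is hypothetical):

```python
def changed_byte_ratio(path_a, path_b, chunk_size=1 << 20):
    """Rough measure of how much of two same-size memory dumps differs:
    the fraction of byte positions whose values don't match."""
    total = diff = 0
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        while True:
            ca, cb = fa.read(chunk_size), fb.read(chunk_size)
            if not ca and not cb:
                break
            # Count mismatched positions over the common length...
            diff += sum(1 for x, y in zip(ca, cb) if x != y)
            # ...and treat any size mismatch as differing bytes.
            diff += abs(len(ca) - len(cb))
            total += max(len(ca), len(cb))
    return diff / total if total else 0.0
```

A ratio near zero would suggest only a few pool allocations or counters changed between captures; a large ratio would point to wholesale changes in the process's memory contents.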


   
(@echo6)
Trusted Member
Joined: 21 years ago
Posts: 87
 

> Unfortunately, you're looking at this (and please excuse me, I don't mean to be offensive) like a techie nerd who never comes out of his closet.

LOL…no offence taken :-)

> If it doesn't…we still have our other evidence AND we know *why* it wasn't accepted.

Perhaps I'm being overcautious, but I would rather do my best to ensure the evidence was accepted in the first place, which again comes down to all your very valid points.


   