I personally would prefer that there were no more applications, pieces of software, suites, etc. for people with "minimal skillsets"
There are way too many people who call themselves examiners who fall back on a script to parse out all the information for them, not knowing anything about the results, how the results were obtained (other than pressing Enter), or even how to interpret the results.
While I'm pretty sure these people will weed themselves out within a few years, there is no need to have the industry blanketed with point and click examiners.
Would you mind sharing what you do?
Being in LE, my cases are generally single desktops and centered on CP. Along with what has already been mentioned, I grab the user accounts key and last logon times. I also look at the last MAC times in an attempt to establish that the computer hasn't been used after the time of seizure. In most cases I've also found the qStringIndex key from the PSSP to be helpful.
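As a rough illustration of pulling that sort of account data out of an exported hive, here's a minimal sketch assuming the third-party python-registry module (pip install python-registry); the hive path comes from the command line, and the key last-write time is what's commonly read as the account creation time:

```python
# Minimal sketch: enumerate the user-name subkeys in an exported SAM hive and
# print each key's last-write time (commonly treated as the account creation time).
# Assumes the third-party python-registry module: pip install python-registry
import sys
from Registry import Registry

def list_accounts(sam_path):
    reg = Registry.Registry(sam_path)
    names = reg.open("SAM\\Domains\\Account\\Users\\Names")
    for sub in names.subkeys():
        # subkey name = account name; timestamp() = key last-write time
        print("%-20s %s" % (sub.name(), sub.timestamp()))

if __name__ == "__main__":
    list_accounts(sys.argv[1])
```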
> I personally would prefer that there were no more applications, pieces of software, suites, etc. for people with "minimal skillsets"
I don't think it would be a problem in this particular case. I think much of what Harlan is looking at is already done manually; an automated process would benefit those examiners.
However, I do agree and think that examiners should be able to explain what's going on underneath the automated process. Point and click tools are excellent if you can also explain how to do the same thing manually.
> …examiners should be able to explain what's going on underneath the automated process.
This is the point of the training/workshops that I'll be providing starting next week.
Harlan, any chance you'll be able to audio/video tape that? Some of us won't be there, but will be in spirit.
What I have seen is a backward process for a lot of "new" examiners. They are turned on by the idea of CF, so they get a software package, take the vendor training, and go to work. Then, when it's time to write a report or testify, they try to go back and learn the fundamentals of what the software did so they can explain it. It's hard to learn Hardware, Software, Networking, Security, etc. this way.
Every case is different, but my basic methodology is the same. I look at the case: what do we THINK happened, what evidence are we trying to locate, where is that evidence most likely to be found, how best to get it without destroying its usefulness, what tools do I need, do I need knowledge or a method I don't have, and what will constitute PROOF when I'm done. The last leads to a rash of gotta-dos up front like others have mentioned: the date, date settings, users, net connections, etc. of the suspect machine. And of course, 90% of the time the methodology changes and adapts as the case progresses and new events or evidence are found that lead down a different path.
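For those up-front gotta-dos, even something this simple (a sketch only, assuming a live Windows box and the standard built-in commands) documents the date/time settings, user accounts, and network connections before anything else is touched:

```python
# Sketch: capture date/time, user accounts, and network connections on a live
# Windows system to a text file before doing anything else. Built-in commands only.
import datetime
import subprocess

COMMANDS = [
    ["cmd", "/c", "date", "/t"],
    ["cmd", "/c", "time", "/t"],
    ["net", "user"],
    ["netstat", "-ano"],
    ["ipconfig", "/all"],
]

with open("initial_state.txt", "w") as out:
    out.write("Collected (examiner clock): %s\n\n" % datetime.datetime.now())
    for cmd in COMMANDS:
        out.write("=== %s ===\n" % " ".join(cmd))
        result = subprocess.run(cmd, capture_output=True, text=True)
        out.write(result.stdout + "\n")
```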
So what would you think of an application that extracted this sort of information automatically?
One word: plugins.
Followed by several more words:
See, if the software can't evolve with the technology, you're writing yourself into obsolescence. If you release regular updates, that's very cool, but one day you'll give up, get lost, or get squashed. And wouldn't it be nice to have a whole community sharing code (and therefore techniques) through your software? Just like Firefox!
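To make that concrete, a plugin framework doesn't have to be elaborate. Here's a bare-bones sketch (the plugins/ directory name and the run() convention are made up for illustration): every .py file dropped into the folder that defines run(hive_path) gets discovered and called.

```python
# Bare-bones plugin loader sketch: discover every .py file in ./plugins that
# defines a run(hive_path) function and execute it against the supplied hive.
# The plugins/ directory and run() convention are illustrative only.
import importlib.util
import pathlib
import sys

def load_plugins(plugin_dir="plugins"):
    plugins = []
    for path in sorted(pathlib.Path(plugin_dir).glob("*.py")):
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        if hasattr(module, "run"):
            plugins.append(module)
    return plugins

if __name__ == "__main__":
    hive = sys.argv[1]
    for plugin in load_plugins():
        print("--- %s ---" % plugin.__name__)
        plugin.run(hive)  # each plugin parses and reports its own keys
```

A new check then becomes a small file anyone can write and share, which is where the community part comes in.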
> One word: plugins.
Now that there's some discussion along these lines, let me throw this out there for consideration…
Plugins would require an API, and then someone would have to actually write something in order to keep the tool up-to-date. How many EnCase users know how to write EnScripts? How many Nessus users write their own plugins, or even read the ones that are already available?
I'm thinking that there has to be some kind of happy medium. Having the ability to modify or add to code is a great ideal, but how practical is it really?
Harlan
Plugins aside, one of the things I also wanted to mention about a tool like this is that the author could include explanations and references in the report as to why something is important, what it means, or how it can be correlated. For instance, certain Registry keys and values have unique characteristics or circumstances that will provide context to the report, and hence, to the analyst.
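One way to sketch that (the note and reference below are illustrative placeholders, not any tool's actual content) is to carry the explanation and reference along with each check, so they land in the report automatically:

```python
# Sketch: pair each Registry check with an analyst-facing explanation and a
# reference, so the generated report carries context, not just raw values.
# The note and reference text below are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class RegistryCheck:
    key_path: str     # key to extract
    note: str         # why it matters / how to interpret it
    reference: str    # where the analyst can read more

CHECKS = [
    RegistryCheck(
        key_path="Software\\Microsoft\\Windows\\CurrentVersion\\Run",
        note="Values here launch automatically at logon; a common persistence point.",
        reference="(vendor documentation / analyst's own notes)",
    ),
]

def report(checks):
    for c in checks:
        print(c.key_path)
        print("  Why it matters: %s" % c.note)
        print("  Reference:      %s" % c.reference)

report(CHECKS)
```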
I'll have more to throw at this when I have time… but for now, why not use ProDiscover as the framework for this? It's certainly Perl-extensible.
Some things that others haven't mentioned:
When dealing with incidents, I:
Establish T0 (time of compromise). This is done through correlation and analysis of log files (network and system/application), MAC times, Prefetch files, etc.
Establish TC (time of containment). This is, obviously, when the system is contained.
TC - T0 = Window of Risk (WoR). This is the time the system was at risk of loss of sensitive data. We can now work statistical probabilities to assist with concluding our level of certainty. These statistical probabilities are based on a number of factors (network flows, unexplained flows, files accessed during the WoR, the WoR itself, the profile of the attacker, and a few others).
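The arithmetic itself is trivial; as a sketch with made-up timestamps:

```python
# Window of Risk (WoR) = TC - T0: time between compromise and containment.
# The timestamps below are made-up examples.
from datetime import datetime

t0 = datetime(2007, 3, 1, 2, 15)   # T0: time of compromise (from log/MAC correlation)
tc = datetime(2007, 3, 4, 18, 30)  # TC: time of containment
wor = tc - t0
print("Window of Risk: %s (%.1f hours)" % (wor, wor.total_seconds() / 3600))
```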
Other things I do:
Binary analysis of unknown EXEs and submittal (VirusTotal, Norman, Jotti); see the hashing sketch at the end of this post.
Internet history analysis (yes, I've seen attackers load up Internet Explorer and use it to download programs).
Look at mapped drives and MRU keys to determine if other systems are in play
Check the SAM for new accounts
If a specific account was compromised and used, I'll analyze the .dat file for that account to determine activity.
Interview the users of the system and the administrator.
I grab deleted files, as there are typically files the attacker didn't want us to know about.
And as usual… it depends on the case as to how far I go and what exactly is done.
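On the binary analysis/submittal item above, the usual first step is just hashing the unknown executables so they can be looked up or submitted. A quick sketch (the unknown_exes directory is an illustrative placeholder):

```python
# Sketch: hash every .exe under a directory of extracted unknowns so the
# hashes can be searched for or the files submitted (VirusTotal, Jotti, etc.).
# The "unknown_exes" directory is an illustrative placeholder.
import hashlib
import pathlib

def hash_file(path, algo="sha256", chunk=1024 * 1024):
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

for exe in pathlib.Path("unknown_exes").rglob("*.exe"):
    print("%s  %s  %s" % (hash_file(exe, "md5"), hash_file(exe, "sha256"), exe))
```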