Log file analysis: tool requirements

vito (@vito), Topic starter

Hi everybody,

I am currently working on my master's thesis about forensic analysis of log files. As I am developing a software tool and a methodology to support the process of log analysis, I would like to ask you for some input to keep it practically relevant.

The scenario: a major security incident occurs and log files from one or multiple sources have to be analyzed (access logs, firewall, whatever). You have a huge amount of log data, and attack traces are buried deep inside "background noise", i.e. regular and irrelevant log entries, maybe scattered over several log files.

My questions are:

  • What requirements do you have for such a tool in general?
  • What kind of mechanisms do you need to reduce the amount of data to a manageable level (e.g. filters)?
  • What kind of mechanisms do you need to follow attack traces and to reconstruct attacks (e.g. highlighting)?
  • What do you need to correlate and combine log entries (i.e. information) from multiple log files? (see the sketch after this list)
  • What requirements do you have for filter mechanisms (e.g. regex)?
  • How flexible does the support for log formats need to be? "Completely flexible", i.e. letting the analyst define their own formats, or is supporting 3 or 4 "common" formats enough?
  • What kind of mechanisms do you need to comment and document findings and to prepare a report?

I would appreciate any help and any input/comments. Thank you in advance!

best regards


   
vito (@vito), Topic starter

Still no feedback? :(

best regards


   
jaclaz (@jaclaz)

vito wrote:
Still no feedback? :(
best regards

Maybe your questions are not entirely clear, or too "wide".

Maybe you could expand on things like the intended supported OS (both for running the tool and as the "source" of the logs), what counts as a "security incident", the type of attack, etc., and also define "huge amount of data".

I would say that a filter is definitely useful, with a number of selectable "IF" conditions, but I am not sure whether you are asking for actual search/filter terms as needed/used in the field, or just about the "plain" usefulness of such a filter.

jaclaz


   
imk54831 (@imk54831)

Vito,

If you haven't already, you may wish to check out the following papers I came across in my malware forensics research:

Methodology and Tools of IS Audit and Computer Forensics - The Common Denominator (2009), Szezynska et al.

Error, Uncertainty, and Loss in Digital Evidence (2002), Casey

These are just a couple of the many examples in this area. Resources such as Scopus, Google Scholar, ScienceDirect and EThOS (for MSc and PhD dissertations) should provide plenty of others.

Regards

Ian


   