Forensic software discussion (commercial and open source/freeware). Strictly no advertising.

A $LogFile parser utility for NTFS

Post Posted: Wed May 01, 2013 9:26 pm

github.com/jschicht/LogFileParser

Features:
Decode and dump $LogFile records.
Decode many attribute changes.
Recreate a dummy $MFT.
Decode all index-related transactions (from IndexRoot/IndexAllocation).
Resolve all datarun list information available in the $LogFile.
Configurable verbose mode (does not affect logging).
Optionally decode $UsnJrnl.
Logs to CSV and imports into an SQLite database with several tables (see the included readme for details on each CSV/table).
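As a rough illustration of the CSV-to-SQLite workflow described above (the column and table names below are invented for the sketch; the real layout is documented in the tool's readme):

```python
import csv
import io
import sqlite3

# Miniature stand-in for the parser's CSV output; the real column and
# table names are documented in the tool's readme and will differ.
sample_csv = """lsn,transaction,filename
1001,CreateAttribute,secret.txt
1002,DeleteAttribute,secret.txt
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logfile (lsn INTEGER, transaction_type TEXT, filename TEXT)")

# Load the CSV rows into the table, mirroring the csv-to-sqlite import step.
reader = csv.DictReader(io.StringIO(sample_csv))
conn.executemany(
    "INSERT INTO logfile VALUES (?, ?, ?)",
    [(int(r["lsn"]), r["transaction"], r["filename"]) for r in reader],
)

# Pull every transaction that touched a given file, in LSN order.
rows = conn.execute(
    "SELECT lsn, transaction_type FROM logfile WHERE filename = ? ORDER BY lsn",
    ("secret.txt",),
).fetchall()
```

Once the data is in SQLite, ad hoc queries like the one above replace grepping through large CSVs.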

Most of the work so far has been on reconstructing datarun information to aid data recovery based entirely on the $LogFile. Though certainly interesting stuff, it has its clear limitations. Documentation is found at the link at the top.
_________________
Joakim Schicht

github.com/jschicht 


Last edited by joakims on Tue Mar 04, 2014 9:41 am; edited 1 time in total

joakims
Senior Member
 
 
  

Re: A $LogFile parser utility for NTFS

Post Posted: Fri May 10, 2013 2:13 pm

Joakim,

Thanks for sharing this. There aren't that many tools (available for free) capable of parsing the $LogFile, so this fills a huge void in the current tooling. The ability to parse the $UsnJrnl file as well is icing on the cake. I have only started to test and learn about the tool. Quick question though: do you have any plans to produce log2timeline CSV formatted output similar to your mft2csv tool? It would be nice for it to be an option for both the $LogFile and $UsnJrnl files, to make it easier to incorporate them into a timeline.

Thanks once again for writing this.


Corey Harrell
"Journey Into Incident Response"
journeyintoir.blogspot.com/  

corey_h
Member
 
 
  

Re: A $LogFile parser utility for NTFS

Post Posted: Fri May 10, 2013 8:42 pm

Good stuff! Thanks for the link.

EricZimmerman
Senior Member
 
 
  

Re: A $LogFile parser utility for NTFS

Post Posted: Sun May 12, 2013 8:57 pm

- corey_h
do you have any plans to produce log2timeline CSV formatted output similar to your mft2csv tool? It would be nice for it to be an option for both the $LogFile and $UsnJrnl files, to make it easier to incorporate them into a timeline.


Hi Corey

Currently I am not sure how best to implement optional log2timeline CSV output. It may happen, but I am not sure yet. Input is welcome.

For instance, I am wondering how a decoded INDX record as found in the $LogFile should be put into that particular format. And some transactions don't have timestamps tied to them (although they can usually be resolved into something roughly close to the truth, excluding the use of $UsnJrnl entirely). Because timestamps are present in $UsnJrnl (and each record has only a small number of variables), it is much easier to implement log2timeline output, or anything really, for that file than for the $LogFile. Then you have challenges like partial information about an attribute change, where all you have is a fraction of the new attribute, without necessarily having information about the original attribute. For instance, in one record there is information that an attribute was changed, and all you have is the file reference number at the time of the transaction (which may have been overwritten since then), plus two fully decoded $STANDARD_INFORMATION fields/values and one partial one that could not be resolved (it actually can under certain circumstances, but not at all easily). So, how would that fit in?
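As a side note on why $UsnJrnl is the easier case: its record layout is small and fixed. Here is a minimal sketch of decoding a USN_RECORD_V2 (field offsets follow Microsoft's published structure; the synthetic record built at the bottom is fabricated purely for illustration):

```python
import struct
from datetime import datetime, timedelta, timezone

def parse_usn_record_v2(buf):
    """Decode one USN_RECORD_V2 (layout per Microsoft's documentation)."""
    (rec_len, major, minor, file_ref, parent_ref, usn, timestamp,
     reason, source_info, security_id, attrs,
     name_len, name_off) = struct.unpack_from("<IHHQQQQIIIIHH", buf, 0)
    name = buf[name_off:name_off + name_len].decode("utf-16-le")
    # FILETIME: 100-nanosecond intervals since 1601-01-01 UTC.
    when = datetime(1601, 1, 1, tzinfo=timezone.utc) + \
        timedelta(microseconds=timestamp // 10)
    return {"usn": usn, "timestamp": when, "reason": reason, "filename": name}

# Build a synthetic record for "a.txt" with reason USN_REASON_FILE_CREATE (0x100).
name = "a.txt".encode("utf-16-le")
hdr = struct.pack("<IHHQQQQIIIIHH",
                  60 + len(name),        # RecordLength
                  2, 0,                  # MajorVersion, MinorVersion
                  5, 1,                  # File and parent reference numbers
                  4096,                  # Usn
                  131000000000000000,    # FILETIME (early 2016)
                  0x100, 0, 0, 0x20,     # Reason, SourceInfo, SecurityId, Attrs
                  len(name), 60)         # FileNameLength, FileNameOffset
parsed = parse_usn_record_v2(hdr + name)
```

Since every $UsnJrnl record carries its own timestamp, reason flags, and filename like this, mapping it to a timeline row is mechanical; $LogFile transactions offer no such guarantee.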

I definitely need to think more about this before going about implementing something like that.

Just updated to new version, with some important bugfixes that previously caused it to crash on some systems.

By the way, the NTFS File Extractor was also updated, now supporting the extraction of files directly from shadow copies (for instance the $MFT and $LogFile). Maybe of interest.
_________________
Joakim Schicht

github.com/jschicht 

joakims
Senior Member
 
 
  

Re: A $LogFile parser utility for NTFS

Post Posted: Wed May 15, 2013 12:44 am

Joakim,

I’m not as familiar as I should be with the $LogFile and how certain transactions make it difficult to present the information in different formats. I didn’t know what was involved when I asked my question, so thanks for explaining the challenges with outputting the $LogFile into the log2timeline format.

- joakims
I am wondering how a decoded INDX record as found in $LogFile should be put into that particular format


This is a tough one because the INDX records are for the item as it was when the transaction was recorded. Some of the timestamps (last accessed and modified) may not be as relevant since they only represent a certain point in time. For example, it may show File X was last modified at 11 while the $MFT shows the file was last modified at 12. I can’t think of any cases where knowing the file was changed at 11 before being changed at 12 would be useful. As such, I’m not sure how valuable it would be to put the INDX records into the l2t format. It might be easier to leave it the way you currently have it and query it when needed.

- joakims
Then you have challenges like partial information about an $attribute change, where all you have is a fraction of the new attribute, without necessarily having information about the original attribute


I’m not sure how feasible this approach would be, but could the focus be on only including certain types of events in l2t format? One option would be to include only the $LogFile transactions that have timestamps tied to them, and only the transactions related to file creations, deletions, and renames. Even if this were the only information put into l2t format, it would still be useful when combined with the data in the $MFT and $UsnJrnl. Here is a partial view showing a file getting created from all three.

journeyintoir.blogspot...-data.html
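To make the suggestion concrete, here is a rough sketch of mapping one such transaction onto the classic 17-column l2t CSV layout (the column list follows the legacy log2timeline CSV format; the helper function and all the values are hypothetical):

```python
import csv
import io

# Legacy 17-column log2timeline CSV header.
L2T_FIELDS = ["date", "time", "timezone", "MACB", "source", "sourcetype",
              "type", "user", "host", "short", "desc", "version",
              "filename", "inode", "notes", "format", "extra"]

def logfile_event_to_l2t(date, time_, macb, desc, filename, inode):
    """Map one decoded $LogFile transaction onto an l2t row.

    Columns the transaction cannot populate are filled with '-',
    as many l2t producers do."""
    row = dict.fromkeys(L2T_FIELDS, "-")
    row.update({"date": date, "time": time_, "timezone": "UTC",
                "MACB": macb, "source": "LOG", "sourcetype": "NTFS $LogFile",
                "type": "Content Modification Time", "desc": desc,
                "filename": filename, "inode": str(inode), "version": "2",
                "format": "logfileparser"})
    return row

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=L2T_FIELDS)
writer.writeheader()
writer.writerow(logfile_event_to_l2t("05/15/2013", "12:44:00", "M..B",
                                     "File created: secret.txt",
                                     "secret.txt", 1234))
```

Restricting the exporter to timestamped create/delete/rename transactions, as suggested above, would mean every emitted row has real values in the date, time, and MACB columns.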

- joakims
So, how would that fit in?


If my previous suggestion isn’t an option then I’m not sure about this one. All I can say is that now I’m even more impressed with tool developers who need to tackle these issues and present the info in a way people can understand.

- joakims
NTFS File Extractor was also updated


Thanks for the pointer. It’s on my list to test out tomorrow. I’m going to mention your tools in my next post since it’s a linkz for tools post.

Corey  

corey_h
Member
 
 
  

Re: A $LogFile parser utility for NTFS

Post Posted: Wed May 15, 2013 9:24 pm

I will have to think a little more about how best to output sensible information from the $LogFile to different formats. A good link, by the way. Thanks.

All suggestions are welcome.

Created a standalone $UsnJrnl parser, UsnJrnl2Csv, out of the code already present in the $LogFile parser. Will add the different formats to that one too shortly.
_________________
Joakim Schicht

github.com/jschicht 

joakims
Senior Member
 
 
  

Re: A $LogFile parser utility for NTFS

Post Posted: Sun Nov 17, 2013 11:31 pm

The tool has been updated with lots of fixes and other configurable options to make it much more accurate and user-friendly. Imports of mft2csv output must come from the latest mft2csv version (also updated). It is a big step ahead, and the next thing on the todo list may be to implement more analysis of the extensive amount of data present in the output database file.
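As one example of the kind of post-import analysis this enables (the schema below is a toy stand-in; the real table and column names are described in the tool's readme):

```python
import sqlite3

# Toy schema standing in for the parser's SQLite output: one table of
# $LogFile transactions and one of records imported from mft2csv output.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE logfile (lsn INTEGER, mft_ref INTEGER, transaction_type TEXT);
CREATE TABLE mft (mft_ref INTEGER, filename TEXT);
INSERT INTO logfile VALUES (1001, 5, 'DeleteAttribute');
INSERT INTO mft VALUES (5, 'secret.txt');
""")

# Resolve $LogFile transactions to filenames via the imported $MFT data.
rows = conn.execute("""
    SELECT l.lsn, l.transaction_type, m.filename
    FROM logfile l JOIN mft m ON l.mft_ref = m.mft_ref
""").fetchall()
```

Joining on the file reference number like this is what makes combining the two imports useful: the $LogFile side rarely carries the filename itself.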

Suggestions always welcome.

(it's a shame this is still the only such open source tool...)
_________________
Joakim Schicht

github.com/jschicht 

joakims
Senior Member
 
 
