Again, I don't see how the translation is being done incorrectly. You've said that granularity to the nearest second is fine.
If you need granularity to the nearest 100-nanosecond interval, I can't disagree with that…that's your requirement, and it's fine. But for me, granularity to the nearest second is sufficient, so the translation isn't incorrect.
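For context on where the 100-nanosecond figure comes from: Windows FILETIME values count 100-nanosecond intervals since 1601-01-01 UTC, so translating one onto a Unix-epoch timeline inherently rounds to the second. A minimal sketch of that translation (the function name is mine, not from any tool mentioned in this thread):

```python
# Seconds between the Windows FILETIME epoch (1601-01-01)
# and the Unix epoch (1970-01-01).
EPOCH_DIFF_SECONDS = 11644473600

def filetime_to_unix(filetime):
    """Convert a 64-bit FILETIME (100-ns intervals since 1601)
    to Unix seconds, discarding sub-second precision."""
    return filetime // 10_000_000 - EPOCH_DIFF_SECONDS
```

The integer division is exactly the granularity trade-off being discussed: the sub-second portion of the FILETIME is thrown away.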
keydet89, interesting thread and enjoyed the article at your blog.
keydet89 wrote
"once you have 30+ systems in a DB, you're good to go…right?"
Yep, once it's in the DB it's great; it's just the process of mounting all the image files and collecting the data that takes time.
Something interesting has been to use the data from the first sector of a burnt CD/DVD, which often shows the time/date and software used to burn the disc. Adding that data to a timeline of suspect PCs can help identify which PC was likely used to create the disc, along with other information such as verifying that the burning software was installed. It all adds to the overall picture.
Darren,
Agreed. Right now, my scripts output to a five-field format, which should be relatively easy to add to a mysql database.
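Assuming the five fields are the pipe-delimited TLN-style fields (time, source, host, user, description; that's my reading of the script output, not something stated above), loading them into a database is a short script. This sketch uses SQLite for brevity rather than MySQL, but the SQL is essentially the same:

```python
import sqlite3

def load_timeline(db_path, tln_lines):
    """Load pipe-delimited five-field records
    (time|source|host|user|description) into a 'timeline' table."""
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS timeline (
                        time        INTEGER,
                        source      TEXT,
                        host        TEXT,
                        user        TEXT,
                        description TEXT)""")
    rows = (line.rstrip("\n").split("|", 4)
            for line in tln_lines if line.strip())
    conn.executemany("INSERT INTO timeline VALUES (?, ?, ?, ?, ?)", rows)
    conn.commit()
    return conn
```

With the 30+ systems in one table, sorting the merged timeline is just `SELECT * FROM timeline ORDER BY time`.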
Heh, you could use Oracle as a database, and become a competitor for FTK 😉
Seriously, though, I think automated timelines can work as long as there's an analyst using them, not as an end-all solution for people who can't read timestamps.
Roland
Roland,
Some IM conversations with a buddy yesterday produced interesting ideas for a means of representing the data graphically, but as you've indicated, it would still require an analyst 'behind the wheel'.