Thoughts on testing tools

(@Anonymous 6593)
Guest
Joined: 17 years ago
Posts: 1158
 

Do we have a resource anywhere that encapsulates timestamps and their associated knowledge together in one place? For example, that actions X, Y and Z change timestamps A, B, and C.

They exist, but they seldom provide references to tests or test protocols, which means it's usually difficult to know whether they apply to one particular version of an operating system, or to a range of versions.

You can find, for example, stuff such as this

https://digital-forensics.sans.org/blog/2010/04/12/windows-7-mft-entry-timestamp-properties

but … unfortunately it raises more questions than it answers. On the other hand, a blog entry should probably not be used as a source for this kind of knowledge.

A fairly good paper is Chow et al., "The Rules of Time on NTFS File System" (published in the proceedings of SADFE 2007). (Also available on the net.)

It's limited in that it refers to GUI actions (Windows Shell) only, and it is fairly old by now (Windows XP!), but it at least describes the test methodology, software versions etc. in decent detail.

(Added: you can usually find contradicting information if you look carefully. For example, the first page cited by Jaclaz says that on Windows the NtfsDisableLastAccessUpdate setting controls timestamp updates. Another reference – one which I would be more apt to believe – says it only controls timestamp updates on directories: https://technet.microsoft.com/en-us/library/cc959914.aspx. But that source refers to Windows 2000 … so things probably changed at some point in time. And it may be relevant to know just when that was.)
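To illustrate the kind of reproducible test such references ought to document, here is a minimal sketch in Python, using `os.stat` for illustration. It is not Windows-specific: note that on Windows `st_ctime` reports creation time, while on POSIX systems it is the metadata-change time, so the same action can change different fields on different systems, which is exactly why the OS and version under test must be recorded.

```python
import os
import tempfile
import time

def snapshot(path):
    """Record the three stat timestamps for a file."""
    st = os.stat(path)
    return {"atime": st.st_atime, "mtime": st.st_mtime, "ctime": st.st_ctime}

# Scratch file to act on.
fd, path = tempfile.mkstemp()
os.write(fd, b"initial contents")
os.close(fd)

before = snapshot(path)
time.sleep(1.1)  # make sure a change is visible at 1-second resolution

with open(path, "ab") as f:  # the action under test: append to the file
    f.write(b" appended")

after = snapshot(path)
changed = sorted(k for k in before if after[k] != before[k])
print("timestamps changed by append:", changed)
os.remove(path)
```

A real protocol would repeat this for each action of interest (create, copy, move, read, rename), on each OS version in scope, and record the results in a table.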


   
(@willbarton)
Active Member
Joined: 9 years ago
Posts: 6
 

Willbarton - I suppose in some cases of lab certification, yes, but now I am thinking almost just in general. Would it be enough to just examine the process and trust the tool?

Apologies for the late reply. I believe so, yes. It seems that the main area of focus will be the risks and limitations of the approach used to retrieve the evidence, hence why whether the user is competent and has had sufficient training will play a big part. As long as we recognise what the tool can and cannot do, the overall process the tool performs is effectively out of our control; but provided we know why the tool didn't extract (for example) a particular file type (like emails on iOS), that will aid our competence.


   
(@tootypeg)
Estimable Member
Joined: 18 years ago
Posts: 173
Topic starter  

Hi Will,

But is the issue not this - "As long as we recognise what the tool can and cannot do"?

I think what a tool says it does might not always be what it is doing if we have no transparent tool testing. Is testing which focuses on procedure sufficient? I'm trying to think of a good example and this is the best I can do: using an X-ray machine to look for broken bones. The procedure could be vetted (place limb in position, change settings and press go, etc.)…but if we haven't thoroughly tested that it can pick up all breaks in bone, or indeed all bones, then it's flawed?

Are we not in that position with the likes of carving? Prep the media, identify the headers/footers of target files and press go - the procedure is sound. But if the algorithm is flawed, then we have an issue? Is this not what insufficient testing has now created in this area?
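To make the carving point concrete, here is a toy header/footer carver. The JPEG markers are real (SOI = FF D8 FF, EOI = FF D9), but the "disk image" is synthetic and the logic is deliberately naive: a real carver must handle fragmentation, embedded thumbnails, false-positive markers and so on, and those are exactly the algorithmic subtleties that only testing against known data can validate.

```python
SOI = b"\xff\xd8\xff"  # JPEG start-of-image marker
EOI = b"\xff\xd9"      # JPEG end-of-image marker

def carve_jpegs(data: bytes):
    """Return (offset, blob) pairs for each naive header/footer match."""
    results = []
    pos = 0
    while True:
        start = data.find(SOI, pos)
        if start == -1:
            break
        end = data.find(EOI, start + len(SOI))
        if end == -1:
            break
        results.append((start, data[start:end + len(EOI)]))
        pos = end + len(EOI)
    return results

# Synthetic "disk": random filler around two fake JPEG-like blobs.
blob = SOI + b"\x00" * 16 + EOI
disk = b"junk" * 10 + blob + b"more junk" + blob
found = carve_jpegs(disk)
print(len(found), "candidates at offsets", [off for off, _ in found])
```

A fragmented file, or a file containing an embedded thumbnail (a second SOI/EOI pair inside the first), would already defeat this logic, which is the kind of limitation a tested tool should document.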


   
(@rich2005)
Honorable Member
Joined: 19 years ago
Posts: 541
 

Prof Sommer has added his 2p on this subject
https://www.theregister.co.uk/2017/06/08/digital_forensics_standards_push/

Obviously I agree.


   
minime2k9
(@minime2k9)
Honorable Member
Joined: 14 years ago
Posts: 481
 

Using an X-ray machine to look for broken bones. The procedure could be vetted (place limb in position, change settings and press go, etc.)…but if we haven't thoroughly tested that it can pick up all breaks in bone, or indeed all bones, then it's flawed?

And this is why any kind of logic like this is massively flawed. The difference is that bones haven't changed in the last 100 years; our techniques have. An old technique would still identify breaks in bones, though probably not as well as a new one.

With carving and Digital Investigation techniques, old techniques may no longer work for newer devices/data. New filesystems, compressed files, database files and file formats may make the old technique completely obsolete.

As for the above reply referencing Peter Sommer, he is spot on in this case.


   
(@Anonymous 6593)
Guest
Joined: 17 years ago
Posts: 1158
 

With carving and Digital Investigation techniques, old techniques may no longer work for newer devices/data.

How would you know? Are you guessing? Sooth-saying? Prophesying?

As far as I can see, the only way to come to a valid conclusion that 'technique X does not work for device/data Y' is to test X on that and other material, and evaluate the results. If tests show a falling off (in correctly carved files, or whatever) for Y compared with other devices/data, that's what you need to reach the conclusion. And if those are tests that anyone can do and get similar results from, the conclusion would, I think, be considered scientifically sound.
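The evaluation step described above can be sketched very simply: compare what a tool recovered against a known ground-truth corpus and report precision and recall. The hash names below are hypothetical stand-ins for hashes of files in a reference image.

```python
def evaluate(recovered: set, ground_truth: set):
    """Precision/recall of a recovery tool against a reference corpus."""
    true_pos = recovered & ground_truth
    precision = len(true_pos) / len(recovered) if recovered else 0.0
    recall = len(true_pos) / len(ground_truth) if ground_truth else 0.0
    return precision, recall

# Hypothetical reference corpus and a hypothetical tool run over it.
ground_truth = {"hash_a", "hash_b", "hash_c", "hash_d"}
tool_output = {"hash_a", "hash_b", "hash_x"}

p, r = evaluate(tool_output, ground_truth)
print(f"precision={p:.2f} recall={r:.2f}")
```

Run the same measurement for device/data Y and for other devices/data, and a drop in recall for Y is the evidence for the claim; without the numbers, it is just an opinion.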

But you can't claim such soundness without those tests. No guessing or chiromancy or extispicy or augury. Or voices in your head telling you what is truth.

Yes, old techniques may no longer work. But then again, they may.

And 'you may be faced by difficulties and sorrows, yet there may be gold and happiness at the end.' Or you mayn't or there won't.


   
jaclaz
(@jaclaz)
Illustrious Member
Joined: 18 years ago
Posts: 5133
 

With carving and Digital Investigation techniques, old techniques may no longer work for newer devices/data.

How would you know? Are you guessing? Sooth-saying? Prophesying?

Maybe useful?
http://www.dictionary.com/browse/may

1.
(used to express possibility)
It may rain.

jaclaz


   
(@Anonymous 6593)
Guest
Joined: 17 years ago
Posts: 1158
 

Maybe useful?

Don't see how. Possibility is vacuous. Leave it. (Or ponder why you chose to prefer the possibility that it may be useful rather than the possibility that it may *not* be useful. Ignorance? Psychological predilection? Ego? Or did you flip a coin?)

If you have several methods for getting an answer and/or reaching a conclusion, is your choice of method going to be governed by the fact that it may be better? When it also may not be better?

That's no better than flipping a coin. Or rolling dice.

You simply have to evaluate the methods. At least as long as you are not prepared to argue that flipping a coin was the appropriate method. Or any of the forms of divination that I have already mentioned.

Refusing to make an evaluation/test simply removes computer forensics from all claims of being anywhere near a forensic science.

I mean, there's no reason for asking for computer forensic analysis if it all boils down to flipping a coin. Just flip the coin yourself. No expertise required. Even a jury member could do it.


   
kacos
(@kacos)
Trusted Member
Joined: 10 years ago
Posts: 93
 

… contradicting information if you look carefully. For example, the first page cited by Jaclaz says that on Windows the NtfsDisableLastAccessUpdate setting controls timestamp updates. Another reference – one which I would be more apt to believe – says it only controls timestamp updates on directories: https://technet.microsoft.com/en-us/library/cc959914.aspx. But that source refers to Windows 2000 … so things probably changed at some point in time. And it may be relevant to know just when that was.

Actually, things remain the same. NtfsDisableLastAccessUpdate exists up to Win10 with a default of 1 (i.e. Last Access time stamp updates are disabled) - check it yourself from an elevated cmd:

"fsutil behavior query disablelastaccess"

or in the registry at HKLM\SYSTEM\CurrentControlSet\Control\FileSystem

"Disables (1) or enables (0) updates to the Last Access Time stamp on each directory when directories are listed on an NTFS volume."
https://technet.microsoft.com/en-us/library/cc785435(v=ws.11).aspx

"Each file and folder on an NTFS volume includes an attribute called Last Access Time. This attribute shows when the file or folder was last accessed, such as when a user performs a folder listing, adds files to a folder, reads a file, or makes changes to a file. Maintaining this information creates performance overhead for the file system especially in environments where a large number of files and directories are accessed quickly and in a short period of time, for example when using the BizTalk File Adapter. Apart from in highly secure environments, retaining this information might add a burden to a server that can be avoided by updating the following registry key NTFSDisableLastAccessUpdate "
https://msdn.microsoft.com/en-us/library/ee377058(v=bts.10).aspx

And as I understand it, it disables the last access time on the directory when a user, application or system activity lists or reads the files in it. A good example is the %Root%\Windows folder. Another good example is antivirus scanners: if a scan is performed and access times were being updated, then the whole volume would have the same or similar Last Access times.

It's also interesting to note this regarding the Access Time:
"The NTFS file system delays updates to the last access time for a file by up to 1 hour after the last access."
https://msdn.microsoft.com/en-us/library/windows/desktop/ms724290(v=vs.85).aspx

and the explanation for that is summed up nicely here:

"The Last Access Time on disk is not always current because NTFS looks for a one-hour interval before forcing the Last Access Time updates to disk. NTFS also delays writing the Last Access Time to disk when users or programs perform read-only operations on a file or folder, such as listing the folder’s contents or reading (but not changing) a file in the folder. If the Last Access Time is kept current on disk for read operations, all read operations become write operations, which impacts NTFS performance."
https://technet.microsoft.com/en-us/library/cc781134(v=ws.10).aspx


   
kacos
(@kacos)
Trusted Member
Joined: 10 years ago
Posts: 93
 

To get back on the topic of tool testing, there are cases (such as Windows Media Player Database CurrentDatabase_XXX.wmdb files) where there is only ONE tool available, and no documentation whatsoever from either the tool provider or the source (in this case Microsoft). How would you test/verify/double check the results of such tools?


   