ISO17025 Inter Lab Comparison data-sets (Final year project)  

MarcA
(@marca)
New Member

Hello,

I am a computer forensics student at the University of South Wales in the final year of my degree. I spent the last year on an internship in law enforcement. The lab that I was working in was ISO 17025 accredited.

For my final year project, I am aiming to create data sets that can be used for validation in compliance with ISO 17025. These data sets could be used in an Inter Lab Comparison scheme. As I have only worked in one lab, I only know their requirements and the issues they faced. I would like some input from others as to what elements you think I need to include in these data sets:

• Which devices?

• Which operating systems?

• Which file types?

• Which forensic processes?

• Which file systems?

Thank you in advance for help and advice.
Marc

Posted : 23/09/2019 9:30 am
Rich2005
(@rich2005)
Senior Member

I think this would be a good question for the forensic regulator who implemented the silly standard ;-)

Posted : 23/09/2019 10:32 am
athulin
(@athulin)
Community Legend

As I have only worked in one lab I only know their requirements and issues faced. I would like to have some input from others as to what elements you think I need to include in these data sets

General principle: data for the most common or most important analysis methods/tools. If those methods rest on prerequisites, also data that validates that those prerequisites are fulfilled.

Example: You may have a tool that does something useful with NTFS file data – say, one that examines ADS contents. So primarily, data that contains all kinds of 'files' (I use the term in the NTFS sense, not the everyday sense) with ADS data, including at least one that has the maximum number of ADS that NTFS supports.

But the prerequisites for running that tool are a) that the disk/volume does contain an NTFS file system, and b) that the NTFS file system is not malformed. So additional data sets with 'other' file systems, or 'old' NTFS, and possibly also HPFS (from Windows NT days), as well as more or less malformed NTFS file systems. (I haven't looked into NTFS portability – it might be an idea to provide Microsoft NTFS from the Intel platform, Microsoft NTFS from ARM and other hardware platforms, Linux NTFS from similar 'different' platforms, as well as various release versions.)
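Generating the ADS-bearing test files described above is easily scripted. The sketch below is a minimal, hypothetical generator – the file names, stream names and stream counts are my own illustrations, not from any standard, and actually creating the streams requires an NTFS volume (in practice, Windows). A full validation set would also want a file at NTFS's practical ADS limit:

```python
# Sketch: generate NTFS ADS test files for a validation data set.
# All names and counts here are illustrative assumptions.
import os

def ads_test_cases(max_streams=3):
    """Return (base_file, [stream names]) pairs covering 0..max_streams ADS."""
    cases = []
    for n in range(max_streams + 1):
        base = f"ads_{n}_streams.bin"
        streams = [f"stream{i}" for i in range(n)]
        cases.append((base, streams))
    return cases

def create_on_ntfs(cases, root="."):
    """Actually write the files; requires an NTFS volume (effectively Windows)."""
    for base, streams in cases:
        path = os.path.join(root, base)
        with open(path, "w") as f:
            f.write("unnamed data stream\n")
        for s in streams:
            # the "file:stream" path syntax addresses an alternate data
            # stream on NTFS; it fails on non-NTFS file systems
            with open(f"{path}:{s}", "w") as f:
                f.write(f"contents of {s}\n")

cases = ads_test_cases()
if os.name == "nt":          # only attempt real ADS creation on Windows
    create_on_ntfs(cases)
```

The manifest/creation split means the planned layout can be reviewed and versioned on any platform, with the NTFS-specific step run only where it can succeed.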

Additionally, some analyses or sub-analyses ('find all documents on this volume that have been created on it') may depend on other analyses ('find all word processors on this system, present as well as past') and closely related ones ('find tools that can convert from one word processing format to another, but aren't word processors') … and so on. ('Find all archive files' … can all archive files be identified? Can they be examined correctly, or does it drop 'unusual' data or metadata?)

Your remaining questions could not impossibly be answered by asking the lab you worked with what operating systems/file systems/… are the most common or most important for them to get right. (That list would probably be useful to post here for additional suggestions or ideas.)

The only question I'm not sure can be answered is the one about 'what forensic processes'. I don't think any are sufficiently well established. The best way is probably to suggest a few – that has a better chance of provoking suggestions or criticism, and so may lead to improvements.

I don't do this anymore, but the standard things I used to do before any specific analysis was performed included verification that filesystems were reasonably sound (basically fsck or equivalent), system on/off timeline, with particular attention on nice shutdowns vs brutal power off, users (existing and past), user login/logout timelines, external devices connected/removed, external connections, identifiable software (installed as well as uninstalled), and a kind of scraping of system logs for signs of problems or such (IOC, if you like, or other signs of problems). And usually any standard 'initiate case' on whatever forensic tool platform I was using.
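That kind of standard pre-analysis pass lends itself to being written down as a named checklist, so every case gets the same checks in the same order and a failure in one check doesn't silently skip the rest. A minimal sketch, with the check bodies left as placeholder stubs (the names mirror the list above; none of these are real implementations):

```python
# Sketch: a named pre-analysis checklist runner. The checks are placeholder
# stubs (assumptions); a real run would call fsck, registry/log parsers, etc.
def run_preliminary_checks(image_path, checks):
    """Run each (name, fn) check and collect pass/fail plus notes."""
    results = {}
    for name, fn in checks:
        try:
            results[name] = ("ok", fn(image_path))
        except Exception as exc:   # a failed check must not stop the rest
            results[name] = ("failed", str(exc))
    return results

# Placeholder checks mirroring the list in the post -- all hypothetical stubs.
checks = [
    ("filesystem soundness (fsck-equivalent)", lambda p: "not implemented"),
    ("system on/off timeline",                 lambda p: "not implemented"),
    ("user accounts, past and present",        lambda p: "not implemented"),
    ("external devices connected/removed",     lambda p: "not implemented"),
    ("installed/uninstalled software",         lambda p: "not implemented"),
    ("log scrape for IOCs / anomalies",        lambda p: "not implemented"),
]

report = run_preliminary_checks("evidence.dd", checks)
```

A data set for inter-lab comparison could then simply state the expected result for each named check, and labs compare reports check by check.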

I think it's easier to focus on tools, but … processes are certainly the ultimate goal.

I hope it's obvious that data sets are only part of validation. The validation process itself needs to be defined or … at the very least smorgasborded, so that someone can decide what to keep and what to pass by.

Posted : 23/09/2019 11:41 am
jaclaz
(@jaclaz)
Community Legend

For my final year project, I am aiming to create data sets that can be used for validation in compliance with ISO 17025. These data sets could be used in an Inter Lab Comparison scheme. As I have only worked in one lab I only know their requirements and issues faced.

I personally have serious difficulties in visualizing what is a "data set".

Why don't you post an example "data set" limited to one of the experiences you had (i.e. consisting of one – I presume one out of many – actually-used approach, limited to only the requirements and issues faced in that specific lab during your internship)?

jaclaz

Posted : 23/09/2019 1:18 pm
minime2k9
(@minime2k9)
Active Member

OK, so this is a much bigger issue than can be covered in a one-year project (IMHO), and I would focus on a specific process.
In a proper scenario, all the processes would be mapped out, validated and then covered by an over-arching procedure that links them together.

Let's take recovery of files – focusing on picture/video files, as that is a large percentage of LE usage. For the sake of argument we will limit it to common file systems and the Windows OS only.
Have a look at the NIST testing for carving images; that will give you an idea of the depth I would aim for.
Then the requirements for file recovery should be specified, such as:

• Must recover non-fragmented files from a given file system

• Must recover thumbnail images stored by a given OS

In both of the above, filesystem and OS would be variations for each test.

Then you would need to build on this test set to create tests for thumbnail files created by every OS between, say, Windows 7 and Windows 10 as a representative example, or base it on justification as to when the thumbnail file format changed, etc.

Then further images where pictures are included in:

• Different types of archives

• Disk images (VHD, VMDK, VDI, ISO)

• SQLite databases

• Shadow volumes

• … probably quite a few other things that I have missed here.

I reckon, as a conservative estimate, that you would need somewhere in the region of 100 different image files to properly cover picture and video file recovery.
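That figure is easy to sanity-check combinatorially. Taking illustrative axes from the post above (the exact values on each axis are my assumptions, not an agreed requirement set), the full cross-product already overshoots 100 before any pruning:

```python
# Sketch: size of the test-image matrix for the recovery scenario above.
# Axis values are illustrative assumptions, not a fixed requirement list.
from itertools import product

filesystems  = ["NTFS", "FAT32", "exFAT"]
os_versions  = ["Windows 7", "Windows 8.1", "Windows 10"]
containers   = ["plain volume", "ZIP archive", "VHD", "VMDK", "VDI", "ISO",
                "SQLite database", "shadow copy"]
requirements = ["non-fragmented recovery", "thumbnail recovery"]

matrix = list(product(filesystems, os_versions, containers, requirements))
print(len(matrix))   # 3 * 3 * 8 * 2 = 144 combinations before any pruning
```

In practice one would prune combinations that make no sense (e.g. shadow copies on FAT32), which is how the raw 144 comes back down towards the "region of 100" estimate.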

This is why ISO 17025 is a joke, nobody does proper validation (that I have seen) for the investigation side, but UKAS rubber-stamps it and everybody pats themselves on the back like they have done something useful with their lives.

Ideally there should be a national set of requirements for each process, and then organisations validate against the ones they require. This way company X validating requirements 1-10 would be comparable (though not necessarily as good) as organisation Y validating requirements 1-10.

I can provide further assistance via PM if you want some guidance with some template documents.

My 2 pence/rant.

Posted : 23/09/2019 4:15 pm
jaclaz
(@jaclaz)
Community Legend

This is why ISO 17025 is a joke, …

Well, no, a line must be drawn somewhere: you cannot just put ISO 17025 in the category of jokes.

That is unfair to good jokes; you need to specify that it is a NOT-funny joke. ;-)

jaclaz

Posted : 24/09/2019 11:01 am
minime2k9
(@minime2k9)
Active Member

This is why ISO 17025 is a joke, …

Well, no, a line must be drawn somewhere: you cannot just put ISO 17025 in the category of jokes.

That is unfair to good jokes; you need to specify that it is a NOT-funny joke. ;-)

jaclaz

Perhaps we could classify it as a trick, similar to something we would expect from Loki?

Posted : 24/09/2019 11:32 am
jaclaz
(@jaclaz)
Community Legend

Perhaps we could classify it as a trick, similar to something we would expect from Loki?

Loki may be evil, but he is not stupid; you should never confuse evilness with stupidity.

Compare with Carlo M. Cipolla, The Basic Laws of Human Stupidity:
https://en.wikipedia.org/wiki/Carlo_M._Cipolla
ISO 17025 (actually the people that forced its application on digital forensics) is in the lower left quadrant; Loki (and his tricks) would definitely be in the lower right one.

jaclaz

Posted : 24/09/2019 12:53 pm
minime2k9
(@minime2k9)
Active Member

I think I need to print out that chart and put it on my wall at work!

Posted : 24/09/2019 5:28 pm
jaclaz
(@jaclaz)
Community Legend

I think I need to print out that chart and put it on my wall at work!

Then get the "right" (original) one, including the POM line dividing Helpless and Bandits into two categories.

There is an (IMHO horrible) short and simplified version illustrated by James Donnelly, easily available online, originally published at
https://web.archive.org/web/20110320011208/http://wwwcsif.cs.ucdavis.edu/~leeey/stupidity/basic.htm

and the "original"
Carlo M. Cipolla The Basic Laws of Human Stupidity il Mulino ISBN 978-88-15-23381-3 Copyright © 2011

Again, a line needs to be drawn between Bandits with overtones of Intelligence (B1) and Bandits with overtones of Stupidity (B2); Loki would most probably be in the first category.

jaclaz

Posted : 24/09/2019 8:11 pm
steve862
(@steve862)
Active Member

Hi,

I also think this is too large a project for one person to do as part of a degree. In terms of inter-lab comparisons there are three types of 'product' you might consider producing. These would be:

1. A media device which is used to test only imaging and hashing. You create it and send it to them and they would need to get the right number of sectors and the right hash.

2. A data set containing files, including deleted files. This is intended to test a forensic tool's basic data parsing and specific automated functions. This could include carving, file signature checks, grouping of file types, unzipping, decoding etc. Its purpose is only to establish whether the tool can correctly parse the specific data and report the correct information.

3. A full analysis. You are now potentially doing both of the above, but also looking at the examiner's technical knowledge and investigative skills. This doesn't actually have to incorporate all of item 2, because the examiner might choose to manually decode all of the data themselves, effectively not using much of the forensic tool's functionality.
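For item 1, the receiving lab's check boils down to "right number of sectors, right hash". A minimal sketch of that comparison step, assuming a raw (dd-style) image and a 512-byte sector size (both assumptions; the sender would state the actual format and sector size):

```python
# Sketch: fingerprint a raw image so two labs can compare results.
# Assumes a dd-style raw image; the 512-byte sector size is an assumption.
import hashlib

SECTOR_SIZE = 512

def image_fingerprint(path, sector_size=SECTOR_SIZE):
    """Return (sector_count, trailing_bytes, sha256 hex digest) for a raw image."""
    h = hashlib.sha256()
    size = 0
    with open(path, "rb") as f:
        # hash in 1 MiB chunks so large images don't need to fit in memory
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
            size += len(chunk)
    # a genuine device image should be a whole number of sectors,
    # so trailing_bytes other than 0 is itself a red flag
    sectors, trailing_bytes = divmod(size, sector_size)
    return sectors, trailing_bytes, h.hexdigest()
```

Each participating lab would run its own imaging workflow on the distributed media, then compare its fingerprint against the published reference values.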

Here's the other challenge to doing this project. Would you be considered a reliable cited source of material such that digital units could convince UKAS your 'samples' have been correctly constructed?

Even for the creation of test digital media, UKAS expects highly experienced digital units to use NIST or CAST sets. By using these data sets the unit can prove their own tools correctly image and hash digital data. Only once they have done this can they begin to create their own test media.

UKAS might be happy to recognise a university as a reliable source, but I cannot imagine they would recognise any individual as one. As this is your project, I would assume the sample sets would be yours, not the university's.

Unless the university were willing to put their name to it and give it their guarantee, I don't think digital units would utilise your sample sets.

The other option is to put the cat amongst the pigeons and create data sets intended to test digital forensic tools in a way that you know they cannot parse correctly. No tool can do everything, but at least we can manually decode the raw data and validate the results of the tool.

You might instead consider looking at the accreditation issue overall. To write, implement and maintain a quality management system and ISO 17025 (and 17020 from next year) takes a lot of work, and I think a different skillset from the one most digital examiners have. Whilst many digital units have obtained 17025 for some part of their work, a few units have also lost their accreditation. You could look at this from a resourcing and management point of view. What does it take to deliver and retain the standard?

I think a permanent post for a technical person needs to exist to gain, and in the long term keep, accredited status for most units. There's scant training out there, so maybe the project could look at how we train people to be technical leads in this. What natural skillset is needed, and what training do they need? Should there be a module in digital forensics degrees, and if there should be, what do you drop from the syllabus to make room for it?

ISO 17020 is the next challenge, and it will be expected that we have validated processes for live data capture in the field, imaging at crime scenes/customer sites and so on. I don't know of anyone producing guidance on how units should prepare. Another possible area for you to consider, perhaps?

As I have typed this response the scale of the work involved has increased again. This is more than a one person project. I wonder if your university have an interest in being a player in this area of study. It would require a number of people to end up with something that you could publish to a wider audience.

I don't know if any of this helps. If you were leaning towards writing some Python scripts for parsing some unusual data as the alternative to this, do that. It will be a much more manageable project with more easily delivered results.

Good luck.

Steve

Posted : 26/09/2019 11:54 am