In this how-to guide, we will cover validation and accreditation, the process of generating a dataset, and how to use the new Validation Tool incorporated into Amped FIVE.
Validation of forensic processes has always been an essential task for Forensic Video Analysts. Since the introduction of accreditation into the DME world in recent years, however, it has become an industry-standard requirement.
The Validation Tool, integrated into Amped FIVE, allows users to detect whether filter results have changed from one update to the next. It does so by verifying that the frame pixels generated by filters remain identical, regardless of the software version being used or the system it is running on, and it reports the outcome to the user through standard hash verification.
Amped FIVE contains over 140 filters, most of which affect the pixel values within images or videos. Because law enforcement agencies and forensic video services must perform software validation to achieve accreditation, the challenge of how to do so is now being faced worldwide.
The validation process consists of three stages.
In the initial stage, the user creates a database of samples and projects reflecting the filters they wish to validate. If a user wants to validate the Deinterlace filter, for example, a video file with interlacing must be loaded. After applying the Deinterlace filter with the appropriate settings to correct the issue, the project can be saved. Validation projects can contain multiple different filters in separate chains.
The second stage is running the Validation Tool on your database of projects to create the first set of results. This master set covers the evaluation of the filter output, with every result producing a series of image hash values. It is these values that are compared in subsequent stages.
The final stage comes once the software undergoes an update, or when another instance of the software is used on a different system from the one used to create the initial dataset. The user runs the Validation Tool again, this time creating a new dataset, and the results are automatically compared. This stage compares the pixel hash values from one set of results to another to identify any differences. If the values match, the user can be satisfied that the filters assessed are still producing a valid result.
Now that we’ve summarized the steps, let’s take a deeper look at each phase, providing more insights and additional comments.
Project Creation in the Validation Tool
Before generating an initial dataset, the user needs to create the projects that include the filters they want to be tested. The tool can process multiple projects, containing multiple filters, simultaneously. It is therefore recommended that users separate projects into groups for ease of management and updating. For example, you could have a separate project for every filter, but it may be easier to have one project per filter category, within which every individual filter in that category is used. Although it is possible to have a single project that encompasses every category and filter, it may prove difficult to maintain.
It is also worth noting that for many filters, the same sample file can be used. For example, a single image or video would suffice to evaluate every filter within the Adjust category.
If you have yet to obtain a sample set, Amped offers a starting point. By accessing the following link, you can obtain an Amped FIVE project file that creates a Test Card video suitable for validation.
In the following project, a single video has been loaded into multiple chains and each filter in the Adjust filter category has been used. We can therefore name this project “Adjust”.
Below you can see what your master folder of Amped FIVE projects would look like if you decided to create a project for each category.
The MEDIA_SET folder shown is where we can store the different samples that are used within the Amped FIVE projects.
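As a purely illustrative example (the file and folder names below are hypothetical, not a required layout), a master folder organized by filter category might look like this:

```
Validation Projects/
├── Adjust.afp
├── Deinterlacing.afp
├── Denoising.afp
├── Sharpening.afp
└── MEDIA_SET/
    ├── test_card.avi
    └── interlaced_sample.avi
```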
To summarize this initial stage, the user is required to create Amped FIVE projects that contain the filters they wish to validate. These projects can cover individual filters, filter categories, or any specific groups of filters such as those requiring specific competency levels.
Dataset Generation
Moving on to the second stage of the validation process: this is where we open the Validation Tool in FIVE to generate the first dataset from the projects created in stage one.
The Validation Tool can be found within the Utilities menu.
The tool presents three distinct processes that may be selected via a dropdown menu located at the top of its interface:
- Generate: Create a new dataset.
- Generate and Compare: Create a new dataset and then compare it against a previous one.
- Compare: Compare two different previously created datasets.
Generate
After selecting the folder location that contains all your Amped FIVE projects, click “OK” to generate the first dataset results.
Within the designated output folder, a new directory is created containing the results of each generation. Every set of results is then stored in its own directory, named with the Amped FIVE revision number and a timestamp indicating the date and time of the corresponding generation.
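For instance, the generation output might be organized along these lines (the revision numbers and timestamps shown here are invented for illustration; the actual naming is set by the tool):

```
Generation Results/
├── Rev_26054_2023-06-01_14-30-55/
└── Rev_26392_2023-09-12_09-05-12/
```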
Inside the folder, you will find the images that were created at the moment of frame hash computation. Once this process completes, the results are available in your preferred format; the tool’s interface offers two options: HTML or TSV.
When testing filters on video files, three images are created: the initial frame, the middle frame, and the final frame, obtained from the last filter in each chain. The resulting images are also labeled with their corresponding frame numbers. By checking three distinct frames for videos, we can rule out decoding issues such as frame loss.
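As a rough illustration of how those three frame positions relate to a video’s length, here is a minimal sketch (the exact selection logic inside Amped FIVE is not documented here; the indices simply match the first/middle/last pattern described above):

```python
def validation_frame_indices(frame_count: int) -> list[int]:
    """Return the first, middle, and last frame indices of a video.

    For a 750-frame video this yields [0, 374, 749], matching the
    frame numbers shown in the example results below.
    """
    last = frame_count - 1
    return [0, last // 2, last]

print(validation_frame_indices(750))  # [0, 374, 749]
```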
Before proceeding to the results, below you will find the filter list file.
The table provides a comprehensive breakdown of all categories and filters used during the dataset generation, along with their respective usage frequencies. Some filters, such as Audio Loader or Change Frame Rate, cannot be validated with this tool because they do not change pixel values.
Let’s look at the results and how they are presented.
The first thing we see is the date/time of the results as well as the software and hardware information.
A table lists all projects and filter chains analyzed, including any sub-directory location. In the table below, the results generated from the deinterlacing category can be seen highlighted.
The results are formatted in the following manner.
The AFP filename is followed by the Chain Name and the Chain ID. In this example, we can see that the Deinterlacing Master Validation Set had a chain named “Line Doubling”, in which the Line Double filter was applied to a video. As the filter was used on a video, three frame numbers are presented: frames 0, 374, and 749.
The next two columns show the original file MD5 hash value, and then the hash value of the frame tested after the use of the Line Double filter.
The frame hash is computed from the data of the individual pixels, and it corresponds to what is shown in the Inspector window under Amped FIVE’s Tools.
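Conceptually, this amounts to hashing the decoded pixel buffer of a frame rather than the file on disk. Here is a minimal sketch in Python using Pillow and hashlib; it illustrates the principle only, and Amped FIVE’s internal implementation may differ:

```python
import hashlib
from PIL import Image  # pip install Pillow

def frame_pixel_md5(path: str) -> str:
    """Compute an MD5 hash over a frame's raw pixel values.

    Hashing the decoded pixels (not the file bytes) means two frames
    that decode to identical images produce the same hash, which is
    the property the Validation Tool relies on.
    """
    img = Image.open(path).convert("RGB")  # normalize the pixel format
    return hashlib.md5(img.tobytes()).hexdigest()

# Identical pixels -> identical hashes, regardless of software version.
print(frame_pixel_md5("frame_0374.png"))
```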
Finally, the time taken to process the filter and compute the hash is displayed in the table.
We have now successfully completed stage 2 and generated a filter dataset from the Amped FIVE projects created in stage 1.
To summarize this stage, the Validation Tool is used to generate the first dataset from the previously created Amped FIVE projects.
Generate and Compare
The final stage of the validation process occurs after a change in the system or an update to the software.
Internal process procedures may dictate that if previously validated software is updated or changed, further testing is required to ensure that any results are still valid. With the Validation Tool, it is a simple process to generate a new dataset and then compare the results against a previous dataset to identify changes.
If there are any differences in the pixels generated by the applied filters, the frame hash values will differ, and the automated report will highlight the difference.
After selecting Generate and Compare, the Dataset Results (1) field becomes available, allowing you to input the results file of a previously generated dataset.
After clicking the “OK” button, multiple new directories will be generated along with the results of the comparison. The new dataset is created within the Generation Results folder.
There will now be a new folder named “Dataset Comparison Results”.
The results file, named “Compare Results”, can be found within the “Dataset Comparison Results” folder.
At the beginning of the report, the software version and hardware used are documented and compared.
A table presenting the outcomes of the comparative analysis follows.
The table compares the hash values of the tested frames and provides a verdict of either “PASSED” or “FAIL” for each comparison. It also analyzes the time taken to process each image, with faster processing presented in green and slower in red. This assessment can help identify system issues related to hardware configuration, particularly if one computer is observed to take significantly longer than others to execute chain processes.
Next to the “PASSED” column is the Sum of Absolute Differences (SAD). As shown above, all values are zero because the resulting images are identical. In the case of a reported “FAIL”, this value indicates the extent of the dissimilarity between them.
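For reference, the SAD between two frames is simply the sum of the per-pixel absolute differences. A minimal sketch with NumPy (illustrative only):

```python
import numpy as np

def sum_absolute_difference(frame_a: np.ndarray, frame_b: np.ndarray) -> int:
    """Sum of Absolute Differences between two equal-sized frames.

    Identical frames give 0; larger values mean greater dissimilarity.
    """
    a = frame_a.astype(np.int64)  # avoid uint8 wrap-around on subtraction
    b = frame_b.astype(np.int64)
    return int(np.abs(a - b).sum())

a = np.zeros((2, 2, 3), dtype=np.uint8)
b = a.copy()
b[0, 0, 0] = 5
print(sum_absolute_difference(a, a))  # 0 -> the "PASSED" case
print(sum_absolute_difference(a, b))  # 5 -> a measurable difference
```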
We have now walked through stages 1 to 3 of the Validation Tool: creating our projects, generating our first dataset, and then generating a further dataset after a software or hardware change, followed by an automatic comparison.
There is one final validation process available to users: simply comparing two previously created datasets.
Compare
The compare process simply requires two previously created datasets.
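If you export results as TSV, the essence of such a comparison can also be reproduced outside the tool. Here is a minimal sketch in Python; note that the column names “chain_id” and “frame_hash” are hypothetical examples, not the actual schema produced by the Validation Tool:

```python
import csv

def compare_datasets(path_a: str, path_b: str) -> None:
    """Compare frame hashes between two TSV result files.

    Rows are matched on a chain identifier; differing hashes are flagged.
    NOTE: the column names used below are hypothetical, chosen only to
    illustrate the comparison logic.
    """
    def load(path):
        with open(path, newline="") as f:
            return {row["chain_id"]: row["frame_hash"]
                    for row in csv.DictReader(f, delimiter="\t")}

    hashes_a, hashes_b = load(path_a), load(path_b)
    for chain, h in hashes_a.items():
        verdict = "PASSED" if hashes_b.get(chain) == h else "FAIL"
        print(f"{chain}: {verdict}")

compare_datasets("results_rev_a.tsv", "results_rev_b.tsv")
```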
The Validation Tool released in the latest version of FIVE is there to assist users in conducting the validation process in a simple and efficient manner.
Designed with flexibility in mind and adaptable to all unit working practices, the Validation Tool makes it easy to compose Standard Operating Procedures (SOPs) for its use. In the same manner as the Assistant Tool, which guides users through processing SOPs, the Validation Tool is tailored to assist you with your software validation processes.
For more information contact Amped Software.