Christa Miller: Online child exploitation continues to accelerate worldwide, even as technologists race to find ways to stem the tide. This week on the Forensic Focus Podcast, Si Biles and I talk with Cyacomb’s North American Vice-President of Sales, Mike Burridge; and Head of Business Development, Graham Little. Mike, Graham, welcome.
Mike: Thank you very much. Glad to be here.
Christa: So by way of introduction, Cyacomb’s mission is to help law enforcement and electronic service providers quickly find, block, and remove illegal images and videos. Tell us a little bit about how you got started.
Graham: Yeah, I’ll take this one, Christa. So, we just celebrated our sixth birthday, so, easy for me to remember how old the company is.
So, six years ago we started what was then actually Cyan Forensics. We had a name change about a year ago to Cyacomb. So any of your viewers that are thinking Cyan Forensics, Cyacomb, that’s why we’re now Cyacomb.
We’ve got two co-founders: Ian Stevenson and Bruce Ramsey. Bruce is really key to this story. Bruce spent 10 years as the kind of lead digital forensics analyst within our local jurisdiction over here in Scotland in the UK, Police Scotland.
He was that guy, that kind of chief digital forensics person for about a decade. And then he left the police to follow a career in academia. And he took with him what has become the core of our technology.
He took that idea, which is effectively block-level scanning of computer hard drives, and he led a four-year PhD project at Edinburgh Napier University here in Edinburgh, and that’s where the technology was developed.
So that technology came out, really groundbreaking stuff, and we decided to commercialize it in the area of police digital forensics. So that’s really our background story.
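For readers new to the idea, block-level scanning means hashing fixed-size chunks of a drive rather than whole files, and testing each chunk against a reference set built from known files. The following is a minimal sketch of that general technique, assuming a fixed block size and a pre-built set of known-block hashes; it is our illustration only, not Cyacomb’s proprietary algorithm, and the names `BLOCK_SIZE` and `KNOWN_BLOCK_HASHES` are hypothetical.

```python
import hashlib

BLOCK_SIZE = 4096  # hypothetical block size; real tools tune this carefully

# In practice this would be a compact probabilistic filter built from known
# contraband files, not a plain in-memory set.
KNOWN_BLOCK_HASHES: set[str] = set()

def scan_image(path: str) -> list[int]:
    """Scan a raw drive image block by block; return offsets of any matches."""
    matches = []
    with open(path, "rb") as img:
        offset = 0
        while True:
            block = img.read(BLOCK_SIZE)
            if not block:
                break
            if hashlib.sha256(block).hexdigest() in KNOWN_BLOCK_HASHES:
                matches.append(offset)
            offset += len(block)
    return matches
```

Because a scan like this reads raw blocks rather than parsing the file system, it can also hit on remnants of deleted files, which is part of what makes block-level triage attractive on-scene.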
Christa: Okay. Okay.
Si: So, that’s your current offering. Now you’re selling into a variety of markets, rather than just being police and law enforcement. How are you finding that relationship working with service providers as opposed to dedicated forensic professionals?
Graham: Yeah, so I should maybe explain. There’s kind of two parts to Cyacomb. The first part is digital forensics for police and law enforcement, specifically with child sexual abuse material investigations, and also counter-terrorism investigations, as well.
So we’re a tool in that respect; built by police, for police. And that’s really been our bread and butter over the past six years.
But the other part of our company we call Cyacomb Safety. That’s using that same core technology. Instead of finding illegal CSAM or CT data of interest on a suspect’s computer hard drive or mobile phone, we can take that technology and help large social media platforms very quickly find, block, and permanently remove that same kind of dangerous CSAM or nefarious content.
So it comes with its separate challenges, of course. You know, working and selling into police and law enforcement is one thing; working with these different platform companies is quite another.
But the core essence of what we’re trying to achieve, basically, make the online world a safer place for all, is very much shared between those two kinds of customer groups, if you like, and those two sides of our company.
Christa: So, I’m curious about how, I mean, you just mentioned that there’s sort of, I don’t know if tension is the right word, a tension between the private sector and the public sector. And I’ve got to imagine that has to do with some of the privacy interests and the safety interests, not just for children, but also for, like, political dissidents, for instance, that, you know, I imagine the online service providers are keen to, I guess, balance, right? Can you tell us a little bit more about those two different groups that you’re selling into and how you’re dealing with some of those issues?
Graham: Yeah, sure. So there is definitely a requirement for privacy and also safety, and we’re proponents of both of those. We believe that you can’t do this without adhering to both. And of course there are some people in the world that would put the onus on privacy, and I know those that would put the onus on safety.
But like I say, both really need to marry up. So I think from our perspective, it’s about finding that balance. It’s about educating those who are maybe in one camp or the other and trying to just bring clarity to the conversation.
It’s tricky, but it’s really important that we do that because we need people to understand where we’re coming from and hopefully educate and shine light on the fact that we don’t have to compromise on one or the other. We can do what we want to do with respect to both privacy and safety, which I think is really key to what we want to do as we go forward with safety.
Mike: If I could add to Graham’s comments, over the last almost a year since I’ve been with the company, I’ve attended numerous trade shows, conventions, association meetings.
And what I have noticed is many of the social media providers, and I won’t mention the companies, but I think that everybody listening to this webcast would be very familiar with them, have become very engaged with the law enforcement community.
They understand their responsibility, they understand they have a responsibility to protect their users, and I am very optimistic that as this continues and these partnerships continue, that you will see a very strong effort on the part of social media providers to make sure they’re providing a safe environment.
I mean, social media is here to stay. It has its benefits for sure, but there are some dangers, and I think those providers recognize that and recognize they have a very important role in making sure that they’re providing a safe platform for people of all ages.
Si: I think, you know, one of the issues that we’ve seen of late with social media is that when people feel that the platform is no longer supporting their perspective on things, and that might include privacy, they just jump ship and go to a different platform; not thinking of any particular platforms in the US that may or may not harbor particular political perspectives or whatever.
Do you think that there’s an issue with the fact that you are coming up with a solution that works really, really well, but it’s limited to platforms that want to engage as opposed to the others? Are we just driving people off to another platform?
Basically, if people are aware that Cyacomb is involved in a given social network, do you think that people might perceive it that way? I’m not suggesting that it necessarily is an invasion of privacy, but do you think that there’s a perception among people that it is? I mean, I know the service providers want to engage, but do you think that the users are keen to engage, as well?
Christa: I was just thinking of the word, I’m sorry because I’m going to jump off, but surveillance, right? Law enforcement working with large corporations is, I think, something that a lot of users would be concerned about, whether legitimately or illegitimately.
And I would err on the side of legitimately, right? Because, I mean, just from a social standpoint, that’s potentially very scary for a lot of people. So, yeah, I would love to hear more, Graham, about what you were saying before, about how you think you’re able to balance these interests of privacy and safety among all of these different groups.
Graham: Yeah. And so these are incredibly important points and really core to this discussion, for sure.
So, the analogy we like to use is if you’re boarding a flight, it is a given in society, all over the world, that you shouldn’t be able to board a flight carrying a gun or a knife. And so you go through a metal detector in pretty much every security zone of every airport in the world, and nobody questions that.
On the internet today, it varies, but you can do some research and some smart people will tell you that you’re only, you know, a handful of clicks away from being able to view illegal and certainly very, very harmful, distressing, horrific CSAM, child sexual abuse material.
Now, I think that is, well, I don’t think, I’m pretty sure, that that is something that globally, as a society, we can come together on and all agree shouldn’t be the case.
And just like when you walk through the metal detector at an airport, nobody questions that. We want to get to a society where you shouldn’t be able to send, receive, or share, you know, that illegal CSAM material, and nobody should really be questioning that.
And that is what we are stopping. We don’t care if you walk through that metal detector and you’re thinking about robbing a bank or cheating on your spouse or whatever. We just don’t want you to take down that aircraft because we can all agree that’s a bad thing.
Much the same on the internet. We don’t think that you should be able to easily share CSAM, and I think pretty much everyone can get behind that. And just to be clear, we’re not worried about anything else. Our focus, our laser-like focus, is on that dangerous, illegal CSAM, nothing else.
Si: And counterterrorism stuff, you said counterterrorism as well when you were talking about the police. Do you do that as well on the social media side?
Graham: So, I mean, at the moment we’re focused on the stuff that is, you know, 100% illegal in that nation. So, you know, CSAM is really the key one. Of course there’s a sliding scale, you know. Counterterrorism data of interest one day might be slightly different the next day.
But that’s not for us to decide, that’s for the governments to decide what is illegal to be on that specific platform. So, you know, we don’t get into that.
The other way that we like to look at it as well is we see that the internet has developed at a kind of dangerous speed, really, such that you are only a few clicks away from being able to view dangerous CSAM or CT-related material.
And the internet has just grown so quickly, there have not really been enough sensible checks and balances. I can’t remember exactly when it was, but roughly 60-odd years ago the three-point seatbelt was invented, I think by Volvo.
And at the time when the seatbelt was released, people were up in arms. You know, “You’re not telling me that I have to buckle myself into the car. I’ll make that decision, thank you very much.” And at the time, you could probably see that, you know, you can understand where people are coming from.
We see today as being the internet’s seatbelt moment. You know, it’s grown arms and legs. Something needs to happen where it doesn’t get any worse because it’s a dangerous society out there online.
And we think it should have a really good, firm look at itself. You know, where have we been? Where are we going? We think this is the internet’s seatbelt moment.
Si: Your technology, if you’re doing block-level analysis, and I’m guessing this is hash-based block-level analysis of known existing material, do you have any technology that’s enabling you to detect new material?
And I realize that would be largely heuristic-based and potentially prone to issues. Because, I mean, there was a very recent case that I was aware of where Apple flagged and blocked a guy for taking photos of his son’s genitals to send to a doctor, and that caused problems.
Now, obviously that can’t have been a hash match, because that photo never existed before he created it. So is there anything in your technology that is doing this, or are you purely detecting existing materials, or existing signatures anyway?
Graham: Yeah, so today as we stand, you know, we’re going to be able to find matches against known data, as you’re alluding to there, Simon. But we have our own kind of internal skunkworks, if you like, our own R&D department. So we’re constantly looking at this evolving area, this evolving field.
So, you know, we are looking at other technologies but you’re quite right to raise that as a really good example of, you know, where this could potentially not work according to plan. But there’s always going to be, you know, examples on the periphery of where this, or indeed, any technology doesn’t work 100% of the time.
And I think it’s really important to the conversation that we’re aware of those, and we do our very best to make sure that we can do what we can to limit these issues. But, you know, that really is a kind of an example of, like I say, right off in the periphery.
So we’ve got to be aware of these, but, you know, show me a technology that doesn’t have its slight growing pains. And I think it’s good that we can talk about these things and try to avoid them.
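Simon’s Apple example is easy to make concrete: cryptographic hashes match byte-identical data and nothing else, so a newly created photo can never appear in a database of known hashes. A tiny sketch, with placeholder bytes standing in for an image:

```python
import hashlib

original = b"...image bytes..."   # placeholder content, not a real image
modified = bytearray(original)
modified[0] ^= 0x01               # flip a single bit

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(modified)).hexdigest())
# The two digests are completely unrelated. Exact-match scanning can only
# find known, previously hashed material; flagging novel content requires
# different techniques (perceptual hashing or machine learning), with the
# attendant risks Simon describes.
```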
Si: And in that vein, I mean, doing block-level stuff, one of the things I’ve noticed is, there’s a quote on your website from the NCA, which is, “Cyacomb gives results in less than three minutes, where our current approach takes five hours.”
Which, I mean, apart from being a phenomenal improvement in, you know, algorithmic processing: if you’re doing it at the block level, you are going to be identifying single blocks, and there’s a high probability of false positives, at least initially, until you’ve built up a longer chain of blocks.
How are you implementing, I mean, this may be a technical question for the guys we’re going to talk to later, but how are you managing that trade-off between speed and accuracy in detecting this stuff?
Because again, you know, calling somebody a pedophile is quite often negatively viewed by them, under whatever circumstances it comes about. So how are you managing that?
Graham: Yeah, so I think it’s important just to take a step back from that and appreciate that, you know, our tools are in the hands of law enforcement. Things are never going to be 100% the same every time, in every single investigation. And the police have a far wider spectrum of analysis and intelligence.
So it’s not quite a case of: it turns red, immediately send the person to prison. There’s a bit more nuance to it. However, to the point that you’re making with false positives, we cannot claim that our technology is completely false-positive-proof.
However, without giving away the secret to our special sauce, part of the genius of the PhD research program that Bruce Ramsey, our co-founder, led was getting that false positive rate down to manageable levels. And the mathematics will tell you that we’ve got it down to one in 2 million.
So yes, there will be false positives, true false positives, but only one in every 2 million hits. So it’s not perfect, but it’s definitely good enough for the law enforcement applications that we’re seeing. And I’ve been in this game now for six years, and we’ve not seen a true false positive yet.
That’s not to say that there aren’t other false positive scenarios. So, for example, we use our contraband filter, which is our own proprietary data format. Law enforcement around the world build these contraband filters from their local databases, and that’s what we’re finding: matches against material that has been put into that contraband filter.
Now, if something perfectly innocent has been put into that contraband filter then we could well get a match against that. So it’s not a false positive per se, we’re finding a match against the contraband filter, but it might not be something illegal.
So we try really hard to educate our users, make sure whatever goes into a contraband filter is indeed the bad, illegal, or perhaps indicative stuff that you would like to find in an investigation.
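Cyacomb’s contraband filter format is proprietary, but the trade-off Graham describes (compact, fast lookups, no missed matches, and a small but nonzero false positive rate) is characteristic of probabilistic set structures such as Bloom filters. As a sketch of how such a structure is sized for a chosen false positive rate, here is a minimal Bloom filter using the standard sizing formulas; this is our illustration, not Cyacomb’s design:

```python
import hashlib
import math

class BloomFilter:
    """Minimal Bloom filter: no false negatives, tunable false positive rate."""

    def __init__(self, expected_items: int, fp_rate: float):
        # Standard sizing: m bits and k hash functions for a target rate.
        self.m = math.ceil(-expected_items * math.log(fp_rate) / math.log(2) ** 2)
        self.k = max(1, round(self.m / expected_items * math.log(2)))
        self.bits = bytearray((self.m + 7) // 8)

    def _positions(self, item: bytes):
        # Derive k bit positions from salted SHA-256 digests of the item.
        for i in range(self.k):
            digest = hashlib.sha256(item + i.to_bytes(4, "big")).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item: bytes) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item: bytes) -> bool:
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

# Target the one-in-2-million false positive rate quoted in the interview:
f = BloomFilter(expected_items=1_000_000, fp_rate=1 / 2_000_000)
f.add(b"digest-of-known-bad-block")
assert b"digest-of-known-bad-block" in f   # members are always found
```

One appeal of this family of structures for distribution is that the filter itself holds only scrambled bits, so a list of what to look for can be shared without shipping the underlying database.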
Si: Now, I’m going to be really picky because I 100% understand that in law enforcement, that’s not a problem because, you know, if this was presented in court, I would expect a full analysis of whatever evidence had been put down.
You know, they’re not going to turn up and go, “Cyacomb found a 50% match on this, therefore it’s illegal. We’re going to put that up.” So on the law enforcement side, I completely understand where you’re coming from. Completely agree. And the contraband filters are interesting, and we might come back to that in a minute.
But this is exactly the same technology that’s driving the social media stuff. And therefore one has to wonder how the social media guys are going to handle it, because they’re going to be very risk-averse.
So for them, a false positive is going to be, “Right, that’s it, block that user,” because they don’t want to be carrying that material.
Now, one in 2 million. Now, I understand statistics is a very funny thing, and you’re talking about one in 2 million in various assorted ways, but social media platforms are carrying multiple millions of pictures every day. Which would imply, using that particular statistic in totally the wrong way, almost certainly, that we would get at least several false positives a day: images matching patterns that are flagged as illicit. Is that a reasonable consideration?
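To make Si’s back-of-the-envelope arithmetic concrete: expected false positives scale linearly with volume, so even a tiny per-item rate yields routine hits at platform scale. Illustrative numbers only; the daily volume here is a hypothetical:

```python
fp_rate = 1 / 2_000_000         # the per-item rate quoted earlier
items_per_day = 100_000_000     # hypothetical platform volume

print(fp_rate * items_per_day)  # 50.0 expected false positives per day
```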
Graham: And I can definitely see where you’re coming from with this one, Simon. However, the core of our technology is the same in both, but they’re used in very different ways.
So the one in 2 million would not necessarily translate into what we’re doing with safety and kind of social media, and that really brings us back to the quality of the contraband filter, as well.
So, for example, in UK law enforcement we work with the UK Home Office, who have their world-class dataset CAID, the UK Child Abuse Image Database, which is being converted into a contraband filter. So that’s the level of expertise and the gold standard of data that we’ll be using with law enforcement.
And that’s also the kind of data set that we aspire to one day use in Safety. And I should also stress that although we have proven that the safety side of things can work, it is still in its infancy. So we’re working with partners all over the world to try and understand how best to deploy this solution. So it’s still early days.
Si: Okay. No, that’s really cool. The contraband filter stuff, you know, obviously the Home Office are doing this, and I assume in the US there’s a similar, and I’m sorry, I’ve hijacked the North American conversation.
Christa: Project VIC.
Si: Project VIC to do this. But you were saying that the end users have the capability to write their own contraband filters, as well. So, you know, this is an entirely flexible tool for the law enforcement community. If they’ve got a local, I don’t know, gang, for want of a better word, or a local mafia, and they have particular images that they’re associating with that, can they easily add those into your tools?
Graham: Yeah, so, we’re told that this is a really cool feature of what we do. So we work with, you know, large data sets. I just mentioned CAID in the UK. Our partners in the US and Canada, they’ve built their own contraband filters from CSAM, from live or previous investigations.
And that serves as a really good kind of net, you know, a nice wide net to cast with a contraband filter. But what they can also do is, say they have a cyber tip from NCMEC: they can take the data that NCMEC would give them and make a bespoke, more agile contraband filter that they could scan with to find just those, you know, 200 images, as well.
And the other great thing about our technology is they can use that large contraband filter that they’re already using. They can actually use multiple large contraband filters.
So they’ve got a really wide selection of material that they’re looking for, but they can also use these more agile contraband filters that they can build as more data, more intelligence, is found and presented to the investigator: “Hey, this suspect, we think, has this.”
You know, they can then go looking for that, and they can scan using multiple contraband filters which not a lot of other tools are able to do. Well, no other tools are able to do, apart from us. So that’s a really cool feature that our customers love.
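Graham doesn’t say how multiple filters are combined internally, but the workflow he outlines, a wide national filter alongside narrow case-specific ones, is easy to picture. A hypothetical sketch, with every name ours for illustration rather than Cyacomb’s API:

```python
def scan_with_filters(block_hashes, filters):
    """block_hashes: iterable of (offset, digest) pairs from a drive scan.
    filters: mapping of filter name -> set-like container of known digests.
    Returns hits tagged with the filter that produced them."""
    hits = []
    for offset, digest in block_hashes:
        for name, flt in filters.items():
            if digest in flt:
                hits.append((offset, name))
    return hits

# A wide net alongside a narrow, case-specific filter built from one tip:
filters = {
    "national-db": {"aa11", "bb22", "cc33"},   # e.g. built from CAID
    "cybertip-case-417": {"dd44"},             # built from one NCMEC tip
}
print(scan_with_filters([(0, "bb22"), (4096, "dd44")], filters))
# [(0, 'national-db'), (4096, 'cybertip-case-417')]
```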
Si: That’s amazing.
Christa: It’s interesting that you should mention NCMEC, because I was going to ask about the Cyber Tip Line statistics. They reported, I think, nearly 30 million Cyber Tip Line reports last year alone. And I think that was more than double what they had reported just in 2019. So obviously pre-pandemic.
So these are really daunting odds. And it sounds like the technology that you’re describing is designed to get at those increases in numbers, really. So, what are you anticipating, I guess, for the future? I mean, we’re coming to a close in 2022 here. I would like to think that there would be fewer of those reports, but like, how is that all working against those statistics?
Graham: Yeah, so it’s quite sobering reading when you see these statistics coming out of, you know, a world-renowned organization like NCMEC, who we’re very proud to say are a partner of ours. I think this year they announced their 100 millionth cyber tip over the years.
You know, it really is quite sad, quite sobering, like I say. And I kind of look to Mike here, who is a former Chief of Police in the US. I think it’s really interesting, the pressures that police are under here in the UK and the US, all over the world really, as this crime type seems to grow and grow. Police forces aren’t necessarily being funded to meet that demand.
And that’s part of the reason that we came up with this tool in the first place. You know, the traditional “work smarter”: well, they will always be working hard, but, you know, you’re not going to be funded exponentially every year.
But, you know, these cyber tips almost seem to grow exponentially every year. So you’ve got to work smarter, you’ve got to adopt technologies like Cyacomb, like the other technologies that are out there, because you’re not going to get another 20 digital forensics analysts coming off a conveyor belt every month.
So you need to work with these new technologies, understand them and really try and get the best out of them. But yeah, like I say, over to Mike, really, on what that pressure feels like for a chief in a law enforcement agency.
Mike: I mean, I think you hit the nail on the head. You know, in law enforcement, traditionally we’re oftentimes behind the curve when it comes to being able to substantially deal with cultural issues or issues within the communities that we serve.
And funding is definitely a big part of that. As we like to say, the only time that funding comes rapidly is, you know, when both sides of the aisle agree that the US federal government needs to do something and needs to do it immediately.
You know, one of the biggest things in society today is, you know, the overdose deaths related to fentanyl, but still law enforcement struggles to get funding to help with community programs and help with enforcement actions.
We saw it before in law enforcement with the body-worn camera demand; you know, agencies are still struggling to get funding to meet federal mandates and state mandates for things like body-worn cameras.
With our particular tool, realizing that we need to do something quickly and rapidly to even stay close to addressing this issue, we’re providing a tool that doesn’t have a high demand on resources for training; the tool’s very easy to use.
You know, in less than 30 minutes, an officer in the field, a probation or parole officer or a line-level officer, could quickly learn how to use the tool. As we mentioned earlier on the call, it provides rapid results and helps make a very quick determination about the amount of effort that is going to be required to investigate the case, close the case, move the case forward, or potentially not pursue it at all.
And so it’s making the law enforcement and adult probation and parole field much more effective and efficient, which is something that we always look for in law enforcement.
But then, the other part of that is a term that I’m sure you’ve heard before is, you know, force multiplication or force multiplier. How can we do more with less? Which is the expectation in a lot of communities today. They expect us to do it all, but, you know, don’t want to provide the resources to do it. And I think we overcome those burdens and barriers and pain points for law enforcement.
Graham: Can I just add something to that, Christa? Mike is former law enforcement, so I wouldn’t really expect him to say this because he’s far too humble, but as a non-law-enforcement-background guy myself, you know, the tools are one thing, but the community of law enforcement specialists and experts out there that fight this heinous crime type is nothing short of phenomenal.
The passion, the commitment, the expertise, and the community, as well. The way these agencies and these units and these individuals come together to share best practice and expertise, it really is phenomenal. I’ve seen something similar with my military background, but to see that in law enforcement absolutely blows me away.
So it’d be remiss of us not to acknowledge, you know, the skill set of police investigators fighting these crimes; what they do really is phenomenal.
Si: No, completely, completely agree with you. I mean, the amazing people doing one of the most unpleasant jobs in the world day in, day out.
I mean, to both of you, do you see Cyacomb as an end-to-end forensic tool, or is it just this, I don’t mean just in any sense of a belittling term, but is it really genuinely focused on this triage stage at the beginning? “Let’s find this stuff so we can act on it quickly and then pass this off into a more traditional forensics tool,” or do you actually envisage it as a tool that will come up, you know, you will write a report at the end using Cyacomb and put that as evidence into court?
Graham: Yeah, so we talk about our suite of tools. So we have a tool that can be deployed on-scene and very quickly, very thoroughly and very simply detect CSAM or CT material.
Then we have a desktop tool that is perhaps more suited for the lab that would help a unit get through, you know, pretty intense backlogs. Some backlogs can run, you know, maybe a year or longer.
So we can cut down backlogs and also try and stop those backlogs from getting bad in the first place by limiting the number of devices that end up going into the backlog. So they’re kind of scanned as they come into the lab.
And then, as Mike alluded to, in the parole and probation market, or in the UK the offender manager market, we have a tool for those guys and girls, as well.
But without any shadow of a doubt, we are but a tool in the digital forensics toolkit. No tool can do everything and I don’t think you’ll find a digital forensics expert anywhere on the planet that will tell you that there is just one tool to rule them all.
Definitely it’s about having that toolbox, but also about having that training, the understanding and that experience as well, and sharing that experience and understanding and expertise with those kind of, you know, less experienced digital forensics analysts that are coming through the ranks.
I don’t know, Mike, if you’ve got anything that you want to add to that with your policing experience?
Mike: Yeah, I will add, based on my background and experience and from working with the investigators who are using this tool, that, as I’m sure you’re aware, each prosecutorial area is a little bit different in what they require to charge someone, to take them to court, which evidence will be, you know, allowed in discovery.
And in some cases, some of the folks have said that running a rapid triage tool and the report that it produces is enough for the local District Attorney or Attorney General, whoever their prosecutor is, to file the initial charges.
And sometimes they will require additional forensic work, sometimes they won’t. So it really depends on where you’re at and how it will be taken through the court system and the justice system.
Si: That’s fair. I think, you know, from my perspective, what I’ve seen in the UK court system is the inappropriate use of triage evidence being presented at the point where it hasn’t had sufficient detail applied.
So, I think it’s an interesting balance that needs to be met, not by the tool, because the tool is a tool as Graham said, but by people using tools to appreciate where the limitations of it lie.
And in that regard, you know, you guys have, or you’ve alluded to training several times, you obviously do training and you do it at several levels. Can you tell us a bit more about your training schemes?
Graham: Yeah, absolutely. So, we think it’s absolutely vital that people understand our tools, what they can do, but also, equally as important is what they can’t do.
So, I mean, you mentioned earlier about finding, you know, not exact matches. In a law enforcement environment, if it’s not in the contraband filter, our tool won’t find it. So if it’s first-generation data, or production material as it’s sometimes referred to, we won’t find that.
So, you know, it’s vital that we educate our user group about things like that. And we have, you know, our capabilities and limitations document. Not a bad 16-page read, I’ve got to say. And, you know, we’re more than happy to give that out to our customers, again so that they understand what the tool can and can’t do.
And we think education and training is absolutely vital. And we’ve got a group of experts within Cyacomb, all with previous law enforcement experience in digital forensics. Unfortunately, in this kind of brave new world, a lot of the training happens over calls like this.
I say “unfortunately”; it also means that we can reach a lot more people than we otherwise would be able to by putting someone on a flight. But we also have folk over in the US now that are able to conduct this training.
So training, training, training; we’ve got to make sure that frontline officers and investigators know exactly what the tool can and can’t do so they can get the most out of the tool.
But also, you know, we’re more than happy to send people to other vendors. You know, if they’re looking for something slightly different, that other tool in their toolkit, we’re more than happy to use our expertise to help that unit understand, okay, you want to be looking at other tools, as well and, you know, here’s what’s available.
It’s not just about training up on Cyacomb tools, it’s about our wider experience, as well. And we love to pass that on to our customer base.
Christa: I think that comes back to something that you said, Mike, earlier about budgets and issues like that, trying to find ways to do more with less. As VP of Sales, how are you talking to law enforcement agencies that are dealing with some of those budgetary limitations that might not have the ability to purchase additional tools yet, or need to find business justifications, I guess, for purchasing tools and training?
Mike: Well, our price point is very affordable when you look at the full spectrum of forensic tools. I know in talking to customers, potential customers, investigators, when we mention the price of the tool they’re quite shocked by how affordable it is.
So I have not honestly had a situation where I’ve mentioned what the cost is and had somebody say, “Wow, we just, we can’t do that.” When they start relating our cost to how much they’re spending in overtime, how much overtime is being used, backlog in the labs, I mean, you know, Graham mentioned that earlier about sometimes it’s a year before they can get to a computer.
I’ve had customers tell us that, you know, their labs aren’t taking any new work unless it’s a high-profile case or a homicide or, you know, a crisis situation. So there, the lab is not even available to them.
And the other thing is, one of the things that we encourage, and I think Graham may have mentioned it, perhaps not, is working with the Internet Crimes Against Children task forces throughout the United States, because when they’re able to do things as a group it becomes much more available to them.
The cost goes down, and as a group that can use this tool, it becomes much more powerful for them, whether they’re getting together to create contraband filters or getting together to purchase licenses or work together on licenses. It just really becomes almost a no-brainer.
Si: So, I mean, is there anything you guys have got coming up in terms of releases, or conferences, or anything of interest that you want to share? You know, new product developments that are exciting to you? Anything coming up?
Graham: Yeah, sure. So the best way to keep in touch with us and what we’re up to in the future with regards to being out and about is jump on our awesome, lovely new website: www.cyacomb.com. You’ll see what conferences we’re going to be attending.
We’re also quite active on LinkedIn as well, which is quite a nice way to reach law enforcement communities. With regards to new releases, our R&D department (I refer to them as the skunkworks; I hope that’s not a trademark infringement on Lockheed Martin) are always looking to be one step ahead of the criminal.
Basically, it’s a bit of a race out there. I think what sets us apart from perhaps other vendors out there is that we are a relatively small company still, and we are still very agile.
So what I like to say to people, and this is the absolute truth: I’m really after partners at the moment, and less so the kind of traditional customer-vendor approach. I want partnerships, and we want people to be using our tools and feeding back to us, you know, what they want the tool to do more of or less of.
So that we can mold that tool, because we’re passionate about getting the best tool into the hands of investigators so they can catch more bad guys and, with any luck, safeguard more victims.
So, lots of exciting new features are coming out. The next one that’s going to be coming out in relatively short order, if it’s not actually released this week, I think, is being able to review videos. So when we scan a device, we can review that video, and we can, in a really nice way, scan across the video. So if it’s a long video, you know, perhaps the first few…
Si: Frames?
Christa: Frames?
Graham: Frames. Perhaps the first few frames mean very little to the investigators. So it’s a really cool way that you can just very quickly scan to maybe the middle of the video, for example, to get a better understanding of what’s happening in that video.
And then we’re extending the reach and the scope of our tools as well, to be able to scan file names and look deeper into the drive and find more stuff around what we need to focus on, which is our laser-like focus on finding known images and videos on that hard drive.
‘Cause at the end of the day, being able to find those quickly and efficiently even if they’re deleted, that’s what’s going to lead to giving the investigator that early march on the suspect, being able to, you know, focus their questioning and actually lead to an arrest a lot quicker than some of the peripheral information that is good, but it’s not necessarily finding something illegal.
But if we can get them a hit on an illegal image or illegal video right there and then, boom, you know, they’ve got them. And that’s what we’ve got laser-like focus on as we also build out our tools.
Si: Ah, fantastic.
Mike: I think I would add to that, you know, Graham, that was a great explanation, but I’d also add, just based on my previous experiences as a VP of sales at other public safety companies, that Cyacomb places a very, very high priority on customer feedback.
As we introduce the product into North America, into more law enforcement agencies, getting them to try the product and give us feedback, we’re learning that things that we may have done in the UK they don’t need in the US, but then there are things that they really need in the US that maybe we haven’t done in the UK.
And our product development team and our engineering group are so hyper-focused on every little bit of product feedback and it’s just going to make the product that much stronger as we go forward. I’ve already seen several updates that have been in direct correlation to feedback from US users and people that have tried our products.
So as Graham said, we encourage that feedback and really strive to get as much as we can.
Christa: Yeah, wonderful. Well, I think we’re going to close it there, Graham and Mike, thank you again for joining the Forensic Focus Podcast.
Mike: Thank you.
Si: Thanks, guys.
Graham: It’s been a pleasure. Thanks a lot.
Christa: Thanks also to our listeners. You’ll be able to find this recording and transcript along with more articles, information, and forums at www.forensicfocus.com. Stay safe and well.