Christa Miller: Child sexual abuse and exploitation is one of the great scourges of our time, often pitted against consumer privacy as investigators seek technological means to end it. This week on the Forensic Focus Podcast, we're talking with Thomas Farrell, Global Head of Safeguarding Alliances at SafeToNet. A decorated professional who previously served in law enforcement and government, Tom worked to develop innovative methods to identify the highest harm, most sophisticated online offenders. I'm your podcast host, Christa Miller. Welcome, Tom.
Tom Farrell: Christa, thank you very much. Lovely to be with you.
Christa: It’s a pleasure to have you. I know when I saw you present at a Crimes Against Children Conference a couple of years ago, I was really impressed by your talk, the details of which I know you can’t [disclose], but I’m really happy to reconnect with you and talk a little bit more about what you’re doing now.
Tom: It’s great to be with you. Who would have thought when we met, probably three years ago, the way the world would have gone in those years since?
Christa: Exactly, I know. So, it was just in August that you transitioned from your law enforcement career to your new role at SafeToNet. What drove your decision and also what about your experience appealed to them?
Tom: Okay, so it's quite an interesting story, really. I didn't apply for a job, as such, so I'll just outline how it happened. I was a law enforcement officer for 19 years. The last eight or nine of those were spent fighting online child sexual abuse and exploitation, and the last five within UK government, working on projects, like you say, to identify the worst of the worst, the harder-to-detect criminals.
So it was a fairly organic journey, really, to SafeToNet. About a year ago, I happened to listen to a webinar with the founder of SafeToNet and found that a lot of my views aligned with theirs. We got on well, and we spoke a few times before I ended up joining. On the whole, my reason really was that I wanted to find a more proactive, preventative way to deal with online child abuse, rather than the reactive way, which is so difficult and causes so many issues for everyone.
Christa: Yeah, yeah, absolutely. So tell us a little bit more about SafeToNet and your role. What does your work now entail? How does it build on your previous work as you were just saying? And what are you hoping and planning to accomplish in the coming years?
Tom: Okay, great. So, SafeToNet exists purely to protect children online. We were founded by a couple in the UK about seven or eight years ago, and our focus is to develop innovative tools and solutions that can be applied across a wide range of platforms and technologies. As you probably know in the United States, we own Net Nanny, which is quite a well-established parental monitoring brand. But we also have AI-driven tools, keyboards, for example, that sit on the child's device and help educate them through their online journey and protect them as well.
And we have more tech in development that looks to, for example, detect the sharing of child abuse material in real time, block it and prevent it from happening. So, a really preventative approach, and one based on trying to educate the child but also, really importantly for me, educating the parents as well. Because as a parent of three daughters who are nine, 11 and 13, I know that often the parent, even if they're fairly technical, will feel that the child knows more and is more in control of their online journey than they are.
Christa: So that actually touches on my next question. Because I think a lot of people tend to demonize the technology, but it's really, I'm sure you can agree, just one piece of the overall problem. I think the main thing is ensuring children are less vulnerable to predators online, starting with good boundaries and relationships that make them less susceptible to grooming. How do your software solutions help children and their caregivers navigate those issues and communicate more openly about them?
Tom: Yeah, that's a great question, Christa, and it's a lot of the reason why I was so interested in SafeToNet in the first place. I'm a big believer that the old-fashioned 'Don't let them go on this platform, take away their device, they're only allowed one hour a day' is not the way to deal with it. I try to draw parallels to our adult existence, where it's really quite hypocritical for me to sit on my device for 10 or 12 hours a day and expect my children to only spend an hour on theirs.
So, our tools aim to keep children online, because being online is, and will remain, part of everyone's life for a long time to come, but to guide them through the complexities and how difficult that journey really is. So we offer wellbeing advice, and we offer links out to different kinds of support networks. If a child is feeling suicidal, for example, we don't want to worry them; we want to help them understand, find ways online to educate themselves, and try to make things better for them.
Christa: So, I mean, it sounds like you’ve made some pretty great strides in this area. What’s still to be done?
Tom: Well, loads, really. I spend most of my days with plenty of work to do; there's more than enough to be done. I think the big strides are in educating the public about just how bad the problem is, but without scaring them at the same time. It's a really fine balance. If you scare people too much, they switch off and go back to the head-in-the-sand kind of response.
So it's really important to try and educate parents, and educate the public in general. And my biggest thing, and what my role really is about, is a collaborative approach, one with so many moving parts. The only way to deal with it is for so many different agencies and industries to work together to try and develop proactive and innovative solutions.
Christa: I want to pick up on what you were saying about educating the public and educating parents without scaring them. I think there's really a 'yuck' factor in talking about child exploitation and abuse, and people just don't want to think about it. So what are some ways that you are helping people understand how serious it is relative to other important issues like privacy, what roles they might take in turning the tide, and really how we can all participate in that conversation?
Tom: Yep, you're spot on. It's such a fine dividing line. So you'll be aware that there's been a worldwide movement for the past couple of years to avoid the use of terms like 'child pornography' or 'kiddie porn'. 'Child pornography' in particular is still the term used in legislation in many countries, but as a global movement we prefer to use the term 'child sexual abuse material'.
But at the same time, by using that, we need to try and be clear with what we really mean. Because in my mind, we’re not looking to demonize in any way the young children who are having natural, growing up experiences, who might be sharing images with their boyfriends.
We don't want to make people feel that that is illegal material, although by definition it might be. At SafeToNet, child privacy is very much at the heart of what we do, so we're trying to help and educate the children. For example, we want to make them aware that sharing an innocent image with a friend who's the same age, or with their boyfriend, might have dire consequences, because if that image gets reshared, it then becomes part of that global cycle of material which paedophiles, for example, might want to access.
Christa: Yeah. And on that note, I'd like to look at the bigger picture of all the social, economic and political forces that are affecting these young lives at the micro level. Your work investigating these crimes certainly saw a lot of upheaval in the last eight years: the technological advancement we've discussed, compounded by austerity, efforts to standardize digital forensics in UK law enforcement, Brexit unrest, the pandemic; I mean, there's a long, long list. How has all of this impacted your own work, not just your current role, but also your work in law enforcement and government, for better or worse?
Tom: So, on law enforcement and government, my personal view is a positive one. I think it focused a lot of our minds, because we had to accept, probably around, I don't know, 2012 or 2013, that we weren't going to get a lot of financial input into developing new solutions. So we had to work more with what we already had.
And as you know from my past, a lot of my law enforcement work to catch online predators was based around using data that already existed, around partnership work, working with telecoms companies and service providers, and trying to help law enforcement understand that they already had a lot of the data and the tools that they needed. But we had to think of a savvier way of using that material.
Now, that was against the backdrop of, like you say, the standardization of digital forensics coming in, which by definition was going to be a costly process because it required a lot of upgrading within law enforcement agencies, and the other backdrop of data becoming something we often couldn't physically see any more. It was being stored in the cloud; it was becoming harder to access and easier to dispose of.
So it was a bit of a perfect storm of so much changing in a short period of time. But I think, and I’m probably a bit biased because I was one of them and I still value them, I think law enforcement worldwide did an incredible job of making good in such a bad situation.
Christa: To your point about child sexual abuse and exploitation moving further online, with images stored in the cloud and many transactions happening on the Dark Web, do you see more overlap between investigators and digital forensics? Are the boundaries shifting or do you see them staying about the same?
Tom: Yeah. So I always think this is an interesting one, because I've got plenty of friends who work in pure digital forensics, and their view has always been that investigators should keep away from their digital forensics world: 'We do this, you investigate.' But the reality I noticed, probably in the last four or five years, is the sheer volume of material being seized. A typical peer-to-peer file sharing investigation in the UK, say, could quite easily see the investigating officers seizing anywhere in the region of 50 to 100 devices capable of storing material.
Now, it doesn't take a genius to realize that it's absolutely impossible for a digital forensics unit to fully analyze all of that. So I saw a need for triage capabilities to really take off. And as I think you may be aware, I then did a couple of years of work with Magnet Forensics, where I provided law enforcement input on developing their Outrider triage [solution].
So those kinds of things make it inevitable that digital forensics is going to have to move, and has already moved slightly, into the investigative side. And that, I hope, means that digital forensics units can then concentrate on the devices that are really important rather than having to wade through everything. You used to see devices get seized that were covered in dust, that didn't look like they'd had a battery in them for years, that might not even have had a hard drive in them, but they were still being seized because people were so unsure.
Christa: Yeah, yeah. Well, and that's the tricky part, I think, with child predators in general: they do collect so much data, so many images going back so many years in some cases.
Tom: Yeah, absolutely. And for some of these people, their collection is so important to them. We were only speaking the other day internally with an art company, where I was trying to explain to people how some of these characters operate. And I was telling them it wasn't that unusual to find someone who had in excess of a million images or videos of child abuse. Even with the most automated processes, that is a hell of a lot of work to make sure it doesn't contain any first-generation images that they've created themselves.
So that relationship between the investigation and a digital forensics triage tool is so important, and I can only see it getting more and more important. You've also got the wellbeing aspect. We've got officers in law enforcement whose job is pretty much to wade through, day after day, images and videos of children being abused. That's not good for wellbeing. So tools that automate that process are fantastic in my book.
Christa: Yeah. And also on that note, whether it's first-generation production and distribution, or whether it's kids, as you mentioned, sharing their own images: 2021 started with a precipitous drop in reports between the time the European Union's ePrivacy directive went into effect and its later derogation from that directive so that service providers could find and report the images on their platforms.
I think this is mirrored in Apple's decision to retract its on-device CSAM detection, along with enabling iCloud Private Relay for Safari. You addressed both of the latter in your open letter to Apple calling for them to partner with organizations like SafeToNet. But then, looking at Facebook's 20 million cyber tips last year, contrasted with its own encryption plans and recent whistleblower revelations, we could infer that for-profit companies have a different view of what partnering really means. So, in an ideal world, what does true partnership look like to you, and how can our listeners add their voices and their overall efforts to yours?
Tom: Yeah. So, I mean, you've read the two articles I wrote relating to Apple. Now, let's be really clear: anything Apple does that can help fight child sexual abuse is fantastic and needs to happen. Where I felt really disappointed, and where I felt it lacked a little bit of knowledge of the overall subject, was some of the terminology used. Some of their senior people were interviewed in the aftermath of the announcements and were using terms like 'child porn', and there were internal memos released in which someone senior within Apple said, "We're the greatest platform for child porn." Well, given that in 2020 they only made 265 reports to NCMEC, that suggests something is wrong: they know internally that their platforms are used for child sexual abuse material, but because of the privacy that's in place in organizations such as Apple, it's hidden from plain sight.
What really worries me is that if Facebook moves to their full end-to-end encryption solutions as well, those 20 million reports they made in 2020 very quickly become the 265 that Apple reported. And we don't want to see a decline in reporting to NCMEC because the material is simply hidden but still there; we only want to see a decline in reporting to NCMEC if we have tackled the problem worldwide and there is actually less in circulation.
Christa: Right. And, you know, going back to your point about terminology, it seems to me that executives continuing to refer to the problem as 'child pornography' not only minimizes the horrific aspect of it, but also makes it almost easier to make privacy a counterweight, right? Because it doesn't get to the true nature of these crimes.
Tom: Yeah, I completely agree. And I think the announcements were confused, and Apple admitted they were. By being confused, they played into the fears of privacy advocates.
So, for example, the announcement that a threshold of 30 CSAM files would be required before a report was made to NCMEC and law enforcement, to me, kind of implied that there was a chance the wrong people were going to be identified and subjected to law enforcement action.
And most people who have anything to do with law enforcement will know that there are actually a lot of failsafes in place between the point that a platform, let's take Apple, for example, identifies CSAM and the point that law enforcement even consider taking any enforcement action. The material will be checked by moderators at the platform and at NCMEC, law enforcement will check it when they receive the cyber tip, and it'll probably be checked again before somebody goes and obtains a search warrant for someone's address.
So I was worried that what the public took away from the Apple announcement, and the hopefully temporary U-turn, is that there's a risk they're going to be subject to action because they may have taken an innocent picture of their child in the bathtub. That's really, really not what it's about. Child sexual abuse material is about the clear-cut rape, abuse and torture of pre-pubescent children. So I'd love some way of educating the general public, without scaring them, that they are not going to be subject to action because they've taken an innocent picture of their child in a bikini or in the bathtub. That's not going to happen.
Christa: Well, Tom, thank you so much again for joining us on the Forensic Focus Podcast.
Tom: No, you’re absolutely welcome. It’s really nice to be with you.
Christa: Thanks also to our listeners. You’ll be able to find this recording and transcription along with more articles, information and forums at www.forensicfocus.com. If there are topics you’d like us to cover, or you’d like to suggest someone for us to interview, please let us know.