Cellebrite’s 2025 DFIR Industry Survey – Key Insights

The following transcript was generated by AI and may contain inaccuracies.

Si: Hello everyone and welcome to the Forensic Focus podcast. We have back with us Heather Mahalik. Sorry, Heather Barnhart.

Heather: It’s okay. It happens.

Si: Who we have spoken to relatively recently and on previous occasions, and for the first time ever we have Paul Lorenz with us as well, both from Cellebrite. We’re going to talk about whatever comes into our tiny little minds today, or at least my tiny little mind. These are big brain thinkers over at Cellebrite.

Si: But we’re gonna start off Paul, as you haven’t been on before, and this is your very first time. Do you wanna do a brief intro of yourself and your background and how you got into this crazy, fun, joyous world we call forensics?


Paul: Fantastic. Thank you for having me on. It’s a privilege and honor. I know you have a big reach, so it’s fun to speak to people as well. So where do I start? I’m based in Ottawa, Canada. I started in law enforcement, spending close to 15 years across two different agencies.

Paul: I finally left from the Ottawa Police Service. Started on patrol like most people, went through various parts – patrol, traffic, some plain clothes stuff, high-risk offender management, child exploitation investigations. Then I took on a temporary assignment in tech. I had an interest in tech – broke computers when I was younger – and they’re like, “Hey, you’ll fit right in.”

Paul: Sure enough, I came into the section and took an interest in mobile forensics, had some very big files. Worked with Cellebrite and pretty much all the tools that were out there. We were very fortunate – I think how Heather called it before, we were a “rich lab.” We had many different tool sets at our disposal.

Paul: The unfortunate part was primarily doing child exploitation cases, which of course has its toll. One day out of the blue, I met with one of the guys from Cellebrite and they asked if I wanted to come over to Cellebrite. That’s how my journey started. Had no real intentions of leaving policing or law enforcement – I absolutely loved it.

Paul: At Cellebrite, my journey has been a wild one as well. Started with pre-sales, ran the CAS Advanced Services Lab in Canada, moved over to product. Now I’m part of, or the head of, customer engagement. A lot of the forward-facing work – I have some very smart minds that work with me like Heather, Ronan, JP, and we’re expanding out.

Paul: I try to be that voice of the customer, try to be that thought leadership piece to bridge that gap of what customers need, what the industry needs, and bring that back to our products. Hopefully providing support, providing that digital face to talk to people on Discord. Essentially trying to be that person that I didn’t have when I was working forensics. It feels like you’re always alone, where there’s thousands of people across the world doing the same thing. In a nutshell, it’s a big community and we’re trying to help bridge some of those gaps.

Heather: That was such a positive spin. I like it.

Si: I think that’s a beautiful segue into what we were talking about very briefly before we started, which is you’ve just put out the industry trend survey. Obviously that’s a huge part of customer engagement – going out and finding out what people are actually doing and what they want. As a business, it’s important to know whether you’re delivering on that or not. It determines whether they stay with you or go to Magnet. So what of interest came out of that? What jumped out as the key salient points worth bringing up?

Heather: It was all over the place. Everything from how you identify – what’s your persona? Are you an investigator, an analyst, head of a lab, a prosecutor, examiner? Issues? The biggest issues we found were obviously locked devices, locked iPhones. That was the biggest. Encrypted applications. We asked about cloud versus USB storage – the old days (I say old, people still do this) of “here’s my data, Si, have it, good luck, sign this chain of custody.” Even AI and what people think of AI – is it useful? How could it be used in digital forensics? What did I miss, Paul?

Paul: It was a surprisingly large set of views – the survey hit over 2,000 people. A lot of people responded, which is great because it gives a better baseline than just 20 or 30 or 100 people. Backlogs – all the stuff that we talk about. The backlogs, the trends of how you deal with all this data.

Paul: The fact that tools are getting easier to get more data out of devices, but what do you do with it? How do you effectively review it? Printing to PDF – a 400,000 page document… what do you do? It’s scary, but it’s a reality for some. And then how do you effectively use it?

Paul: I still remember days from my old job having multiple Excel documents on my screen and thinking, “Okay, which one matches? Do we have a link between these?” It’s mind-blowing. There are still agencies doing this. I think this is where the downside of having full access and full file system extraction is that you’re overloaded with work.

Heather: And it’s nice when we do these surveys – Paul and I had 110 slides of results, trimmed them down to 23, and then talked about it. But when I brought up Skopje, Macedonia, there are things that just spur your memory. When questions were coming in, I was like, “I wanted that experience to be blocked out of my mind forever.”

Heather: But it’s old casework and old things that come up that you see people still doing. When I was in Skopje, I was, oh my gosh, 27 years old, 28 years old. It was two years ago, Si.

Si: Yeah.

Heather: It was a long time ago, but it’s crazy how you see people asking questions and saying things. I’m like, “Why don’t people share more of their experience and how to move beyond it?” I think some people still just struggle to catch up.

Si: It’s an interesting one actually, and it’s a conversation that we’ve been having in the UK for a little while about provision of mental health care. We are not very good as an industry – we have our little silos, so there’s prosecution or there’s defense, and perhaps one law enforcement agency versus another. They’re not necessarily talking and sharing.

Si: And we need to get better because at the end of the day, we’re all actually trying to do the same thing. But everybody seems to be getting better. Do you think it’s getting better?

Paul: I say the growth of the online communities has started to bridge some of that stuff. Even five, six years ago – I’ll use the Discord channel as an example because I live in there daily – people can easily share stuff. So those avenues of sharing information, sharing details and talks and all different challenges they have, is easier.

Paul: But I think at the same time, the complexity of these investigations isn’t going away. Whether it’s encrypted applications, encrypted devices, or hardened devices, all this stuff is just as hard, if not harder now, because people expect this. Working together is super important.

Paul: Also factoring in that legal justifications and legal challenges are completely different from one side to the other. What I experienced in Canada – we have these very strict timelines for prosecuting a case – doesn’t apply to the States.

Heather: I’m still waiting. I’ve been working a case for years.

Si: I’m gonna say it is fascinating because I don’t think there’s a statute of limitations in the UK. I know you guys have them. I’m prosecuting cases that are 10 years old – just because they finally come round. I don’t know if they ever run out. But I understand that there are statutes of limitations in the US and Canada definitely.

Paul: But that even puts pressure on the investigations. I’ll give you an example – the Jordan decision. Essentially it means that, depending on the complexity of the investigation, you have from the time you put handcuffs on someone or charge someone to concluding the case – 18 or 30 months.

Si: Geez.

Paul: So if you think about it, that’s a lot to put onto an investigator. I’m taking a step outside of the forensics piece – just everything else. You add a lot of pressure to it, and then you factor in all the other challenges on top of the forensics piece. You could be trying to get into a device, and then if you get the device, how do you examine this and how do you share it?

Paul: So there’s a lot of challenges, and I’ll say that being mindful to the different ones across the world, I think in the end there are very similar challenges. It comes down to people not having enough time. Your backlogs are getting bigger, you’re getting more cases, you’re getting more data.

Paul: One of the points, Heather – it was that reliance on DNA was going down, but reliance on digital evidence was going up. In the end, it makes sense too. That’s one of the survey results – we’re seeing more digital evidence going in than DNA. Everyone’s got a phone.

Heather: I actually remember one stat – I can’t believe it. Nine out of 10 who took the survey, so of these 2100 people, nine out of 10 said that digital evidence helped close a case in court through prosecution.

Paul: Prior to going to trial, I think. And then bringing things to court quicker. One thing that came through in the survey results was getting cases resolved prior to even going to trial. If you get that evidence ahead of time, it’s gonna save time for the investigators, save the victim from having to testify – all these fundamental pieces.

Paul: If we can help chip away at even the court backlog – I’m not sure how it is in the UK, but post-COVID, it’s been a total mess here.

Si: Yeah, it’s horrific here. And the idea of having to complete a case in 36 months, I think, would have most prosecutors here sobbing in a corner because it’s just not gonna be feasible. I literally got a case the other day and I was told that not only is it dated back to whenever it was, but they’re not gonna pay the current rates. They’re gonna pay the old rates because it was contracted back then, not now. I’m like, come on, that’s not fair.

Heather: That’s wild.

Paul: There’s that one stat – I think, Heather, you’re the one that brought it up yesterday – the 7% of people or 7% of devices don’t even get examined.

Heather: Yeah, just ignore it because they’re locked. 7%. So imagine if you’re accused of a crime, Si, and I’m like, “I can’t get into it, so I’m not gonna look.” But that’s what would prove your innocence. Isn’t that wild?

Si: Yeah, that’s… We’re very good as human beings in rounding things up and rounding things down, but that’s one in 10. 7% is nearly 10% – practically one in 10 devices. That’s not a risk you’d really wanna be taking.

Si: I remember somebody told me about a case years ago where they raided a guy who repaired laptops and computers for a living. He had stacks and stacks of machines, and they didn’t know how to process it, so they made a random selection from them and processed a couple.

Heather: They need to look right in the registry and see which ones belong to him.

Si: That didn’t matter. They pulled it, they processed it, and they found CSAM on there. But it wasn’t his – it was a random lucky guess on one of those devices from somebody else who had sent it in for repair.

Heather: Holy cow.

Si: If you had processed everything, then you would’ve gotten that without a shadow of a doubt – the guy was innocent. He hadn’t actually done anything, and he was acquitted, but the ongoing prosecution resulted in that. But that was like a random selection from a stack. That’s crazy because that was the only way it could be processed.

Heather: And the bravery of the person that put that computer in for repair – I guess they really needed that nastiness back in their life. Worth the risk. Crazy.

Si: Yeah, somebody clearly thought something was encrypted and it wasn’t. That’s my official term – the nastiness.

Heather: The nastiness. I think we should roll that out across the industry. It should be a sticker.

Paul: But I did a lot of cases from submissions from repair shops. You’re like, what are you guys doing? This is not…

Si: Yeah.

Paul: But it segues into the other part of all these extractions that we’re doing from devices. I think one of the stats was 25% hold no valuable data. So out of all these extractions that people are doing, that 25% is wasted time.

Heather: Wasted time then? Yeah.

Paul: Wasted time. So if you think about access, whether it’s brute force and then extract and then analyze – how many hours are you wasting because you didn’t triage or take a look at it ahead of time to say, “Hey, is this even worth looking at?”

Heather: Look at the Boston Marathon. The guy that was caught – what was his name? Like Tsarnaev? The younger brother that was caught on the boat. When he was caught, he had a backpack full of random phones from a pawn shop. The amount of confusion that caused – I remember people saying, “Help us make sense of this. What is this stuff? What evil…?” I was gonna say a cuss word. I won’t say it, but seriously, what in the heck is wrong with this?

Si: This isn’t a PG rated podcast. I was gonna say evil fucky.

Heather: Who goes to a pawn shop and thinks, “I’m gonna get a whole bunch of phones and put them in this backpack just in case I get caught by the feds and it’s gonna mess with everyone”? Because it made no sense. The data was completely nonsensical. It was just like 20 random phones in a backpack. Why?

Si: Yeah, that’s… Unless they’re planning to make little bombs out of those.

Heather: That’s what I was wondering too – whether they were going to use those as a long-term strategy for detonators.

Si: Yeah, it could be. And you definitely want somebody else’s old phone, not one you’d bought brand new.

Heather: And these weren’t nice phones. They reminded me a lot of what I was seeing in Afghanistan back then in the work I was doing. So I’m like, this is “Hurt Locker” style – gonna get a call on this Nokia and everything’s gonna blow up. But that’s also where my nasty mind was back then.

Si: This is the thing – we’re all a little bit devious, aren’t we? Let’s be honest. A lot of us came into this industry from backgrounds like security. I was very used to trying to break into things rather than into people. So I love getting my Computer Misuse Acts and the hacking cases because it’s like, “Oh, I wouldn’t have done that, but yeah, that’s not bad. That’s cool. I quite rate that.”

Si: But after a little while, you’re like, “Oh, that’s a really cool idea.” And occasionally I think I’m in the wrong industry. There’ve been a couple of cases – especially the drugs cases, it’s terrible to say – where you’re watching these guys. They got caught, so they’re not good at it, but they’re there sitting on top of their Ferrari with the bags of money and the bags of cocaine, and you’re like, “Yeah, I’m driving my old Ford Focus and this is not really getting me very far.”

Heather: This is true.

Si: I guess I sleep at night, so you know, this, that, and the other. Oh dear. So anyway, your global survey – you said 2,100 people and this is across the world?

Heather: 95 countries responded.

Si: 95! Wow. That actually must be pretty much every country that you’re legally allowed to sell it in, I would’ve thought.

Heather: That’s a good point.

Si: Yeah, export restrictions. When you get higher than that, you have to start worrying about whether export restrictions are working or not.

Paul: That part has a lot of oversight at this point – where it goes. We also ran a poll during the webinar yesterday and asked about AI – what’s your thought on AI? You have two different camps of people. What do you consider AI? Which component is it? And what’s your view on AI in digital investigations? 26% of everyone who attended yesterday’s poll live said, “Full steam ahead, let’s go.”

Heather: Where are you, Paul? Where do you fall into this camp?

Paul: I’m cautious but open to it. I like the idea of augmenting some of the stuff that we’re doing – the mundane tasks. But you’re not gonna summarize – there’s no “find my evidence” button. There’s no “Hey, where’s all the critical pieces?” because there’s still an investigative mind, the intuitive mind. You still need a pair of human eyes to oversee this.

Paul: Hopefully it brings you to something quicker – fantastic. But you can’t, what I’m worried about is someone presses the button, goes “poof” off to court, and you’re like – that’s a poor forensic examiner going, “WTF am I doing with this? What does this mean? There’s been no validation.”

Paul: So there’s a lot of room to help improve the stuff that we are doing, whether it’s automating some of the tasks or surfacing stuff like, “Hey, these are pictures that are AI generated that have been modified” – something that you can look at that makes sense. But there’s still, no matter what, a need for a pair of human eyes to go over something, validate it, verify it before you’re saying, “Hey, this is what this is,” and not just totally trusting the tool for it.

Heather: Yeah, I also worry outside of forensics – it’s just gonna make people stupid. Everyone relies so much on it to write anything, to do anything. I used it a little bit yesterday to help me write an email because I thought I was gonna be nasty, and I needed it to make me sound nicer.

Si: Did it work?

Heather: Yes, I think so. I haven’t gotten a response yet.

Si: I find mine makes me sound horribly sarcastic if I try and do that sort of thing. Maybe it’s not one for me. It is very interesting – I went to a talk probably about three or four weeks ago now. It was an American crash investigator, an accident investigator who had been testing AI to see if it could solve traffic collision problems.

Si: It was very simple stuff like, “I’m traveling at 45 miles an hour along the road. The coefficient of friction is this. How long will it take me to stop?” When he started doing this investigation, it came out with some really plausible sounding things that were completely wrong.

Si: What it had done is gone out to the internet, scraped it, and gone probabilistically – “These are the sort of things that sound like good answers to this kind of question.” It assembled the bits together to give an output that sounded good but had no relevance to the actual figures used.

Si: Then he pointed out that there’s now a calculate button or a think button in a lot of AI. When you pressed that, it actually started to take a lot longer to come up with a response – two, three minutes at a time. But it then actually went and did the math.
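The stopping question above is just basic kinematics, which is why the “think” mode could get it right once it actually did the math. A minimal sketch of the calculation (the friction coefficient of 0.7 is an illustrative assumption; the exact figure from the talk isn’t in the transcript):

```python
# Stopping time and distance for the worked example: 45 mph on a road with
# an assumed coefficient of friction of 0.7 (illustrative value only).
G = 9.81             # gravitational acceleration, m/s^2
MPH_TO_MS = 0.44704  # miles per hour -> metres per second

def stopping(v_mph: float, mu: float) -> tuple[float, float]:
    """Return (time_s, distance_m) to brake to a stop from v_mph."""
    v = v_mph * MPH_TO_MS   # initial speed in m/s
    a = mu * G              # deceleration available from friction
    t = v / a               # v = a*t      ->  t = v/a
    d = v * v / (2 * a)     # v^2 = 2*a*d  ->  d = v^2/(2a)
    return t, d

t, d = stopping(45, 0.7)
print(f"{t:.1f} s, {d:.1f} m")  # roughly 2.9 s and about 29.5 m
```

A deterministic five-line formula like this is exactly what a purely probabilistic text model fabricates plausibly, and exactly what a tool-calling “calculate” step gets right.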

Si: I was playing with this the other day – it was a very silly example, but it came up in a family conversation where there was a sequence of numbers, and the challenge was to write a poem with words of that length. So it was like 1, 3, 5, 2, 7, 9, 1, whatever. I couldn’t do it – there was no way on earth I could do this with 20-odd numbers.

Si: Then I plugged it into ChatGPT and said, “Go ahead, do this.” It went away and came back with a poem. It promptly counted the word lengths and said, “Right, now here’s a five letter word: ‘the’.” And I was like, “You haven’t quite grasped this, have you?”

Si: But when I hit the calculate button, it went away and took five minutes to come up with something. I’m not gonna call it good poetry because, first of all, I wouldn’t know good poetry if it came up and bit me on the bum. It wasn’t great, but it did get the word lengths right, it did make sense, and it did the job. You’re like, “Okay, maybe there’s a bit more to this than I originally gave it credit for.” But you have to have the calculate button ticked, otherwise it doesn’t work.

Heather: Which app were you using?

Si: That was ChatGPT.

Heather: I’m gonna see if I have the calculate button. I’m gonna look at it right now.

Paul: I think the example that you used yesterday resonates, Heather. If someone tried to go through six years of chats between me and you, that’s a lot of work. And I think that’s where that kind of stuff can help give you an insight to say, “Hey, they’ve known each other for this long. This is the stuff they usually send each other – ridiculous videos or TikToks or reels.” But how also are you gonna quickly go through six years of conversations?

Paul: The reality is you’ll still have to scroll through some of that stuff, but if you can get a snapshot and then verify afterwards… because how else do you go through thousands and thousands of messages?

Si: Yeah, I think AI is a fantastic investigation tool. It’s not a forensic tool – it’s an investigation tool, and we need to make that distinction very clear to people. But actually, the bit that scares me is the false negatives, like those laptops. If the AI comes back and says, “We’ve gone through it and we’ve got a probability thinking that it’s 90% likely there’s nothing of interest in here” – we’re back to that one in 10 where actually there is something of interest, but the AI hasn’t picked it up.

Si: Probability and AI are a nightmare anyway because it’s hard. It’s also things like – we have the concept of nuance, and if we are talking to two people, especially if you’ve known each other a long time, you have your own language. You have your own internal references that even the best AI is never gonna get – those in-jokes because your husband doesn’t get those in-jokes. So there’s no hope for AI.

Heather: They may think we’re psychopaths. AI’s like, “Do not talk to these people. They’re not good.”

Si: Yeah. AI recommendation: just don’t touch this.

Heather: Stay far away. Save yourself.

Paul: But it’s wild to think that if your one in 10 is wrong – that’s the scary part. I remember even doing the investigations years ago, you’re more worried about the person that’s wrongfully convicted than the other hundred.

Heather: But speaking of people wrongfully convicted – lately I feel like a lot of the cases coming my way, I’m told a narrative and the people asking for help expect me to say, “Yes, that is exactly what happened.” They’re not happy when I don’t confirm their theory. All I can think about on this one specific case is if they didn’t ask me to look at it, there’s a good chance a jury would have found this woman guilty of killing her kids. But the data doesn’t support that conclusion – they need something else. They can’t say, based upon what the digital evidence shows, that’s the truth.

Si: I do a lot of work similarly where assertions are being made that aren’t necessarily supported by the evidence that’s there. I think it’s very interesting because I’ve done a lot of video analysis courses recently with Lever. One of the things they’re very clear on is about cognitive bias and bias in general.

Si: When examining a video, they try to ask you to avoid being told what it is that you are looking for or the conclusions that others have made so that your examination is what you find, not what you are predisposed to find because somebody has told you in advance.

Si: I think that’s a very interesting concept – for case review, perhaps we should just be given the images without any context and asked, “Okay, here you go. Tell me what you find and what’s of interest to you.” At that point, maybe we’re into a way of getting a fairer representation of it.

Heather: But I think you need at least a little clue on what you’re looking at. Otherwise, the hunt’s gonna take forever.

Si: This is it. It’s a difficult one, isn’t it? Because where do you allow that cognitive bias to go? How far do you allow it? It’s easier when you are doing the case review because at that point you’ve got a great deal of evidence that’s been put together into this storyline that you’re looking at, and you’re looking for the holes in it technically.

Heather: But even something simple – instead of saying, “Hey, this person did this, wrote this note, that proves her guilt,” say, “Hey, can you look at this OneNote and make sense of how that data was created?” Just something simple like that.

Si: Yeah, I have a lovely exercise I do with my students when I’m teaching. I create an image and give them a disk image. They go off and process it and come back. The amount that come back and say, “This is what happened” – and I’m like, “No, for two reasons. One, this is a fictional thing, and two, that wasn’t the way that I created that image, so I know that’s not what happened because I did it a different way.”

Si: So there’s always gonna be more than one way. They need to learn at a fairly early stage that just because what they have seen could be that, phrasing it as a certainty is a very bad idea.

Paul: It’s similar to the capture the flags that we run. We put out these images without much context and a brief summary of what happened. People take completely different paths to get to that result, which is wild because it shows their investigative, inquisitive mind. Seeing people’s writeups afterwards when they submit how they got to the answers – it’s fantastic to see how they approached it and the direction they went.

Paul: But to your point about bias – I remember most of the cases we worked were presented as, “Here’s the crime, here’s the suspect, this is what happened. Find the evidence.” You have to take a step back. I remember one case I had with a USB drive – something ridiculous like, “Here’s a USB drive found at a suspect’s house.”

Paul: Sure enough, the guy had his resume on there along with CSAM and all that other stuff. But in the end, if I can’t link that to someone else, there are so many potential holes. You have to be able to step back. I think this is where users and examiners always focus on very definitive things, but sometimes you just don’t know, and it’s okay to say, “I don’t know.”

Paul: We always try to please and answer questions, but in reality, it’s okay to say, “I don’t know how that timestamp gets created. I tried to test it. I can’t figure this out.” Talking about making assumptions – there are several active cases where, based on a timestamp’s interpretation, it could lead to a completely different narrative. It’s okay to not know – I think that’s something we should help junior people understand.

Si: To say you don’t know is far better than claiming to know something you don’t.

Heather: But I’m a woman, and I know everything.

Si: Yeah, we’re not gonna argue that. That’s fine.

Heather: I’m the queen of saying, “I don’t know.”

Si: It’s essential. And it’s also when you come up against another examiner in an adversarial court scenario, and they are so certain about their correctness – those are the ones I really enjoy taking on. Sounds terrible, but…

Paul: At the same time, I remember seeing a quote over the past couple days – “You don’t learn from successes, you learn from failures.” It’s the time when you get raked over the coals in court that you realize what you should have done differently.

Heather: Was that Winston Churchill? Where did we see that? I saw that too. TikTok?

Paul: How about the cloud piece? This came up as part of the survey – transitioning and being open to the cloud, which is actually surprising. Some of the numbers we saw, I think there was like a 10-point increase from last year to this year of being open to the cloud.

Paul: That’s not from a user perspective – that’s from agency management and heads of labs. It’s surprising how much people are opening up to the idea of it. I think all of us still cringe a little bit, but are cautious about it because all your case data being stored in the cloud is scary. Then add CSAM stuff into that as well.

Si: A friend of mine very early on nailed it for me. He said every time you see “cloud” in a sentence, replace that with “somebody else’s computer” and then see how you feel. Because that’s what you’re doing – “I’m going to store all my case data in the cloud” becomes “I’m going to store all my case data on somebody else’s computer.” It sounds a lot scarier.

Si: I’ve done cloud work in the security space for things like patient data and the NHS here – the National Health Service. It’s all about how to make sure that data is safe and only accessible to the right people, and it’s all to do with encryption. We have moved to the point where the technology is generally speaking good enough to allow the safe storage of data.

Si: I’m still a little bit risk-averse. I prefer a private cloud or a local cloud to store it. The infrastructure scalability and virtualization and all of those good things still exist, but it’s just under a little tighter control.

Paul: Do the additional security mechanisms like FedRAMP and all that stuff in the states – does that sway your view a little bit? If the agencies and companies are agreeing and abiding by that and are certified, does that make it easier for you?

Si: It’s made it easier for me to do it professionally because you go, “Oh look, Microsoft is accredited to hold MHSA, so that’s great. Let’s use them.” From a personal perspective, I still have reservations. I’m not the person who’s ultimately signing off on the risk of these things, so it’s not for me to say or to accept that risk. I wouldn’t want to try and sleep at night if it were me.

Si: It’s very good. It’s very close. But we all know how reliable computers are and how vulnerabilities come up and get found. You just need one, and then somebody has accessed every CSAM case that you have uploaded there. At that point, you’re like, “No, I can’t condone it.” It’s a low risk, but it’s such a high impact. For me, that equation just can’t work.

Si: I’m not against it for lower impact stuff – I think it’s a fantastic idea. But to be honest, you probably won’t know that it’s higher or lower impact until you’ve examined it, and you won’t examine it until you’ve uploaded it. Then you’ll find out it’s got something dodgy on there that you want to pull out.

Si: Getting stuff off the internet or out of a cloud isn’t as straightforward and as clean as you might think it could be.

Heather: And attackers are really well versed in attacking the cloud. There are not enough protection mechanisms – barely any multifactor authentication set up to protect it – and logging?

Si: Logging – again, I’ve done logging and monitoring configurations for things, and people never do it well enough. It doesn’t matter because by the time you’re reviewing your logs, it’s too late anyway. You know who’s done it, but they’ve already got it.

Si: Given what we’re talking about, that’s what we want to stop. We don’t want to know who’s got it afterwards – we want to prevent them from getting it in the first place. I am a Luddite in that regard. Hands up – that’s me. I admit to it. I’m risk-averse, and for me, it’s just not there yet.

Paul: I think there are slow steps in that direction, but I totally agree that the potential impact is difficult. But it’s interesting to see the trend slowly going up – 10 points from last year of people being open to it.

Si: I think we’re gonna see the same with AI. We will see it increase, but there’s a familiarity that breeds contempt. You’ll have seen this as well just on the front line – before, you’d seize a phone, it would go into a bag, it would be handled carefully. They’d want to give it to a techie as soon as humanly possible because they didn’t know what they were doing.

Si: Now everybody’s got a smartphone. Everybody knows what they’re doing. Every phone that gets seized has been tampered with in some way by the guy who seized it because he’s decided to take a look for himself and scroll through it just to see if there’s any evidence on there.

Si: I think what we’ll see is that because people use the cloud constantly now – we’ve all got a Gmail account. I remember Gmail being invitation-only. Crikey. And I was an early adopter on that front. We all use the cloud. We all use OneDrive, we all use Apple iCloud. It just happens automatically.

Si: We’re all so familiar with it – it’s just, “Oh yeah, it is fine. It works the rest of the time, it’ll be no problem.” And then… I don’t know if you’re aware, but the UK government has decided to request that Apple turn off strong encryption in iCloud. Apple is currently fighting the lawsuit in the UK courts.

Si: But people are already there – their data is already there. They’re already using it. They’re not gonna stop using it because the government’s turned off strong encryption. They’re just gonna carry on because for most people it doesn’t matter.

Paul: Do you think it’s generational? Because if you think about it, some of the decision makers are the younger ones moving up through the ranks too, right? They’re more, as you said—pretty much everyone’s familiar with phones now. I remember when I started, we were just introducing laptops in cruisers. So it was a new thing, and then all of a sudden the guys on the road have access to everything: GPS, live search, everything.

I think that part too is that the people making some of these decisions are probably starting to grow up with technology more than the ones before. We call them dinosaurs, but it’s different people who were making some of the decisions previously.

Si: You’re right. I think there’s a little bit of a disconnect though, because I don’t think it’s the tech people who are getting promoted to those roles. I think it’s the operational people who are getting promoted to those senior level roles.

Si: Tech is… good. Hang on. I’m phrasing this really carefully so as not to offend anybody—something I’ve been doing all my life, offending people by opening my mouth and saying what I think instead of what I should be saying.

Heather: Story of my life.

Si: I think that the best technical people want to stay technical people. He says, modestly putting himself into this bucket. God, terrible. One of the reasons I stopped doing security work was because I ended up writing reports and risk assessments. I ended up doing lots of paperwork. I was deemed such a high risk that nobody would ever actually let me touch a computer on a security job, because they were afraid there would be a backdoor in it.

You get to that point and then you’re like, “Actually, I really miss it. I got into computing because I wanted to play with computers. I didn’t get into this because I wanted to write tons of paperwork.” You made that change. I think techies, the techy mindset is like that for the good techies, and therefore they’ll stay in the lab. They’ll switch careers or go somewhere else where they can play.

I don’t think they necessarily stick it out on that promotional career ladder. I think you’re quite a good example yourself, Paul. You didn’t stay in the police to get to a position of authority. You left to go and do something interesting. I think that makes my point quite well.

Paul: I think with people, it’s just the change in technology and how it’s evolved as well. I think that’s where it’s everywhere now too, right? You either—

Heather: —grow with it or you die.

Paul: Like with AI, right? I think that’s where we’re going with it.

Si: But consumer devices have been made so easy. Fundamentally, you buy your iPhone, it connects to your laptop, you sync your calendars, it all happens. The last time any Apple user opened a configuration file was decades ago. They don’t even know what a plist is, let alone have the capability of editing one.

Paul: But forensic software’s the same, right? I think if you take a step back, we’ve created this industry of making things much easier as we go as well. To help move cases along quicker, make finding some of those deep diving artifacts easier, to surface them. We’ve created this “easy examination” button as well.

Si: Yeah. I think this is one of the reasons that training is so important, and I know Cellebrite does it and I teach it at a university level. But training is the thing that says, “Okay, first of all, we’ve got to drum into people that if they are presenting, they have to know what it is that they’re presenting, where it came from, and how they got it.” Just sticking this thing in front of a jury and going, “Ha, I found this” is not actually valid forensics.

Heather: At least you hold people true to that.

Si: Yeah. I’ve done Cellebrite training courses and I’ve seen a little bit of how the sausage is made in terms of the results coming out of it. I think that’s very important. But the trouble is, because of the overload, because of the sheer volume, the training requirement is going down for first responders. They are essentially at the point of plugging this into a kiosk, getting the data off it, extracting something of evidential value, and then going to arrest someone.

I have no problem with that, but it needs to be escalated to a forensic analyst afterwards to verify, to make sure, and to understand. I think that’s the stage we are currently missing. But again, this is not a fault with software or training. This is a fault of staffing, manpower, and money. On the police side, it only comes from governments. And on the defense side, unless you have a particularly rich defendant, it comes from governments as well, who are even less willing to pay to oppose their own case.

Paul: Speaking of training, there’s a point in our survey about online versus in-person. I’d love to hear your take on this. I’m just trying to find the exact number. Was that six out of ten?

Heather: Six or seven out of ten prefer in person over online. I’m terrible with online training. I cannot pay attention.

Paul: Squirrel brain.

Heather: Yep. What’s hard is now the US government is frozen from attending training in person. So they’re forced to do it online if they have a budget at all. And it’s expensive too. I know SANS is not cheap, but if you’re North America local law enforcement, you can get 50% off all training, and most people don’t know that.

Si: Yeah, put that out there. Everybody heard that here first. A significant number apparently will have heard that here first. I’ve got mixed feelings about it personally as an individual because I don’t like going places. No, that sounds terrible. I like teaching—

Heather: —virtually sometimes so I can be in my sweatpants and my socks.

Si: Yeah. Teaching virtually, although it’s more exhausting. It depends what you’re doing because if you’re teaching professionals and they’re paying to be there, you can do things like, “Okay, everybody, I want the cameras on.” If you’re teaching a class of 30-odd students and you’ve got to abide by various privacy rules, you can’t say, “Alright everyone, I want the cameras on.”

So what you end up doing is sitting in Teams and staring at a bunch of those little circles with letters in them, apart from the character students who changed theirs. One of my years was fantastic because they decided their theme would be frogs. None of the cameras were on, but every single icon was a different frog.

It was a frog fest. That’s so cool. Very good year. But you sit there and talk for an hour, giving a lecture, and there’s not a sound. You’ve got the headphones on, but nothing’s happening. They’ve all got their mics muted. Occasionally you’re like, “Could somebody stick a thumb up if you’re still there, please?” And a couple of hands go up.

Heather: Now when I teach, we do something called hybrid as well, so it’s in person and people virtual. They can unmute themselves. The first time someone unmuted themselves, I jumped out of my skin. I was like, “Who is that? Jesus! What is happening?” I remember looking around, “What is that?” I also had a guy recently who had his camera on all day. There’s a huge confidence monitor in front of me when I’m teaching, and I just see him staring at me. It was so weird, but awesome. Good for you, guy. You’re paying attention.

Paul: Yeah, it’s hard when you’re presenting too. You’re doing onboarding stuff, and you’re like, “Okay, am I talking to anyone? Does anyone even care?” At this point, I could be talking about my Ninja Creami recipes, and I’d get the same feedback.

Si: Yeah. One of my issues is that university regulations specified that lectures had to be recorded. Students aren’t going to turn up if the lectures are recorded. They might log in and then go away and do something else, or play it back at double speed. Actually, it’s funny because my students—as you can probably hear, I talk reasonably fast—told me they were so glad the lectures were recorded, because they could play me back at half speed and hear what I was saying.

But yeah, what was the statistic again? Six out of ten?

Heather: I thought it was seven out of ten.

Si: Seven out of ten. I’ve really enjoyed going to do on-site training. I’ve done—Lever is the most recent example. I did two in-person courses and two online courses. The online courses were excellent. But like you, I suffered from sitting at the computer to do it. The computer is what we use every day. It’s what we have our email on, our Discord, every other communications thing we use, plus all those academic papers we haven’t read yet, plus the news and the web browser.

I’m also the squirrel brain. I’m terrible. Somebody will say—and you’ve probably seen me do it—you said “Boston Marathon,” and I’m like, “Okay, I’m just going to double check what his name was.” So you go off and Google it.

Heather: Tsarnaev?

Si: I don’t know. I did discover that the Boston Marathon is the world’s oldest annual marathon though. So I think his name was—

Heather: Tsarnaev. Now I’m going to Google it.

Si: Rather nicely, all of the top hits now are actually about the marathon instead of the bombing. So when I just put it in, it wasn’t the—

Heather: Tamerlan or Dzhokhar?

Paul: The relationships you build from the in-person trainings too, I think that’s where it’s really cool. Because you build these bonds. I ran into someone from my first computer forensic course from years ago at a conference, and we were like, “Hey, it’s been how many years?” It’s wild.

Heather: Last week at C2C, this guy Rob Fried who runs a consulting firm in New York City—Rob taught me when I was 23 years old at NW3C. And now he’s sitting in the audience consuming what I’m putting out. It’s so interesting. He posted a picture on LinkedIn.

Si: He’ll be thrilled to have his name mentioned here.

Heather: He was a great instructor. I remember learning FAT32 and NTFS from Rob Fried and Berla. I told him, “I remember how you said it.” His analogy was that in FAT32, each file in the file system is like Thanksgiving leftovers. You can’t just let a container overfill and stuff everything in. You have to put what’s left into a smaller container that sits beside it and makes up the whole—I’m like, “I noted that. That was 2003, and I still remember that analogy.” Food works for me. Food and drink analogies stay forever in my mind.

Si: Brilliant. I’m going to have to start using food analogies in my teaching. Squirrel brain often. Hey, there’s nothing like—

Heather: —slack space and Thanksgiving dinner. It helps. See, this ended up being fun.

Si: I’m not very happy to draw a line under it there, because I’m really enjoying this, and I could carry on talking to you guys for hours. I hope you both come back and we speak again on the podcast, because it has been great fun. If nothing else, this time next year, to find out what the shifts are in the results. It will be fascinating to see if another 10% goes up on the cloud question.

Paul: Yep.

Si: And whether people are entirely fed up with AI by that point in time, or whether they still think it’s a good idea. Just hearing it on repeat is driving me around the bend. If nothing else, it’s the reality. We have to live with it. We have to accept it. We have to get on board or—

Heather: —or get left behind.

Si: Or get left behind. That’s it. It’s quite funny because, obviously, one of the prime examples of AI going horribly wrong is in the Terminator movies, and it’s Skynet. Skynet is the horrible, overarching thing. That’s actually the name, or at least it was the name, of the UK military satellite network. There’s a place where I used to work that had a Skynet Avenue because it was part of that. You’re like, “I’m not so sure I want to be here anymore.”

Oh my goodness. It doesn’t seem like such a good idea. Anyway, I digress massively. So I will say thank you very much, Heather. Thank you very much, Paul. It has been an absolute pleasure talking to you. As always, you can find the Forensic Focus podcast in all of the places that Desi remembers and I always forget, which is Apple Podcasts, Spotify, and the Forensic Focus website.

But the point is, if you’re listening to this already, you’ve already found it. So me saying this is irrelevant and totally superfluous. I will say again, thank you very much indeed. It’s been an absolute pleasure, and we would welcome you back on any time.

Paul: Thank you.

Heather: Thank you very much indeed.

Si: Thank you.
