Hexordia’s Jessica Hyde: Navigating The Future Of Digital Forensics

The following transcript was generated by AI and may contain inaccuracies.

Si: Welcome, everyone, to the Forensic Focus podcast. We are delighted to have with us today Jessica Hyde. Jessica has been on before, although I was going back through the archives wondering when that was. You talked to Christa, who brought Desi and me on several years ago now; we started working with her, and then she went on to other things. You spoke to her alone back before we even joined, but you did a recent interview with Forensic Focus. You've got an article and an interview up.

Si: To refresh the listeners’ minds and to bring everyone back, could you give us a little bit about your background now? I’m going to prompt this because I did my research for a change a little bit before we started this. You started off as an avionics technician in the US Marine Corps. This is not a typical start in life for anybody, let alone a start in life to then end up teaching at universities for digital forensics. So how exactly did this transpire?

Jessica: Sure. So I joined the Marine Corps, not typically where you find people who wind up in this field, but right after September 11th. I was a direct ship in October 2001. Kind of giving away my age here.

Si: Younger than me, so that’s fine. Let’s carry on. That’s not a problem.


Jessica: I will preface this by saying there are some other folks who came specifically out of avionics, and some other Marines, into digital forensics. You just mentioned Brett Shavers. Brett Shavers and I are both Marine veterans, as are Harlan Carvey and Alissa Torres. So there are quite a few Marines who have become forensic experts, and then there are some avionics personnel, albeit more from the Air Force side, namely a SANS instructor as well as Lesley Carhart. They also were in aviation, though I might be wrong about Lesley. I know Lesley was Air Force.

Jessica: So nonetheless, it’s common but uncommon, whichever way. But I would say that maybe those of us who were in the Marine Corps, we are the kind of people who make our footsteps known. We’ll leave it at that. We might be a little less afraid of being loud.

Si: Yeah, that’s fair enough.

Jessica: So when I got out of the Marine Corps, I took a job in a lab doing reverse engineering of improvised explosive devices. This was still very much in the timeframe in which the US was heavily involved in the wars in Afghanistan and Iraq, in the mid-2000s. I took a job reverse engineering IEDs, including ones that had been blown up: what were they connected to in terms of trigger and receiver mobile phones?

Jessica: So yes, this is pre-smartphone, but definitely phones that were post-blast. So I like to say that I got my start in forensics analyzing circuit boards of unknown origin and phones that were blown to pieces. It’s a little bit of a different start. I recognized I was really niche, went on and got a master’s in computer forensics to help myself get more well-rounded. Did my stint in the private sector, went and worked for one of the big four firms, EY. And then went back into doing forensics as a government contractor again and so forth and so on. And here we are.

Si: You say you went back to do a master’s in forensics, but that would mean fundamentally that you had an undergraduate degree. Was that in avionics? Was that in…?

Jessica: No, electronics engineering. Electronics engineering, which I did because I was in avionics. It made sense. The government said, what do you know how to do and what can you do? Apparently it is dealing with electronics. And I did that weird thing where I was working on my degree while on active duty. So I went to lots of different institutions and pieced together an undergraduate degree slowly but surely. And here we are now.

Si: I’m going to just ask, because my education background is varied as well. Do you actually think that has been an advantage to you, or do you think that it was just the same or do you think it was a disadvantage to have that?

Jessica: Oh, I think it’s an absolute advantage. I’ll start with the fact that I think our job in the digital forensics field is to solve changing problems that exist with new technology and how to find data. So we do a lot of troubleshooting. As an avionics specialist working on the Harrier, my job was to be able to troubleshoot technical issues, communicate with people who weren’t as knowledgeable about the technology.

Jessica: They were very knowledgeable about their area – pilots, right? They’re very knowledgeable about how to fly a plane. This is much akin to dealing with lawyers. They’re very knowledgeable about law, but not necessarily about the underpinnings of the technology. I was an expert on the technology that was helping them fly versus a technology that is providing digital evidence.

Jessica: So our job is to be able to troubleshoot under austere circumstances and figure out how to solve problems that haven’t been solved before, where the manuals don’t tell you how to do it. From that perspective, it’s the same mental skillset. From a technical perspective, I soldered, I worked with wires, I worked with multimeters. As someone who does a fair bit of hardware analysis in my forensics career, those skills were directly transferable.

Jessica: So the fact that I soldered wires and dealt with electricity and signals and computers and swapping discs – those things literally happen on jets, much the same that they happen in our forensics labs. So yes, I think that there are directly transferable skills, but I will also advocate for the fact that I think the best teams in digital forensics and the best ways for solving problems is by having people who have diverse backgrounds.

Jessica: I love having lateral movers out of different fields on the team because they bring different approaches to problems and oftentimes have soft skills, which are very necessary for talking with our stakeholders, be they generals, lawyers, etc. But also, I worked a fair bit in the beginning of my digital forensics career with the Department of Defense, and I still do.

So my time in the Marine Corps has literally allowed me to articulate things well in that manner too, having some understanding about military organizations. So yes, I think it has directly helped, and it’s a lot of the same mental process when you think of it from an abstracted layer.

Si: Yeah, I get that abstracted layer. What we try and say, I think, is that we follow a scientific methodology, but that scientific methodology of test, verify, and loop around when you get it wrong and understand is exactly the same.

Jessica: So I’ll tell you the process on debriefing a pilot, right? They come to you with a gripe and so you’re going to ask them questions about the problem that they have. That’s exactly the same as when we’re brought a question from an investigator or an attorney. Then the next thing you’re going to do is look at, acquire whatever data you can, and then create a testing scenario, create a hypothesis, conduct your testing, and then based on your testing, apply it back to the original problem.

Jessica: Swap a part. Replace something. Understand how something works. Write a script, as we may do in digital forensics. And then you're going to verify your results. And then you're going to write a report or write up your documentation. In all of these things, we have to do documentation as we go. Trust me, when you're working on a jet, you're filling out your logbook as you go. Just like we document as we go through our process.

Jessica: Put out a report at the end, be able to brief people high up, be able to give status updates to people high up on mission-critical things, be able to work in time-sensitive environments, and then the result goes and flies, or the result goes to court. The biggest difference is the testifying: as a digital forensics examiner you expect to do it, but if you're somebody who is working on aircraft, presuming everything is going right and your jet has no major mishaps, hopefully never.

Si: Never. Yeah. It’s interesting actually, because I think one of the skills that is most valuable is that note-taking, those contemporaneous notes. And actually it’s very hard to teach, or at least I found it very hard to teach. I found it very hard to learn, to be honest. I came from a systems administration background and I did a lot of things, and then I wrote documentation. I didn’t write documentation while I was doing things, and it has taken me a while to get around to being a little more efficient in the note-keeping department.

Jessica: Yeah, I’ve got to say, keeping logbooks in the military that are timestamped with who it is, what happened, being able to give a bottom line up front – that’s the big wording in the military, the BLUF. And then be able to substantiate every step you took, and it can be read by other people who have to pick up your problem. Yeah, that’s definitely a skill that I had acquired there. I think you’re right, it is. I think the best way for people to learn how to do that is to see samples though.

Jessica: I think showing them how you take notes and showing the quality of notes. And actually, peer review is another big thing. Having people peer review the notes of people who are learning how to take notes is really important. When I'm teaching my students at the university or students in a Hexordia class, we always talk about note-taking on the first day, because if you don't know how to take notes, how can you do all of the other things you're supposed to be doing?

Jessica: I always like to say the most important reason to take notes is in case you’re hit by the lotto, because I want to put a positive spin on it, but somebody else has to recreate your work. And how many of us have worked something that comes back years and years later? I want to make my own life easier. I like it when past Jessica does current day Jessica a big favor or future Jessica a big favor.

Si: Yes. I was going to say, I don’t get too many cases that loop back around a couple of years later, but I had one a little while ago that came up and it was like, “You reset the password on this,” and I was like, “What did you set it to?” And I was like, “Oh my God, that was four years ago.” That is not in my mind anymore. And you’re flicking back through your notes. It’s like, “Oh good, I did write it down.”

Si: That sort of mad panic as you realize that you've reset an account in order to get access to it, and then you can't necessarily recall four years later when you need to get back into it again. So yeah, I wholeheartedly appreciate that. You looped in an excellent segue for me to go to your students. So I've seen that, obviously, Hexordia is your company and organization, and you're a university lecturer. You have been at Champlain…

Jessica: I only taught at Champlain for a year. I’ve been teaching at George Mason since 2016.

Si: And George Mason as I worked my way up to this. It’s alright, don’t worry. How did you fall into that?

Jessica: I’m very lucky. I did my master’s at George Mason and so when Bob Osgood knew that I worked specifically in a mobile forensics lab and that I had spent my time focused on problems pertaining to mobile forensics, and he needed an instructor, he reached out to me and asked me if I was interested. And I was like, absolutely, this was a no-brainer for me.

Jessica: I had the luxury of doing some teaching in my roles in labs of first responders and creating content to help our practitioners within our lab skill up. So writing a class to teach mobile forensics was a no-brainer, and I love it. Do you know why I love it? Because I learn more from teaching because of the questions that get asked by folks who this is new to. They look at it differently, and as each new group of students who have had a different set of circumstances that have brought them to where they are, come to take this course, the situation is different.

Jessica: The technology they grew up on is different. The way in which they’ve been exposed to the environment is different. So their questions change and it helps me gain perspective and it also challenges me always to be aware of the newest operating system and the newest issues. Not only does my casework inform that, but having to be prepared for each lecture in the fall, it’s the most fun because the new operating system versions for Android and iOS typically hit in September or October. Usually about one to two weeks before I’m teaching.

Jessica: And of course, I want to know what’s happening and what’s changed on that newest operating system. So it gives me an extra little push even before my casework has hit it. But also, the questions that are asked and the “let’s figure it out, let’s find out” – I learn more from the questions people ask than I could probably even come up with on my own or from my independent coursework or from my colleagues.

Si: Yeah, I love my opportunities to teach and I’m going to say I’ve been very fortunate in having my own children educate me frequently on the way that phones actually work, as opposed to the way that I think they work.

Jessica: My children – I have two teenagers, or I guess one young adult and one teenager. I guess they’re technically both teens, but neither here nor there. They definitely are the people who I go to with slang terms that I don’t understand that are in data. “Hey, completely no context. Do you know what this means?” They school me.

Si: Yeah. I had an absolutely wonderful opportunity. During lockdown and COVID, we were giving evidence remotely to court. So I was sitting actually in this chair here giving evidence in a case. And a question came up and the judge was like, “Do you know what would happen under these circumstances?” And I was like, “I don’t, but actually, if you give me two seconds, I can test that because I’m here.”

Si: And it was me madly texting my daughter upstairs going, “What happens if this happens?” And she’s responding, “What I saw was this,” and I’m like, “Your Honor, what we saw was…” She’s gone on to a career in law, so I cannot…

Jessica: Oh, how appropriate!

Si: Yes. My influence has rubbed off in that regard. But yeah, it’s the way that people use technology that varies upon the way that they… The app does the same thing, whether I pick it up or you pick it up, or they pick it up, but the way they use it is so fundamentally different. And that’s fascinating to me, that sort of approach that they can bring.

Si: And I was also very fortunate in the degree that I was teaching previously. It was very keen to bring on all sorts of backgrounds of student. It wasn’t just maths, computer science, physics people. We had English, we had foreign languages, we had psychologists. We had all sorts of…

Jessica: Different perspectives. That goes back to what we were saying earlier about different perspectives in the lab makes for different solutions. I have folks who have transitioned from healthcare, from education, from so many different fields that you get to collaborate with. And when you have people who do that lateral movement, they definitely approach problems in different ways.

Jessica: The best teams are going to have somebody who comes from a law enforcement background, somebody who comes from an academic background, somebody who comes from a computer science background, somebody who comes from an electrical engineering or hardware background. And when you take those different backgrounds, they’re all going to have a different approach as to how to solve that same problem.

Jessica: Many times the problem would never get solved with just one of those people working in a silo. It’s when they all start communicating and then they’re able to feed off of each other to come to a new resolution that no one’s brain would’ve come to on its own. And that’s awesome. That’s how we solve technical problems.

Si: Yeah, the fantastic concept of interdisciplinary sharing is amazing. And we start to hear about it in all the really sexy things like biomimicry, where they make robots crawl up walls, pretending to be geckos and stuff. But it applies all round and that’s absolutely amazing.

Jessica: I’m going to have nightmares about your biomimicry example now. Thank you.

Si: I’m sorry. I’m sorry. We won’t get into the glow-in-the-dark jellyfish and all sorts of things like that as well.

Jessica: I’ve been to Australia. No, I’m joking. I have, the jellyfish are the scary thing.

Si: I was about to say, Desi would be able to answer this more accurately, but I’m pretty sure everything in Australia is trying to kill you. I have yet to experience it myself, but it seems that way from any guidebook I read: “And this is poisonous and this is the most poisonous.” And “Yeah, don’t touch these. And the spiders are this large.” Yeah. No. Thank you. I’ll stay here.

Si: Obviously you come from the electronics background. Are you still hands-on with chip-off forensics? I’m not going to say necessarily exploded phones, but you may still be getting exploded phones.

Jessica: I don’t usually get post-blast phones. But yes, and I would say because we’ve been doing a fair bit of IoT forensics, a fair bit of IoT research, and at the end of the day, those are the chips that are unencrypted still, or a lot of them are. So those are where a lot of those techniques are more relevant – ISP, JTAG, UART. They’re very relevant still on getting data off of physical hardware.

Jessica: You can still do a lot of chip transfer in the repair world. So for damaged devices, yes. And there are a couple of folks on my team as well who are strong in that area. So the answer is yes, but not as much as I used to when I was dealing with post-blast devices every day or even when I worked with a team where we had our own hardware exploitation lab.

Jessica: It still does happen, and we do quite a bit with IoT devices in both research as well as casework and instruction. So yes. But definitely seeing more digging into databases these days than into other data structures. Trying to understand Android binary XML, SQLite, LevelDB and dealing with those kinds of data constructs, I’d say, are more common in my daily world at this time.

Jessica: But we have an IoT class that we teach and we get very hands-on in there. Not so much where we’re teaching students chip-off and JTAG – there are fantastic courses out there that do that – but we’re teaching them how to deal with that data, and that’s still how we’re getting that data.

Si: Are you finding that there’s a particular sort of class of IoT device that you are seeing more frequently? Is it like the Nest home things or is it security cameras or…?

Jessica: I’ll give Google some credit that the Google devices are the ones that are encrypted. So the Nest and the Google Home, those are encrypted. It’s a mix. There’s a lot of smart watches that we see, a lot of health data devices that people are giving up a lot of their health data for, for their own personal information. But it’s coming up a lot in casework.

Jessica: Also smart speakers, just because they’re almost ubiquitous in people’s homes, but there are sensors everywhere. I think that there are a lot of door cameras or doorbell cameras, but a lot of that’s just being gotten from warrant returns more than taking the data off. We actually have a cool project for the DOJ where we were analyzing, does it make sense to seize the hardware or is most of it in the app or is most of it in the cloud?

Jessica: And trying to determine what devices it makes sense to get the hardware for. I’m not getting cases where the refrigerator is what we care about the data on, but refrigerators could have data. And if your refrigerator is synced to tracking geolocation of your kids and to calendars, that may be valuable. But I think part of the IoT landscape has to do with awareness and what is the best source of data.

Jessica: So if we can get the data from the mobile phone, and that is where the data is resident, why wouldn't we stick with that? Because it's in the sync app. But sometimes you may not have access. So I think that it ebbs and flows based on what is in people's environments. This is not a particular area of my expertise, and there are other people who spend way more time in this, but the most IoT thing that's giving people lots of data on lots of cases is vehicles.

Jessica: Vehicles are a giant moving IoT device. I know this because I particularly drive a low-tech vehicle, and I am not looking forward to having to replace it, because I like that I don't have a heads-up display and that my car does not have CarPlay or Android Auto or any of those features. And it will be harder and harder to find vehicles without them.

Si: My car is decidedly archaic – it doesn't even have Bluetooth, so it's a long way down the scale. But the thing that fundamentally scares me, and Tesla is the prime example of this, and I know they're having a really bad time at the moment for various assorted reasons, mostly of their own fault, or at least their CEO's own fault, is the idea that you can push updates over the air.

Jessica: I’ll stay away from discussing vehicle forensics at this juncture in time. I’ll give the clear disclosure that my husband works in the red team side of the house for a major motor vehicle manufacturer. So that’s probably the one area, and that’s the reason I say I try to stay out of that area just because I don’t ever want there to be any question about my knowledge or my information or where things come from.

Si: That’s more than reasonable. And we can have this struck from the record, if you’d like.

Jessica: No, it’s actually fine either way, but there is definitely something funny about having a red team, blue team marriage there.

Si: I was going to say, we spoke to Heather and Jared a little while ago from Cellebrite. And yes, the dynamic of two people in the same industry is quite fascinating when you come down to it. My wife’s a project manager, so I’m not sure how it would go if it wasn’t that way. Certainly if we were on opposing sides technically, I think it might be a little more interesting.

Jessica: We did meet fixing jets. My husband and I both fixed jets, and that's how we met. So the fact that he is a red teamer and that I do defensive security is ironic. But yeah, I steer away from the vehicle stuff.

Si: That’s very fair. “Steer” is a funny pun though. “Steer” vehicle. I do love puns as evidenced by anyone who’s ever seen a CTF I’ve worked on. I think that a slightly irreverent sense of humor is an absolute must in this industry. I come across some people who are way, way too serious for their own good, and I’m not sure it’s good for their mental health, if I’m honest.

Jessica: I think it’s important to have a sense of humor because at the end of the day we deal with the darkest of humanity. So if we can’t have levity, if we can’t find places to have joy in the mundane, how are we going to be able to deal with the fact that we are dealing with, and I don’t care, even if you’re on the incident response side, you’re dealing with companies on their worst day.

Jessica: No matter where you are in this field, you’re dealing with people on the hardest day they’ve ever imagined. And our work affects people’s lives, and that’s heavy. That’s a heavy weight to bear. So if we’re going to bear that much weight – how cool are our jobs? We get to use our technical knowledge and our brains to figure out problems, to be able to find truth and help with justice, and help people resolve issues and conflicts and sometimes save lives or protect lives.

Si: Absolutely.

Jessica: We have the best jobs in the world, but the weight of it is important. The weight of it is important for us to be aware of and feel because it should drive us to do the absolute best we can, regardless of what side you’re on, what environment you work in. Our work has victims. Not a case exists without a victim. Be it a company, be it wrongly accused, be it a victim of a crime.

Si: Yeah. I think a very important thing that seems to get lost a little bit, actually, is that we talk about the sides that we're on. We talk about prosecution or defense. But actually, at the end of the day, we're all here to achieve justice. That is what our role is, and our duty is to the court. We're not beholden to anyone else.

Jessica: Our job is to find the truth in the data. That is our role, regardless of who is hiring you. And I think this is really key. I'm very involved with HTCIA, as first VP at the international executive level at this time. And one of the things I was really happy about is that two years ago the organization got rid of its rule barring membership for anybody who does defense-for-hire work.

Jessica: And I am so glad that there was a unanimous vote by the organization to remove it. The High Technology Crime Investigation Association represents the work we do. It doesn't matter who you work for, because the results should be the same. Our goal is the truth in that data. What attorneys do with our work is beyond our control, but it is our job to do our due diligence and to represent the facts as they are displayed in the data, which again takes that testing and that need for understanding and validating, etc.

Si: I think it introduces the interesting problem of bias. I was talking to a colleague about this not terribly long ago, like yesterday. And we were saying everybody has bias.

Jessica: 100%.

Si: It’s not whether you are biased or not because you are. It is a very biased opinion to think you’re not biased.

Jessica: Yeah, exactly. You’ve got to allow for that fact and to handle it and manage it and to say, to think that you are not is a mistake. But if you only do defense work, if you only do prosecution work, your biases are almost increased. It’s inherently an echo chamber.

Jessica: We started with a bias already: we talked about generational bias, right? The approach that people take. Both of us are parents of young people now in this world, and the way that they approach or see technology is different. Which means if you or I are testing, "How did this artifact get here?", we might not think to involve them. And again, this is one of the great things about having multi-generational teams.

Jessica: I have people who are much more senior to me in age and much more junior to me in age as part of my teams. And if we don’t get that perspective, I might not be thinking of some of the ways in which data could wind up on a system. Sometimes my young folks, they do things with their fingers on the screen and I never thought of interacting with said app in that way. And it makes magic happen.

Jessica: So we don't necessarily even understand all the ways to test a feature, to say, "How else can that data get here?" And that is how you get at the truth despite the bias. So anytime we see something that demonstrates how a piece of data got there, and we can test and prove that in the affirmative, we need to be able to ask what else could have caused that data to get there.

Jessica: What would it look like? What other actions could cause that to not be there, right? So we need to look at the complete opposite of the things that we're proving, to make the best attempt to neutralize the bias that is going to be implicit. Your scope in an investigation in itself introduces bias. However, it is an intentional bias, one that prevents us from actually having privacy issues in a lot of instances. Someone's phone is the most…

Si: This is an interesting distinction actually, between US and UK law. Because you are restricted by your warrant. We aren’t, but what we’ve seen as the counter to that is that quite often we are seeing victims almost self-submitting evidence. And selective. And it’s not because they’re not victims. I’m sure they are. But they are making a selection of the things that they think are important and sharing them with the police.

Si: There is a limited capture done on the basis of what the complainant says, and it’s not the full picture. And you end up with some very interesting questions.

Jessica: This is fascinating to me. There is an interesting paper from the Scientific Working Group on Digital Evidence (SWGDE), which I'm a member of. I love it. Lots of great papers coming out on minimization of data: what is the right way to minimize data for privacy concerns? And this is really critical. My personal opinion is that, if possible, minimization should be done in the analysis or via a human firewall, and not on collection, because of exactly what you said there.

Jessica: If we are only given a subset of the evidence on collection, then we risk not having access to exculpatory data, to being able to put together the full picture: when a database only has part of the data, but it actually correlates to some other database that maybe isn't within the time constraint. So I really believe that we need as complete a collection as appropriate, and then to minimize on analysis.

Jessica: And I am very heavily speaking towards mobile and digital forensics here, as opposed to incident response where you would not collect a full endpoint of every single device, because that would diminish the capability to do incident response. Or even in a large enterprise environment, it might not make sense from a source and amount of data.

Jessica: But if we’re talking about somebody’s mobile device, having a human firewall, or I hate this expression, a “taint team,” those things may be more appropriate to protect someone’s privacy concerns. And again, I’m trying, because you just mentioned the fact that the laws are different, I’m sticking straight to privacy and to have the most respect for people, the people whose devices we’re looking at.

Jessica: And I think that’s really important because we want to be able to still have access to the exculpatory data. And for another reason, and this is something I’ve been hammering a lot recently, I just had an article in Forensics Magazine about this – acquisition should be seen as preservation because of how the availability of data degrades rapidly.

Jessica: Particularly in the world of mobile. And I'm not just talking about access, phones rebooting from AFU to BFU, or USB restricted mode. Let's say we're in a completely consent-based environment, where the person is giving us a password and the device, and they're giving consent for their device to be looked at. Maybe they're even a victim.

Jessica: Even in that consent-based environment, data degrades and becomes nonrecoverable when you introduce the element of time. I'm not talking about things like WhatsApp wiping somebody's phone. I am talking about the fact that, literally, every day that goes by, the knowledgeC database loses data, or cached locations lose whatever is seven days old on that date. Literally data we will never be able to recover, and it could be exculpatory or inculpatory.

Jessica: So while the prosecution in a criminal case may want access to that data so they can demonstrate where someone was, validate an alibi, etc., the defense may likewise be saying: "How can you even proceed? You missed the evidence that could have shown that my client wasn't there, because you didn't get those cached locations, or the images that degrade 30 days post-deletion." And this is data that is no longer recoverable forensically.

Jessica: This wouldn't be a question if we were talking about other wet-science data. And there's actually a good paper by one of my colleagues at Hexordia, Frank Adelstein, as well as a fellow forensics examiner, Holmes, from a university in Europe. I'm going to butcher his last name. Not in the UK, in Europe. I am sorry, Holmes, I know I just butchered your last name.

Jessica: They had a paper that just came out yesterday as part of DFRWS-EU. They were basically saying that timely preservation is critical, because if you were to try to take casts of footprints a week later, they’d be gone. The same thing happens in digital evidence. And they brought up another interesting point: we also need to be preserving our test data at that time.

Jessica: Because it’s not just what app version the phone is on and what database that version uses – what’s stored on the backend server the phone is communicating with changes, and the capability of the server on the other end changes. So I’m really keen on this topic of acquiring as soon as possible. Timely acquisition is absolutely critical, particularly in mobile, but in digital forensics as a whole, and we need to pay more attention to it.

Si: I think it’s fascinating because, I’m going to say my age is greater than yours and therefore my starting point in this is a bit different, but we started off with “You get a computer, pull the plug straight away. That’s it. Done.” That was the way I was taught. That was the original start to this.

Si: And then it was like, “Oh no, there’s a whole bunch of ephemeral data that you’re going to lose if you do that,” which is completely true. You lose potentially encryption keys, network connections, all sorts of stuff. So we started thinking about doing that.

Jessica: And no one’s going to question you if you pull RAM on scene.

Si: No.

Jessica: But if you image a mobile phone on scene instead of just seizing it and putting it in a Faraday bag, that is questioned, depending on your jurisdiction and where you are in the world. Is it now a search, or are we just…? The US has this really big, very famous ruling called Riley, and it’s the reason you can’t image a phone upon arrest.

Jessica: And at the end of the day, the ruling in Riley said, “Hey, the reason you can’t do it is because you can use these nifty little things called Faraday bags.” Guess what? Putting a phone in a Faraday pouch doesn’t stop a dead man’s switch. Putting a phone in a Faraday bag doesn’t stop these timers, doesn’t stop the reboots, doesn’t stop data becoming non-recoverable through time degradation.

Jessica: So it’s just, what do you know? Technology changes and we have to continually adapt our methodologies. Speaking of SWGDE, they put out a great position paper on this, and SWGDE does not put out position papers all that often. So when they do, it’s important. I believe it’s called something like “Timely Acquisition” or “Timely Preservation Through Acquisition.” Don’t quote me on it, something to that effect. It’s definitely absolutely worth a read and definitely a conversation I hope people are bringing back to their labs, but more importantly to the attorneys who are telling them what rules they have to live under.

Si: Yeah. In the world of CCTV and surveillance forensics, everybody and their dog now has a Ring doorbell or something like that. But unless that is seized, or the data is acquired in a forensically sound way, within 24 or 48 hours of an incident having happened, it’s just going to vanish.

Jessica: Here’s the thing, then it becomes who is paying for what version of service, right? Because I have a doorbell camera on my residence. I’m a very big proponent about cameras outside of the house. And I’m a personal proponent of no cameras inside the house except for obviously the one I’m using for this communication. Webcams are a bit different and I can turn them off.

Jessica: However, I’m a very big proponent of cameras on the outside, but I also pay the added service fee to have my data maintained for 30 days, because I know that 24 hours… I travel a lot. I might not even know something occurred within a 24-hour period. So I want to have that 30 days worth of data so that if something happens… but again, I come at this as somebody with a specific heart for digital evidence.

Si: This is the world we collect in. And if you look at it – I’m not going to quote any numbers – but if you look at the amount of data that’s created now daily on the IoT devices, the things that we have, it’s astronomical. We could probably solve every murder in the world that ever happens if we actually collected it all in one place and managed to filter it in any sensible way.

Jessica: There’s no human way of doing it. It’s impossible. It’s just way too much. And different formats and all of this.

Si: And what it means is different from what is there, right?

Jessica: Exactly. Yeah. I get nervous when we start talking about large autonomous systems that could deal with all of this data.

Si: No, so I’m going to say we could segue into AI and its use in here. I am a terrible Luddite in this regard. I’m the one who’s going to be throwing a spanner into the AI works. I actually studied artificial intelligence at university once upon a time. And I wish people would stop calling it artificial intelligence for a start. “Applied statistics” is my personal choice of phrase.

Jessica: There are many different applications that fall under AI and machine learning. Everybody is on the new hotness because of their personal user-end experiences with generative AI, but AI has been in digital forensics, at different levels, since 2007. Anybody who’s done an e-discovery case with technology-assisted review, or TAR, has used AI. Anyone who’s used most forensics tools has too – it depends what kind of algorithm it is.

Jessica: We use algorithms, we use computers. They are important. The critical point is that machines don’t understand what they don’t know, and a large part of our problem is dealing with the unknowns. I like to cite the fact that there are over 6 million apps just between Google Play and the Apple App Store. I could say a thousand are supported by commercial tools, and I’m probably being generous.

Jessica: So that leaves a wide, vast world of unknowns: proprietary data structures, things where we don’t know how they’re stored. But even more importantly, even if the AI can figure out what all the data translates to – what causes the timestamp to occur – what does it mean? What causes that URL? What is that IP address of? What other things could cause it? The meaning takes a human, and testing, and understanding.

Jessica: And my big concern is the legal system erroneously thinking that our jobs could be done better by computers than humans. Now, there are some humans whose roles in this field don’t involve deep technical work, but with the rate of change of technology, I would say the people who go beyond the “find evidence” button, beyond just clicking, are going to be needed in order for justice and truth to prevail.

Jessica: And if not, we’re going to have a large issue with misinterpretation of data, because we can have multiple experts on the stand in the same case interpreting the same thing differently – not because they’re saying the ones and zeros are different, but because they’re saying the meaning is different. That interpretation is what we provide, and that’s the reason digital forensics experts should be here to stay.

Jessica: And because somebody’s got to test and validate and think of those new apps. And yes, testing could potentially be automated in the future, and a lot of things should be – we should automate as much as we can to allow humans to solve the problems humans need to solve. Besides, it keeps our work more interesting. So there’s my ten seconds on it.

Si: I couldn’t agree more. I think the important thing is that we back away from calling it artificial intelligence because it has no intelligence whatsoever. And it’s a very misleading thing to say at any point. The idea of automation, the idea of even things like fuzzy pattern matching is technically sitting in the right area. But it’s enhancing our skills, enhancing our tool set a little bit.

Jessica: I’m not going to use a slide ruler and an abacus to do math problems. I’m going to use a calculator. I’m going to use my phone. I might even call out to a smart assistant as I’m cooking and say, “Hey, smart assistant, how many cups in a quart?” Because I do have to deal with cups and quarts instead of liters. You’re lucky.

Si: I’ve given a couple of talks on AI in my time. And many years ago, criticized it very badly, but I heard a talk and I’m just going to look up his name because I don’t want to miss… He’s an American… not Brandon Epstein.

Jessica: Okay.

Si: A guy called Jared Carter. He’s… hang on, let me scroll through and I can give you a little more information.

Jessica: The truth of it is there’s good, there’s bad and there’s ugly, and we should let computers do what computers know how to do well, but we shouldn’t be thinking that computers are replacing humans in this digital evidence element. They’ve always been a tool. We love our tools, right?

Si: Yeah. Jared actually specializes in accident investigation.

Jessica: Ah, okay.

Si: He’s a forensic analyst in collisions. Lovely. But he was playing with ChatGPT and a couple of the other ones to see whether they could solve collision problems, mathematical collision problems. And what he very interestingly discovered was that they were getting it completely wrong.

Jessica: Because generative AI isn’t the right type of computer for math. Generative AI isn’t good at math. It’s a large language model.

Si: So it was returning a probabilistic statistical…

Jessica: …based on what should be, yeah.

Si: Yeah. So if you want to talk about bias – when you feed something the entire world of the internet, and as experts we know what grossness and evil exist on the internet, the last thing we’d want is that.

Jessica: I’ll tell you a fun quip. I obviously work for Hexordia. I do not, nor have I ever, worked for SANS. But I was trying to find a picture of myself to look at for a GitHub profile, and I searched for “Jessica Hyde and forensics.”

Jessica: Google Gemini brought up an automatically created bio it had made for me, and it said “founder at SANS.” Now, my bank account would love it if I were the founder of SANS. I was also around 12 years old when SANS was founded. So this is not a true statement. My point being that AI will get things wrong, because it will pull different things from the internet and try to make an assertion about them that is not quite right.

Jessica: So yeah, it’s bad at doing logic problems – that is not the goal of generative AI. However, there is lots of great computation that can be done with math; otherwise we wouldn’t be able to use hash values.
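To illustrate the point about hash values for readers: cryptographic hashing is deterministic computation, which is exactly why examiners can use it to verify that an acquired image hasn’t changed – the same input always yields the same digest, and any change to the input yields a different one. A minimal Python sketch (the helper name and sample bytes are illustrative, not from any forensic tool):

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of the input bytes as a hex string."""
    return hashlib.sha256(data).hexdigest()

# The same bytes always hash to the same value...
evidence = b"contents of an acquired disk image"
print(sha256_hex(evidence) == sha256_hex(evidence))            # True
# ...while changing even a single byte produces a completely different digest.
print(sha256_hex(evidence) == sha256_hex(evidence + b"\x00"))  # False
```

This determinism is what makes a hash recorded at acquisition time usable as an integrity check months later in court.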

Si: Exactly. And this is about using tools appropriately for what they’re designed for, and understanding their limitations. And also, like you say, not trying to push stuff further than it should go – just because it’s the new hot and sexy thing that everybody’s talking about doesn’t mean you should be shoehorning it into your forensic tool today.

Si: I’m going to perhaps speak slightly out of term because I know there’s a possibility we may be getting them on to talk about their products, but I’m aware of a company that has a feature in their forensic video software that re-colors infrared footage.

Jessica: Oh!

Si: Now, infrared footage is recorded in monochrome. What they’re effectively doing is making up colors and sticking them on top of it. And I’m a little unsure how one could possibly do that in any forensically sound way.

Jessica: I am not educated to speak to that, but fascinating.

Si: It’s interesting and I think something that somebody said was that you can’t stick your head in the sand about it. We have to be aware. We have to be able to refute it. We have to be able to understand it. But it’s not…

Jessica: And again, it’s been in our world. It’s been in our world.

Si: It’s been in our world. Like I said, I read it at university, and I was, like I said, SANS was founded probably after I was 12. You’ve made me feel so young. I’ve enjoyed this. Thank you.

Si: Oh, trust me, don’t worry about that. Like I say, I went and read it at university before the turn of the century, which sounds so bad. But yeah, it’s been around 25 years minimum now – absolutely more than that. So we really do need to get a grip on it, I think, is probably the way to phrase it.

Si: It’s there. We just need to learn how we’re going to live with it. And also dial back the rhetoric a little bit about how, first of all, how it’s going to destroy us all. Because it isn’t, something can barely tie its own shoelaces, let alone come out and…

Jessica: …become a Terminator. It’s the Chicken Little thing, right? Do you remember when they told us that big data was going to destroy us, and then encryption was going to destroy us? We weren’t going to be able to get data because of encryption, and then locked phones were going to get us, and then the cloud was going to get us.

Jessica: And there was going to be no data on devices. It’s just the newest in what’s going to get us. And you know what? We’re still going to be able to do our jobs for a long time. It’s a game of cat and mouse and we’ll continue to do what we need to do as professionals who adapt to technology to be able to uncover the truth in digital evidence.

Si: Absolutely. Now, as we’re coming towards the top of the hour, that’s actually quite a good point to draw out perhaps a final question for you: with everything you’ve had an opportunity to see, with your successful business at Hexordia, and with your role at the university, what do you see as our next biggest challenge?

Jessica: Our next biggest challenges are going to be in policy. Policy and law are the biggest challenges. Forensics and technology will continue to evolve to meet the technical needs – we’ll use technology to deal with technology – but it’s educating the legal professionals so that they understand our world.

Jessica: I meet with many an attorney who still pulls out a cart full of paper and documents every time they go to court. It is about educating the legal system, and those who make policy, on how to deal with the newest technical issues at hand.

Si: Yes. I think you’re absolutely right. I think that we are in a position whereby we are evolving faster than the law. We’ve always been evolving faster than the law.

Si: One of the joys – I’m a big fan of history – is that the first computer crime case in the UK was actually prosecuted under forgery law. It was prosecuted as “making a false instrument,” which is technically used for creating a fake document to prove who you are.

Si: Something like a fake passport or a fake driver’s license. And they translated this to making a fake password, because the password was what proved who you were. Therefore, by using a password that wasn’t really yours, you were technically creating a false instrument – that password that allowed you to enter…

Jessica: …who you are. Oh, my.

Si: And very shortly after this, the Computer Misuse Act was created in the UK because they found that this wasn’t really fit for purpose. But it’s just a demonstration of how the law is several steps behind.

Jessica: I’m not an expert in the law. I am so glad that there are so many good legal professionals who understand digital evidence, but there’s a lot who need education in digital evidence.

Si: I have to say it’s been an absolute pleasure talking to you.

Jessica: Likewise.

Si: And thank you very much for coming on. And please don’t leave it three years before you come back again. It would be great to have you back on to chat in the near-ish future, when something else happens that we can have an opportunity to talk about, which would be wonderful.

Si: For listeners out there, you’ve obviously already listened to this, but I’m still obliged to say that you can find this podcast on various platforms – Spotify, iTunes, all of the good stuff, YouTube – but of course, most importantly, on forensicfocus.com, our own website, which will have this and the written-up interview with Jessica.

Si: And various other wonderful things. You can come and participate in the Discord channel, and drop in and chat with us on the forums. Basically, we are a huge community of people who like to talk about forensics like this, because it’s fun and, as Jessica said, it is the best job in the world.

Si: And there’s nothing that beats it because we get to make a difference. So again, thank you so much for joining us. I really appreciate it and I look forward to having an opportunity to talk to you again.

Jessica: Thank you. This was such a pleasure.
