Introducing 2 Forensic Focus Podcast Co-Hosts: Simon Biles and Alex Desmond

Christa: Welcome to the Forensic Focus podcast. I’m your host, Christa Miller. And this week we’re switching it up a little bit. We’re introducing some new co-hosts: Simon Biles and Alex Desmond. Simon’s an IT and digital forensics expert witness based in Oxford, United Kingdom. And Alex is a lead incident responder and insider threat specialist in Adelaide, Australia. They’ll be working with me to produce podcasts with maybe a little more practitioner-oriented technical content. Welcome to you both.

Simon: Hi.

Alex: Thanks, Christa.

Christa: So I want to start, let’s expand a bit on your introductions. If you could tell us how you both got into digital forensics, what drew you to where you are now, and anything else you’d like to share?


Simon: Let’s go in alphabetical order. Alex, you go first.

Alex: Oh, under the bus straight away. Okay.

Simon: This is the way it’s going to be from now on. Get used to it.

Alex: Ah, okay. I was going to say beauty and age first, but I’m happy to go.

So, I started off doing an engineering degree. I’ve got a background in mechatronics and mathematics from the University of Sydney. From there, I went into the Royal Australian Air Force where I predominantly worked within the cyber squadron.

And I was there for about four years, and that’s where I kind of fell into, I guess, the digital forensics and IR side of things. Militaries spend a lot of time training the digital forensics side, but they train the IR side as well.

When I left the military, I went into a company that did DFIR, so I was a lead incident responder for them. And I spent a lot more of my time on the incident response side of things.

So, working across every region of the world, dealing with a lot of ransomware was essentially my bread and butter day to day. I’ve since changed companies again, and now I’m working as an insider threat specialist.

So I spend a lot of my time making sure that the product that we make makes sense in a forensic sense. So if it needs to go to court, it can, because some of our cases do go to court, and the evidence that’s collected by the agent needs to be clear for the analyst.

But also a lot of time writing detection rules and training analysts on how to properly understand the data that they’re looking at, because there’s a big difference between getting the data and then understanding what that means in the context of what the user was doing, cause our platform’s very user-centric.

So I spend most of my days just researching what the computer’s doing and then lining that up to what a user is doing and then writing detection rules for that.

But I guess that’s kind of like my pathway through and it’s been a bit of a mix between digital forensics and incident response and I guess that’s why you kind of get that DFIR acronym all the time. Yeah, Si, let’s have you.

Simon: Fair enough. So I’ve been kicking around the IT industry. I rewrote my CV the other day and I’ve discovered it’s been 25 years. I now feel slightly older than I did before I rewrote my CV.

I wanted to be in forensics from a very young age, about 13. I read The Cuckoo’s Egg, a Clifford Stoll book, and I thought, “That’s the best thing ever. I really, really want to do that.”

And I went to my careers master at school. Great guy, lousy careers master, brilliant English teacher, lousy careers master. And he said, “Oh yeah, you need to go and become an accountant because forensic accounting is the way to get it.”

And I went and I did accountancy work experience, and I apologize to any forensic accountants who may be listening or any accountants who may be listening, but my God, your job is boring.

And I couldn’t do it. And, I mean, I can’t count either, which doesn’t really help. So I read computer science and artificial intelligence at university, and I became a Unix sysadmin when I left. Then I branched out into security; I was a security consultant.

Primarily, I’ve been a security consultant in the UK for the Ministry of Defence; Revenue and Customs, or HMRC now; the NHS; and other defense and central government bodies and utilities. I became an independent contractor.

Fairly early on in my career I was made redundant from a permanent role I had and then discovered that actually there’s no such thing as job security in the modern world. So I might as well make some money while I was at it and I became an independent contractor. I’ve been running my own company now for 20 odd years.

And it got to a point in my career where I thought, “You know what? I should go and I should learn something new. I should further my education.” And I looked at the courses that were available and I looked at sort of the topic lists and having been working in security, I thought, “Oh, you know, maybe I’ll do a master’s in security.”

And I looked down the list and it was like, “Well, I’ve been doing that, I’ve done that, I understand that. I’m going to pay £15,000 to study this again? Nah, nah, I’m not doing that.” Then I looked down the forensics course list: “I have no idea what that means, that’s completely new to me, and I haven’t got a clue. That seems like a much better way to spend £15,000 of my money, to actually learn something.”

So I started studying. I still haven’t finished that master’s degree, strangely, because I ended up teaching on it. I transitioned in and gave some lectures on alternative operating system forensics; having been a Unix sysadmin, I knew more about it than some of the people who were teaching it, so I got invited to teach.

And since then I’ve lectured at some Russell Group universities in the UK on digital forensics and cyber security, both undergraduate and postgraduate courses. I am now an independent forensic analyst. Again, I run my own company and I operate on criminal cases here, everything up to the High Court: terrorism, drug smuggling, child abuse, all the good, fun things that make a forensic analyst happy on a day-to-day basis.

So that’s where I’m at now. Yeah, I think that’s a fairly precise summary. I’ve been involved with Forensic Focus since 2012, so 10 years at least, having signed up as part of my sort of research when I was trying to do my degree, and found it hugely useful.

And, you know, so I’ve been a member of Forensic Focus, popping in and answering questions and reading and researching, for the better part of 10 years now. And now I’m really thrilled to be able to participate in this and give back a bit. It’s fantastic. So I’m really excited.

Alex: I wonder if we all had that common experience of having a career advisor at high school who’s just not very good. Mine told me to go and pick a trade because uni wouldn’t be good enough, or he didn’t think I was good enough to go to uni. And I was like, “Mm, oh, that’s true.” Yeah, he was essentially suggesting I go be a plumber.

Simon: There’s good money in plumbing.

Alex: Yeah, there is. Yeah.

Simon: It’s interesting actually, because my kids, they were more or less told that they should go to university. And in fact, my son has chosen to do a degree apprenticeship in the UK and it doesn’t seem to me that schools necessarily keep up with what’s current as well as they should in that regard.

Because most of the jobs that a 16-year-old now will be looking at when they finally graduate, I’ve heard it said several times, don’t actually exist yet. So you’re getting career advice for stuff that isn’t even there.

And, you know, obviously talking to a guy who did his doctorate in Shakespearean Literature at Cambridge, he was not really in the right state to understand the concept of cyber security and cyber forensics just because it was just so far beyond his knowledge base.

Christa: Yeah, I mean, with my kids, I feel like it’s more the technology. As the technology changes, it’s harder and harder; there’s almost a fragmentation going on. It’s harder for guidance counselors, as we call them in the US, career counselors, to be able to stay on top of it.

I mean, they should be able to: that’s their job. But at the same time, there are so many other little things that go into the day-to-day job, and I feel like that makes it hard to stay on top of these rapidly changing fields. I was putting together some podcast questions the other day for somebody else, about cloud forensics, and I realized how little I know about cloud forensics.

And, I mean, this is somebody who does tons of research, and, you know, I go to the talks and there’s a lot that I pick up, and there is still so much that I don’t, because I don’t have that technical background. So I really think that career counselors are kind of up against it in that regard.

Simon: I think so. And, I mean, I’m sure Alex will agree with me on this one. As technical people, we still spend a huge amount of time doing research on new technologies. You get a case and all of a sudden it’s got a particular social media aspect to it, and you then spend so long researching how this social media app works: you’re installing copies of it, you’re looking for forensic artifacts, and you’re carrying out some experimental methodology against something new.

And given the exponential pace at which technology is progressing, we are facing a steeply uphill battle where, you know, it becomes more and more tempting to become a specialist focusing on only one particular aspect of something, rather than a forensic generalist who ends up saying, “Okay, well, yeah, you know, I can examine a computer.”

You start to think, “Actually, you know what, maybe…” well, actually in my case, “Maybe I don’t want to sort of specialize in social media because I’m not that interested.”

Personally on a day-to-day basis, I don’t use it all that much. Keeping up with the latest app is a lot of effort. Maybe I should, you know, specialize in something in particular, rather than looking at a broader set of things.

Alex: I think even that in itself; there are the new technologies that you’ve got to keep up with, but then you look back to even just Windows itself. Like, in my instance, when we’ve got agents on a Windows system: I was looking at Windows Event Logs the other day and pulling stuff out of them, and there are actually hidden fields within the Windows Event Logs that just aren’t reported within Event Viewer, but you can use the API to pull those out.
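
Editor’s note: a minimal sketch of what Alex describes, assuming the pywin32 bindings to the Windows Event Log API; the channel name is just an example, and reading the Security channel typically requires administrative rights. Rendering a record as XML exposes every EventData field, including ones Event Viewer’s summary view doesn’t surface.

```python
# Dump the full XML of events from a channel (assumes pywin32 is installed).
import win32evtlog

CHANNEL = "Security"  # example channel; .evtx files work via EvtQueryFilePath
query = win32evtlog.EvtQuery(CHANNEL, win32evtlog.EvtQueryChannelPath)

while True:
    events = win32evtlog.EvtNext(query, 10)  # fetch up to 10 records at a time
    if not events:
        break
    for event in events:
        # The rendered XML contains every EventData field in the record,
        # not just the ones the Event Viewer UI chooses to display.
        print(win32evtlog.EvtRender(event, win32evtlog.EvtRenderEventXml))
```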

And you’ve got to think: people like Mark Russinovich have pretty much made a living off writing Windows Internals books and writing tools to pull stuff out of Windows, because it’s not documented in itself. There’s still so much, even within technologies that we do use, that we just don’t understand.

And I guess that plays into those cat and mouse games that digital forensics and incident responders play against the attackers constantly, because they only need to find one thing that we don’t know, whereas we’ve got to figure out what they found and then present it in a way that makes sense, finding that evidence and then proving it, which is an uphill battle in itself.

Simon: There’s an interesting corollary to that actually, which is that sometimes as a forensic analyst, even if someone has made a great deal of effort to cleanse a system of evidence with all the bits that they do know about, all of a sudden you find the bits that you hadn’t quite realized were there, and they certainly hadn’t realized were there. And you’re able to build a case off that.

So it is six of one, half a dozen of the other, swings and roundabouts, you know, pick your metaphor. But yes, it is. And it’s very deep. Operating systems are very deep. I mean, I’m personally a big Linux fan, and I can go and read the source code to it.

And I still, you know, there’s so much of it that you couldn’t possibly understand in a single lifetime. You have people who write this kernel module and they know everything about that, and then somebody else writes another one, and nobody has a true understanding of the whole thing, at least certainly not at any point in time; you have to go and find the bit you’re interested in and then bring yourself up to speed on it.

Christa: Well, it’s complicated too, I imagine, by multiple layers. I mean, I was talking about cloud earlier and you’re talking about Windows, and I feel like there’s a lot of overlap there with OneDrive and SharePoint and some of the other Microsoft systems: just having to know how the data is stored, both locally and somewhere else entirely, and then how that potentially affects systems now.

Simon: Yes, yeah, definitely. I mean, I think one of the best things that I’ve heard said, and I can’t remember who said it, so I apologize if I’m not citing someone fairly, is that if you go through anything that says cloud, remove the word “cloud” and replace it with “somebody else’s computer”, and you have a better understanding of what cloud is.

It’s like, “I store my files in the cloud.” No, you don’t. You store your files on somebody else’s computer. And, you know, I said at the beginning I’ve been in the industry 25 years. I’m old enough to remember having shared resources; I started off as a Unix sysadmin in a place that did computational fluid dynamics.

So we were using some very large Unix machines, multiprocessors, huge amounts of, well, huge amounts of memory at the time, I’ve got more sitting under my desk now. But at the time massive amounts of memory and people were connecting to them with terminals, Unix terminals, and then running jobs.

The world hasn’t really changed. We went from centralized computing, to decentralized computing where everybody processed stuff on their laptops, back to centralized computing where everybody’s doing stuff in the cloud, to now, where people are pulling it back down and doing it all on their laptops again. It’s that wonderful cyclical, what do you call it, life cycle kind of thing.

Alex: In saying that nothing’s really changed, from having the endpoint hosted locally to the cloud just being someone else’s computer, do you think it’s just another example of a layer of abstraction? When that gets implemented, we’re like, “Sweet, we can use this decentralized model, we can work more efficiently, and we can work across”….

Like, I can work with someone in the US instantaneously because we’ve got this cloud service. But then we’re also abstracting away that layer of evidence, almost, in a digital forensic sense, because in the past, if something happened, you could go, “Well, everything is within the systems that I own, and I can go find all the evidence because it’s sitting on these endpoints.”

Whereas now, you look at AWS and Azure and even Google Cloud, which is becoming bigger and bigger in market share: they all have their own kind of logging systems. And the logging system is different for, like, NetFlow, and then access control, and just about everything else.

And it’s becoming more and more complex with companies who implement all these solutions, particularly companies who buy other companies and then try to merge them all in. Then their security teams are playing catch-up in terms of incident response and digital forensics, where they’re like, “Well, we’ve got all these log sources, let’s just chuck it into a SIEM”, which is not necessarily the answer. So there’s a layer of abstraction on the defense side of things, as well. Which is, I guess, the trade-off for ease of use.

Simon: It is. It’s definitely the trade-off for ease of use. But when you go back to first principles, you know, they’re still just computers. So all of that information that we would expect to collect on the desktop is still there and present in the cloud.

It’s a matter of prying it out of the grasping, gripping claws of Facebook, Google and Amazon to get them to give it to us so that we can then process it. Because they are so reluctant to cooperate in a wider sense unless you have a warrant, generally speaking. And even if you own your own data in theory, pulling it down is still reasonably challenging. I mean, and it’s more your field than mine.

Christa: I’m sorry, I was going to say, even if you have a warrant, they often don’t want to cooperate.

Simon: No. And, you know, the multi-jurisdictional thing becomes a big issue, because there’s plenty of stuff hosted, or that you’re accessing, from here in the UK. I go to Office 365; I don’t know whether that’s in London or in New York or in Outer Mongolia. It makes no real difference to me where it’s served from, because connections now are actually so fast. Although it’s still quicker and easier to put large amounts of data onto a hard disk and fly it across the Atlantic, apparently.
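
Editor’s note: a back-of-the-envelope calculation of why that’s still true. All of the figures below (data volume, link speed, flight time) are illustrative assumptions, not numbers from the conversation.

```python
# Sneakernet vs. the network: how long does 50 TB take to upload?
TERABYTES = 50          # assumed evidence volume
LINK_MBPS = 100         # assumed sustained upload speed
FLIGHT_HOURS = 8        # assumed transatlantic courier flight

bits = TERABYTES * 8 * 10**12                     # decimal terabytes -> bits
transfer_hours = bits / (LINK_MBPS * 10**6) / 3600

print(f"Network transfer: ~{transfer_hours:,.0f} hours (~46 days)")
print(f"Courier flight:   ~{FLIGHT_HOURS} hours")
```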

Alex: I have experienced that myself. Yes. Only because I live in Australia and our Internet’s really bad down here, so…

Simon: You’re not on Elon Musk’s Starlink list yet, are you?

Alex: Not yet. Not yet.

Simon: I noticed with amusement, actually, I was looking and it appears to operate in a very specific band around the world which basically encompasses North America and England, but it cuts off at the Scottish border. For some reason they seem not yet to have put satellites that far north, so I guess Canada’s out of the question, as well. But yeah.

Christa: So I feel like – sorry, go ahead.

Simon: No, I was going to lead on to the things that we were actually talking about before, which was to do with, you know, some of the changes that we’re seeing. And actually I was going to bring up the Exterro example. So Exterro FTK are actually now leveraging the cloud in the UK for doing forensics.

Christa: Yeah, because they put a press release on Forensic Focus about how they’re kind of paving the way for a national digital forensic service. And yeah, I’d love to hear your thoughts more about that.

Simon: I’m going to say my thoughts came across somewhat in my earlier reservation, which was: this is somebody else’s computer, and therefore you are uploading a bunch of critical and potentially, you know, national-security-critical information, or certainly information and data that is of a sensitive nature, into an environment which isn’t yours.

I did some work for the NHS in a security capacity a few years back, well, during the pandemic, which now could be anything up to about three years ago. But we were doing this piece of NHS work. So, the NHS, for anybody who is not familiar, is the National Health Service in the UK. We have a system whereby everybody is entitled to free healthcare up to a point. Well, actually up to a very high point; it’s just that if you want it to happen quickly, you have to pay for it.

So there’s a lot of important patient data in there. And we were looking at putting that patient data into Azure and how we could do that in a way which was safe and secure. And I was the architect on that, the security architect on that. And I’d like to think that we did a good job.

There’s a lot of encryption involved, but one of the things that we did specify very much was actually that we could restrict the location of that data within Azure. So we could say that, you know, it will only reside on UK data centers.

We had to be quite strict in that, because Azure is very fluid. If a data center fails, it will automatically fail over to another data center. And we actually had to go a step further and specify that, okay, if that UK data center fails, it can only go to another UK data center, and there’s only three. So it is a fairly limited choice there.
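
Editor’s note: a minimal sketch of the kind of data-residency control Si describes, modeled on the shape of Azure Policy’s built-in “allowed locations” rule and expressed here as a Python dict for illustration. The region names are examples, not a claim about the specific NHS deployment.

```python
# Deny any resource deployed outside an approved list of regions.
# Mirrors the structure of Azure Policy's built-in "Allowed locations" rule.
import json

ALLOWED_LOCATIONS = ["uksouth", "ukwest"]  # example UK region names

policy_rule = {
    "if": {
        "not": {
            "field": "location",
            "in": ALLOWED_LOCATIONS,   # anything deployed elsewhere is rejected
        }
    },
    "then": {"effect": "deny"},
}

print(json.dumps(policy_rule, indent=2))
```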

And there are a lot of controls that you can put in place about using encryption and using multifactor authentication to log into it. And I digress slightly into security, but it’s a tightly secure environment. And you know, I felt comfortable putting personal NHS data into it for the purposes of what we were doing.

I’m not sure that my faith extends as far as things like indecent material. I still have a lot of reservations about taking that stuff and putting it effectively anywhere other than a physically controlled environment. And that perhaps is a slightly archaic view on my part.

Christa: I don’t know. I mean, I went to a webinar, it was a couple years ago now, but it was a major prosecutor’s office in the US. And they were talking about how they had largely moved to the cloud except for child exploitation material, which was still stored locally at their office. So I don’t think that’s archaic at all, yeah.

Simon: And I think the problem is that, I mean, you do know that some cases contain this stuff when you start, obviously, because that’s the reason that people have been arrested.

But there are also quite a lot of cases where you don’t know that it’s in there from the start. I was chatting with an old (he is old, that’s not fair, sorry, Brian) ex-fraud squad forensic analyst, and actually he was saying that in so many of the fraud cases they were picking up and doing, they were finding child pornography on machines.

And, you know, you would come into a fraud case and you’d upload all of your material to this environment. And then all of a sudden you would have this material there, so you can’t really segregate it.

And I think that’s where my concern lies: we don’t know what’s there until we look. So, in my opinion, we need to look before we start uploading stuff to the cloud, because if we’re uploading stuff to the cloud and then looking, we’re going to find things.

But, you know, Exterro FTK are actually talking about using it for classification of indecent material. And again, this is from their press releases, so I don’t feel that I’m going beyond what they would say themselves. If I am, then they’ve told me, so they’ve gone beyond what they should be saying anyway.

But you know, West Midlands Police used their cloud solution during the pandemic, and they were able to allow their staff, at a critical time, to work from home.

So you were looking at people who were doing some of this work from remote locations. It’s not a controlled environment at the desktop. So, to me, from a physical information security perspective, that becomes a bit of a concern.

But, you know, they have gone and talked to the Home Office, they’ve had this signed off, and somebody somewhere has accepted the risk on it. And therefore we are talking about a risk management case here, a risk-benefit case.

And clearly the benefits to West Midlands Police, the streamlining and the workflow, are sufficiently high to counter the backlog of cases that they’re suffering from, the staffing issues that they’re suffering from, and the hardware maintenance and all of the things that come from running a physical lab.

That analysis has been done, and it works out in favor of the cloud. And that becomes a way to start to tackle the problems that we’re facing with large data sets. When you have a scalable cloud, and you have requirements for high processing at some times and low processing at others, you can scale it, and it becomes a lot more cost effective.

And, you know, that’s why the cloud is doing well. That’s why we see it being used: that cost-effective use of IT resources. Then there’s the issue of who owns the data, what happens when you are unable to pay your bill one month and you get cut off, or somebody makes a routing change by accident, and these things happen in large data centers.

You know, we’ve seen Amazon go offline, we’ve seen Azure go offline. You’re talking about a major forensic police system going offline completely without access for, well, I don’t know, 24 hours. What’s the impact on the legal system? What’s the impact on the capability of the police to deliver stuff?

In a scenario where perhaps this is operationally important, whether it’s counter-terrorism or child protection: if I can’t access the thing that lets me do the analysis on a seized laptop from a terrorist, the laptop that’s telling me where the bomb’s going to go off, is that outage going to be a serious threat to my capabilities? It’s something I’m really interested in.

And because they’re only in the proof of concept stage really at this point, it’s a bit unfair to judge them. I really want to see how they’re going to manage this going forward. And I understand John Cook was speaking at the conference we were at the other day in London, the Forensic Europe Expo, if I’m dredging my memory through it correctly.

And he said there’ll be a white paper coming out. I mean, they’re putting out their press releases as it is, but there’ll be a white paper from the Home Office later this year that will actually describe the legal aspects of it. And I’m really excited to see that. So there’s something to look forward to there.

Alex: I’d be keen to get both of your opinions, cause as you just mentioned, Si, you have a reservation about this material going into cloud infrastructure rather than onto local physical storage.

So, from my experience: I’ve never been in policing, so I’ve never dealt with indecent material myself. I’ve come across cases where it’s been there, and it’s been handed off, and I haven’t had to deal with it myself.

But within Australia, with how we classify personal information and the threat that exposure can cause for an individual, that goes down to just a tax file number or a credit card number. Like, it’s a very low threshold for that data in terms of what counts as a threat to an individual.

Now, a lot of that stuff we will put into things like Avi and a cloud platform to process, so that data is in cloud infrastructure, all sitting there, and it could have indecent material in it.

But I guess, from both of your perspectives, why is the line at indecent material? Like, it’s okay for this other stuff that’s class one, threatening material if it gets out, but then indecent material is kind of this extra level above that, even though it sits within class one? You feel like it’s slightly different?

Simon: So in the UK, we would be looking at GDPR, so General Data Protection Regulation, which defines what we’re allowed to hold and process about an individual. There are plenty of people who are processing all of this stuff in the cloud already.

So, for example, on the NHS project I was talking about: your NHS number is personal information, your name is personal information, your address, your date of birth, all of your medical records are all personal information. Your credit card numbers, your tax numbers (social security is a bigger thing in the US than it is here, but we have a social security number as well), all of this stuff, and that’s what’s covered under the GDPR.

But there is a kind of defined level for handling it, and you have to meet those regulations on data security to do it. And to a certain degree it’s not unachievable; well, it’s definitely not unachievable, plenty of people have done it. For credit card information, it’s called PCI-DSS, the Payment Card Industry Data Security Standard. The NHS has their own standards, and so on and so forth.

So if you want to handle stuff in this regard, you have to have met a certain base level of criteria in terms of data security, but it’s not actually that high in reality.

You know, I’ve worked, as I said before, in the Ministry of Defence, and the protections that exist around the classified systems for sensitively protectively marked data in the UK are higher than this PCI-DSS or GDPR stuff.

The indecent material, in the UK, it’s an offense in its own right to be in possession of it. It’s definitely an offense to transmit it, and both of those things have to occur in order to use a cloud solution to handle it.

Now, there is an exemption given: that you have a legal reason for possessing it or transmitting it for the purposes of law enforcement.

So, you know, I’ve done very few cases, but in the cases that I have done, I have had a legal reason to be in possession of it for the duration of the case as it was being tried, and thereafter, you know, I have no legal reason to possess it. So it was destroyed, you know, with prejudice, with extreme prejudice.

That exemption is being extended for the cloud solution; it seems the justification is that the need exists, therefore we can do this. That was the understanding I got from Exterro. But it is just purely, you know, that the act of having a credit card number is not an offense in itself, cause obviously you have one, I have one, the world has one.

Alex: Sorry, I suppose, more to line it up: with the indecent material, the same could be said of the NHS holding an individual’s personal medical records. That’s compromising to the individual, it has potential insurance implications for them, and it could significantly impact their social image if a medical condition they didn’t want to disclose to the public came out.

So in that sense, the NHS is going, “We’ve met these regulations, we’re happy to put it in the cloud.” Does it come down, then, to the regulations themselves, to saying the data security bar to store this in the cloud is too low? Like, should lawmakers be making those bars higher for all that kind of material?

Christa: I mean, I think with indecent material specifically, the foundation for those prohibitions is consent. With children, there are really two levels of consent there: first of all, they’re not going to have consented to this horrible act perpetrated against them being uploaded.

And second of all, they don’t even understand things like terms and conditions, to be able to consent on just a normal data-sharing level. Their parents might be sharing their health information, for instance, because we’re, you know, cognitively able to understand that.

But I think that that’s really where the line is. And Si, I think you have a much better understanding of probably where the regulations come down on that. But to me that’s where the bright line really is.

Simon: Yeah, definitely. For GDPR, it has to be a consensual thing, and you have the right to go and ask for your data to be removed. I’m pretty confident I wouldn’t have the right to go and ask the Met Police to remove my data from their cloud system if I committed a crime, or even if I was just involved in a crime. I don’t think that they would be interested in doing that.

So yes, perhaps that’s it. Do I think that the standards should be increased? As a security professional, I think the standards should always be stronger than they are.

In reality, okay, say you’re an incident response specialist: you know that even with the best standards in place, somebody still misses something, or a non-disclosed vulnerability comes out. So, you know, I would definitely be in favor of standards being stronger and more prescriptive.

It’s too easy in my opinion, to pass some certifications by paying lip service to the rules rather than actually understanding the motivations behind them. You know, part of it is like, you know, you must have passwords that are X long or whatever.

The issue is not about how long the password is or anything, it’s about password management, it’s about security through authentication and identification and authorization through the use of a credential that identifies an individual. You know, you can have the best password in the world, but if you’ve written it on a post-it note and stuck it on your screen, it’s irrelevant.

And, you know, the thing is that we need to get a better understanding across the world. We need to get a better understanding in the workplace of how cyber security actually works, so that people understand why they’re doing the things that they’re doing.

The NCSC, the National Cyber Security Centre in the UK has actually made some interesting steps in that regard, because they’ve gone away from this, “Your password must be 10 characters long; it must contain numbers, letters, complex stuff” and all of those things.

And they’ve actually moved to something that basically suggests you should choose three words that are memorable to you. And I’m going to pick on XKCD as the fantastic example of correct-horse-battery-staple, which I can still remember from the cartoon, because it’s so obvious when you actually think about it. You’ve picked three words, and I’ll choose some random ones off my screen to get speaking-knowledgeable-security.

Alex: Now we know your password, Si.

Simon: Yeah, you’re correct. Horse-battery-staple. But, you know, you’ve got, what, speaking-knowledgeable-security: one, two, three, four, five, six, seven, eight, nine… lots. You know, that’s an incredibly long password, with potential complexity that is far higher than is calculable in a reasonable time. And it’s something that an individual can remember without having to put it on a post-it note. And that’s the point, yeah.
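
Editor’s note: a rough worked version of the argument, following the XKCD comparison Si cites. The word-list size and guess rate are assumptions for illustration.

```python
# Entropy of a randomly chosen four-common-word passphrase, per the
# XKCD 936 estimate of ~11 bits per word (a 2048-word list).
import math

WORDLIST_SIZE = 2048            # assumed pool of common words (~2**11)
WORDS = 4                       # the comic's "correct horse battery staple"
bits = WORDS * math.log2(WORDLIST_SIZE)            # ~44 bits of entropy

GUESSES_PER_SECOND = 1000       # the comic's assumed online attack rate
years = 2**bits / GUESSES_PER_SECOND / 31_557_600  # seconds in a year
print(f"~{bits:.0f} bits; ~{years:.0f} years to exhaust")  # ~44 bits, ~557 years
```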

Alex: That’s echoed with the ACSC, so the Australian Cyber Security Center is they’ve done something very similar where it’s, pick a phrase that you can remember that’s long and then couple that with a password manager.

So you have those long, complex passwords everywhere, but you only have to remember that one long phrase that’s individual to you, that no one else is going to guess, and that’s easier to type on a keyboard and remember. Which is, I think, much better than the old password123 for 50 different accounts.

Christa: So I’m curious, though. I mean, that’s password managers, but when we’re talking about cybersecurity and the reasons we do things, there’s still that element of employees wanting work to be convenient for them.

And so if somebody fires up Slack and starts a Slack channel, and it’s not necessarily company-approved but it’s helping them get their work done… I mean, we’re getting off track of digital forensics, but I feel like it all feeds back into the evidence that you’re collecting, whether it’s employee misconduct or a breach response…

Simon: I’ll give you a fantastic example that came out in the news yesterday. Two officers in the Royal Navy were court-martialed (everybody familiar with a court martial? Okay). They were having an affair: a female weapons officer and a captain on a submarine, or on submarines, actually.

Generally speaking, we have more than one of those somewhere. And because they were having this affair, they were communicating with each other to arrange their liaisons via, I think it was just via email, I forget the exact thing, but either way, over public, internet-accessible channels.

But because they were arranging where they were going to be and when they were going to meet up, they were effectively transmitting the movements of submarines, whose locations you’re kind of not supposed to know, across the internet, in clear text, basically. So they were both found guilty of breaching the Official Secrets Act, have both been discharged, and are, I think, possibly facing jail sentences.

But you are absolutely right. At the end of the day, and I’m going to say it’s fairly well known in security, humans are by far and away the biggest threat that we have. Everything from this kind of inadvertent-but-deliberate action, to hitting reply-all on the thing that sends the company accounts out to the entire mailing list, to deliberate actions, to bringing in malware by clicking on phishing links. And, you know, bung somebody enough money and they’ll do anything.

Alex: I think you’re right on there. The human element in a company or an organization definitely is the greatest threat. It’s also, I guess, your greatest asset in defending against that, because humans are the ones generating the revenue, generating the work for your organization. You can’t not have humans, because if we didn’t have them, none of us would be working and we’d all be out of a job.

But organizations need to work with the individuals to come up with solutions for this. So I know from my time in the military, and this may be like that example with the two having an affair: they wanted to keep everything a secret cause it was an affair.

If it was just a normal, “Hey, we’re two people on two different submarines, we want to meet up,” within the military when I was working, we used something very similar to MSN, which was just a messenger, a very casual chat messenger.

So if you wanted to like, catch up with friends or whatever, or you were like meeting up with people, then you could message over that messenger app. But in that sense, the defense force had created an avenue for individuals to have a social aspect within the military that was outside of like all the email systems and all the formal things.

And I think that that extends across all companies and all organizations. So if you need a platform for individuals to cooperate and collaborate on, something like Slack that makes their job easier, pay for the corporate one and monitor it. Make sure you’re monitoring it for, say, individuals copying a whole bunch of data out of it, putting it into their email, and sending it via personal email.

Which is intellectual property theft, if they’re taking that off corporate instances. But if it’s making their life easier, then those kinds of solutions should be invested in. It needs to be a solution where everyone’s involved, and you’re getting feedback constantly from your end users, which is the humans that sit at the keyboard.

Because if they’ve got all the tools they need, they’re less of a threat, because they’re not going to look for ways to get around things that are impeding their work, which is definitely the biggest thing.
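
Editor’s note: a toy sketch of the kind of detection rule Alex describes, flagging bulk copying of corporate data out to personal email. The event fields, domains and threshold are invented for illustration; no real product’s schema is implied.

```python
# Flag users who send an unusually large volume of attachments to
# personal webmail domains within a monitoring window. Purely illustrative.
from collections import defaultdict

PERSONAL_DOMAINS = {"gmail.com", "outlook.com", "yahoo.com"}  # assumed list
THRESHOLD_MB = 500  # assumed per-window limit before an alert fires

def flag_bulk_exfil(events):
    """events: dicts like {"user", "recipient_domain", "attachment_mb"}."""
    totals = defaultdict(float)
    for e in events:
        if e["recipient_domain"] in PERSONAL_DOMAINS:
            totals[e["user"]] += e["attachment_mb"]
    return sorted(u for u, mb in totals.items() if mb > THRESHOLD_MB)

sample = [
    {"user": "alice", "recipient_domain": "gmail.com", "attachment_mb": 620.0},
    {"user": "bob", "recipient_domain": "corp.example", "attachment_mb": 900.0},
]
print(flag_bulk_exfil(sample))  # ['alice']
```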

Christa: There’s kind of a rub there too, though, because if people know that they’re being monitored, then in some cultures, I think, that’s almost more of an incentive to try to go offline and find a way to communicate that isn’t monitored.

Simon: I think there’s also the issue that if you’re setting up something like this, you become responsible, to a certain extent (not entirely), for the things that get said on it. So if you end up with anything that’s defamatory, racist, sexist or offensive, you as a company then have a problem to deal with.

Now personally, I’m of the opinion that, you know, I’d rather that that was highlighted within my company and that I can get rid of those people because I don’t actually want them in my company, but I think there’s a litigious…

Alex: I think you just hit on it right then. As a company, you are on work time; you have the expectation that you are being monitored. I don’t think anyone should be under any illusion that they’re not being monitored while they’re working, because it’s during work time.

If you are making defamatory or racist comments on a work platform during work time, you should be fired. I don’t think anyone can argue against that with, “Oh, it’s in a personal chat.” No, it’s on work time. You shouldn’t be working for that company.

I think monitoring during work time is completely fine. I think individuals can expect their privacy in terms of what they do when they go home, they’re using their own personal email, and this is a whole separate topic, but when governments are like monitoring private chat platforms, I think that’s a no-go.

But in terms of workplace monitoring: 100%. It should be fully monitored, because you’re protecting your revenue as a company, and that also protects the employees, what they’re doing, and the image that projects to the public.

Simon: Yeah.

Christa: Yeah. I think I was thinking in terms of those human relationships, I don’t know if friendships is the right term, but those collaborations that sort of help to smooth the process, and it might be tempting to take something offline and, again, not have it be monitored, just in case somebody, I don’t know, misreads it…

But, I mean, I think it’s part of being human at work to want to just be as informal as possible, obviously knowing that there is all kinds of risk going along with that.

So, you know, is there a reason that you want to take it offline? Do people want to flirt inappropriately, those kinds of things? But in terms of humans being the biggest vulnerability, I think those sorts of human factors need to be part of the conversation.

Simon: Yeah. I mean, it’s definitely something that if you’re drawing up any sort of risk analysis of a given system, human factors are at the forefront and something that you consider.

Certainly when I’ve been doing it in the past for military systems, you are looking at the potential human interactions that exist, whether that’s bystanders, cleaning staff coming in, people you’re interacting with in the sense of third parties, or even your own employees. You’re looking at the potential motivations, levels of access, and impacts that they could leverage against a system, within a risk management system.

Christa: Well, and I think that comes back to what you were saying earlier about the pandemic and remote work. Cause I remember when it all started, people working from home, and suddenly family members have access to company assets and potentially sensitive data. And so that was a thing for a while, and then it became less of a thing, at least in the news, or as I understand it.

Simon: It was a thing; I think it still is a thing, in reality. I mean, full disk encryption has obviously gated forensics to a certain extent, but it has also largely addressed that issue. If I leave my laptop on the train and it’s switched off (and there’s the important phrase, “and it’s switched off”), it’s in a secure state.

And this is actually the stance of the MOD: if we were carrying restricted material on laptops, as long as those machines were switched off, they are about as useful as a doorstop in terms of getting data off them.

But inside the house, you know, I’ve got three children and my wife is here. If I am doing cloud work on a forensic case that contains indecent material, what’s to stop one of the kids coming in and looking over my shoulder? I mean, okay, my study has a door, which I shut if I’m doing anything even remotely like that. But, to be honest, I’ll sit in the living room and type up a report.

And technically, you know, I keep the screen away from people, because my background in security, and in slightly more sensitive security, teaches me that my screen shouldn’t be facing toward anybody at any time.

Well, I’ve sat on the train and read over the shoulder of people doing all sorts of things they shouldn’t be. And we’ve got that in the home environment now. I guess you trust the people that you’re around, but as a company, I don’t necessarily trust your husband, wife or dog not to put whatever you’re seeing on Twitter, especially if it’s something even slightly controversial.

You know, there are companies that do things or release new products. If somebody’s excited that your company is now going to release Widget 3, and the upgrades from Widget 2 are amazing, it’s hard not to just go on to Instagram and post that. Because, going back to social media, a lot of us are fueled by likes and engagement. You know, we’re doing a podcast here, and if people don’t like it, then, you know, I’ll be upset.

Alex: We need those fake internet points.

Simon: Yeah. You wait and see the number of bot accounts that click that they loved this. So, you know, anything that will give that engagement. It’s a whole new threat profile that we’ve had to consider since the pandemic.

And I’m not sure what’s going to happen in the future. There seems to be more of a move to hybrid working. And, you know, I’ve been working in a hybrid fashion for the better part of probably about 15 years now, where I would do a couple of days in an office and then remote.

So to me, it’s nothing new, but to a lot of people, it’s a new thing. And it looks like that may well persist longer term. So yeah, I think it’s something we do need to consider.

Alex: On the company side of things, from my background with the company that I work with now, being in insider threat and being a vendor providing that to companies, the viewpoint is that the shift, if we’re talking about data loss protection, has been from network data, and being centric around that, to endpoint data, because of this pandemic.

So if you take the example of the military: when I was there, if you wanted to work from home, you needed a thin client to run on your endpoint, you had a VPN, and that was all monitored. And if you wanted a certain level of access, to, say, secret material, then you needed to be in a room that was rated to a certain physical standard.

So you had to have physical access control in your house, essentially, if you wanted to access that material. With the pandemic, that shifted so quickly for companies: everyone going home, and the fact that a lot of companies don’t give employees devices that are controlled by the organization; they’re just allowing personal endpoint devices to be used with a VPN connection into the corporate network.

And those hybrid devices can then essentially screen-capture whatever they’re working on. So that’s why there’s been a huge shift to monitoring the endpoint data rather than monitoring the network data, cause you’re sitting outside that network.

And that’s been a big challenge for a lot of companies, cause that shift happened so quickly, and only now, probably two years down the track, are they thinking, “Okay, we’ve shifted; how much data have we actually lost?” Think about employees who have changed jobs in that time, as well: how much have they taken with them? That’s probably a really big unanswered question for a lot of companies.

Christa: I think what that shift to the endpoint really does, and we’re coming up to the top of our time here, so maybe we save the conversation for another day, is impact digital forensics. It impacts the amount of work that you have to do, right? Just that greater volume of data from all of the different endpoints.

Alex: Yeah, and not just the amount of data from the endpoints, but also the legal implications of that. So if companies have allowed BYO devices but have no legal right over them, then if a company needs to do an internal investigation, requesting that data can be quite difficult.

And then by the time they get a warrant, an individual can have wiped their device, and then all that evidence is gone. And in past cases I’ve seen, and this is the funniest thing, companies will always let their board of directors have their own devices, which I think is the worst thing that they can ever do, because they’re the ones, in my opinion, most likely to commit corporate fraud.

Whereas they give all their foot soldiers, I guess, the corporate devices, and then the board of directors are like, “Oh no, we want to use our own device.” But then when it comes to an investigation, they’re like, “No, this is our device. You can’t take this.” And by the time they get a warrant, it’s wiped and you don’t have evidence.

Simon: And what’s worse is that the entire company is standardized on Windows and all of the board of directors have nice new Apple MacBook Pros.

Alex: Nice, Macs.

Simon: And so they’re not running the same AV as everybody else, no anti-malware, there’s no monitoring on them, they’re not set up to log, there’s no corporate logins. They’re just coming in over Office 365, pulling down all of that stuff, storing it on their desktop, and it’s not an encrypted hard drive (actually, on a Mac it is now, but, you know).

I did a job, actually, I was there as a security manager a few years back, but with a forensic background I ended up doing a forensic job on one of the company directors’ Macs. He had left under a small cloud because he’d been caught looking at things he shouldn’t have been looking at during company time.

Which of course wasn’t being picked up by any of the monitoring systems. It was very easy to do the forensic job, though, cause he had it all neatly put away in folders; nothing illegal, but still a very well-sorted collection.

So, yeah. But, I mean, having said all of that, I’m aware of a police force that seized a mobile phone, didn’t secure it properly, and left it on the shelf in an evidence store; the guy walked outside, logged into his iCloud account and wiped it. Because they hadn’t put it in a Faraday bag, they hadn’t protected it properly. So, you know, the issue of preserving evidence is always going to be a fun one.

Christa: Well, I’m going to close it there. Si and Alex, thank you so much again for joining us. It’s exciting to have you both as co-hosts and I’m looking forward to future discussions with you.

Simon: Yeah, very much so.

Alex: Yeah, me too. That was fun.

Simon: Cheers, guys.

Christa: Thanks also to our listeners. You’ll be able to find this recording and transcription along with more articles, information and forums at www.forensicfocus.com. Stay safe and well.

Editor’s note: An earlier version of this transcript misquoted Si Biles as saying he was a “UNIX CIS admin” and has been corrected to reflect the accurate title, “UNIX sysadmin.”
