Christa: When we talk about diversity in DFIR, are we referring to hiring people who look different from us, or do we also mean people who think differently? Welcome to the Forensic Focus podcast, where monthly we interview experts from the digital forensics and incident response community on a host of topics ranging from technical aspects to career soft skills. I’m your host, Christa Miller.
Today we’re talking about diversity, though not just in terms of demographic diversity. We also want to talk about the diverse set of experiences that each practitioner brings to digital forensics, their personal and work experiences, the places they’ve lived, the people they’ve encountered, and the cultures they were born into, or maybe adopted. Our guest this month is Lodrina Cherne, Principal Security Advocate at Cybereason, and a SANS-certified instructor of Windows forensic analysis. Lodrina has over a decade of experience in digital forensics and a lifelong passion for cybersecurity. Her work focuses on preservation and analysis of electronic evidence, including host-based analysis of Windows, macOS, Android, and iOS systems in matters concerning intellectual property theft, employment disputes, and evidence tampering. She earned a bachelor’s degree in computer science from Boston University and has continued her education by earning the GCFE, GCFA, and GASF certifications from GIAC. Before we go any further, I should note that as an independent contractor, I worked for Cybereason, though not directly with Lodrina. With that said, Lodrina, welcome to the show.
Lodrina: Thank you so much, Christa.
Christa: The diversity of experiences was a strong element of your keynote address at the SANS DFIR summit in July, where you spoke about how your experience with bicycle repair, as well as insights from the warfare, firefighting, and medical fields, had helped you to think about and solve digital forensics problems. What is it about this field that particularly benefits from such a wide range of perspectives?
Lodrina: Yeah, that is a great question. So I’ll share a little bit about my own background as I entered the forensics world, and how I got there, and why even to this day, I keep my customer service jobs on LinkedIn as something that I feel benefits me as a forensic examiner.
So when I went to college, I went to Boston University for computer science because I was always interested in computers. However, I found my college experience somewhat challenging, you know, I’m a first generation college student and whether that meant having the right support network or the right financial funding to finish that degree, I decided it was best for me to take seven years, stretch it out and finish my computer science degree, which I am very proud to this day to have done.
And one of the things I got to do in that process, as I finished my degree part time, was work in a bicycle store and in a camping store. And what I think that has helped me do today in my career is really use some of those people skills, some of those communication skills. And all of that has helped me elevate and communicate all of the work that I do technically. Because I might be the best forensic examiner in the world, but it doesn’t mean anything if I can’t communicate my findings to people.
So I could have the most technical knowledge of anybody on something like what a shellbag looks like on disk. And what does that matter if I can’t share with other people — whether it’s examiners that I teach in my classes, or whether it is somebody in the legal profession, or somebody who’s nontechnical in some kind of a lawsuit — that “Hey, shellbags are important because they show what this person was doing on their Windows computer. And we see evidence of what files they were touching in these ways.” If I can’t make that explanation, and if I can’t talk to somebody in their language; you know, I’m even thinking about the bicycle shop example that I mentioned, if I’m talking to somebody who’s a bicycle commuter versus somebody who’s an elite racer, both of whom I had to work with on a regular basis in that retail role, I’m going to need to speak to them differently about what their needs are and what their goals are, use different language, and really cater to the problem they’re trying to solve.
That being said, one thing that I use constantly in my computer science education is problem solving. And we can definitely talk more about that as well, but that’s a little bit of my journey and how these customer service jobs I worked during college, I think, are some of the most important experiences I’ve had. Just as much as my degree, just as much as my certifications.
Christa: That makes sense. And something that came up in your DFIR summit keynote as well was the need to give things meaningful names: words that will apply across experiences and understanding, as you just mentioned with the difference between a beginner bicyclist and an elite racer. Why are diverse perspectives important to language and names, especially when it comes to management of incidents, investigations and teams?
Lodrina: Ooh, naming is such an important thing. And I think maybe I can start with one thing you asked in your question, which is when we’re talking about incidents. And for some people, an “incident” or a “breach” means something incredibly serious. Just saying the word “incident” or saying the word “breach” can potentially mean all hands on deck, not just we need our incident responders and security teams, but we need our legal, we need our communications and marketing and PR and CISO and everybody, right now, to deal with this event. But maybe — depending on the business you’re in, or the challenge that you’re facing — maybe your idea of an incident — and that could be something like somebody misplacing a company phone or computer — is not really an incident to everybody else in your organization. Because they think an incident is an attacker getting into your network and stealing your company secrets.
And I don’t know the right answer, but I think it’s worth acknowledging that there are some terms in our industry, or in these sub industries, that carry a lot of weight. And it’s always worth clarifying, especially when you have those kind of ‘pull the fire alarm’ moments: Are we really talking about the same thing here? Because do you want to tie up all of those executives right now? Or is it something that can wait?
So that’s just one example. And you know, if you’re somebody who is an incident responder, does your legal team, communications team, or executive board know the severity of the things you’re working on and the incidents that you are involved in? So I think from a technical perspective versus a business perspective, it’s something that we want to acknowledge, to make sure that security is valued and that the security work we do is highlighted and prioritized at times we need it to be.
Christa: So I think that goes to another thing that you had talked about in that keynote, which was about solving problems that we’ve never seen before, as well as recognizing problems that we may not even be aware existed. So as both technology and digital forensics evolve, what are the risks that those types of problems pose, and how do diverse perspectives help to solve them in addition to communicating them?
Lodrina: Yeah, that’s a great question. Let’s see if I can remember one of the examples that I gave during my keynote at the DFIR summit, I just love it so much.
One of the examples I gave was from a book called Range by David Epstein. And I love David Epstein’s writing. David Epstein has written a lot about sports and human performance, which is another love of mine. One other thing I do is powerlifting: I’m in the gym a lot, and I find that the kind of pushing through barriers I do there helps me with the technical issues I face, but that’s another story.
Back to Range. One example in that book I love is this theoretical example of solving a medical problem, which is: let’s say that you’re a doctor that has to treat a patient with a tumor, and you have a laser that will eradicate the tumor. However, any healthy tissue that laser passes through is going to be damaged on the way to treating the tumor. However, if you can hit the tumor with a low dose of the ray, you won’t damage the living tissue, but the low dose ray isn’t effective to completely eradicate the tumor. So how do you treat this patient?
And then Epstein goes on to give us two examples from other parts of the world. Things like firefighting, and things like strategy and war. So the firefighting example is: say that you’re a fire chief who’s called to a shed fire. You get there, and all of the local residents have this bucket brigade going, passing buckets of water up from the lake and throwing them on the shed, and they’ve been at this a little while and it’s just not effective. You, as the fire chief, come in and you tell everybody: “Stop. Everybody, let’s go down to the lake together, everybody fill up your buckets, and now let’s stand in a circle 360 degrees around the shed. And on the count, let’s all throw our buckets of water on the shed fire.” And you, one, two, three, hurl your buckets of water onto the shed, and the fire gets put out.
So, a different problem solving scenario is: say that you’re a general leading your troops in to capture a fortress, and you can’t take your whole platoon down the main road that leads to the fortress because that road is booby trapped with mines. And you simply have too many people in your unit to be able to safely navigate through the minefield. So what you decide to do is split up your platoon into small groups, and send them down all these little wagon wheel roads that approach the fortress. And these small groups are able to successfully navigate all of the additional mine-laden roads that lead to the fortress. And you’ve decided at a predetermined time that you’re going to attack. So everybody synchronizes their watches. They split up into different groups. They attack on these small wagon wheel road approaches, and the attack is successful.
So now, having heard these two stories, think about your role as the doctor trying to treat this tumor. The full strength ray is effective against the tumor, but it hurts living tissue. The low dose ray isn’t effective on the tumor, but it preserves the living healthy tissue in between. The strategy that you might have come up with from hearing those two stories, the one about firefighting and the one about war strategy, is that you can apply the ray at a low dose from 360 degrees around the body, eradicating the tumor while preserving the healthy tissue.
So that was one of the examples from different worlds that I brought into this summit speech that I gave. And again, whether it is treating a patient, strategy and the art of war, or firefighting, those seem like completely disparate fields. But even with these disparate fields, I think you can see how there’s some value to problem solving, and applying that problem solving activity from other fields.
So I don’t know what the exact analog would be for something in the digital forensics incident response world. But I think that we’re going to find, the more we bring in people with different experiences, the more we might be able to solve problems that we didn’t even know we had. And at the rate that our field is evolving, this is going to be crucial to being able to defend systems, to stay ahead of attackers and really address the bleeding edge of technology.
Christa: And as a SANS instructor, as well, how do you encourage students to either develop or seek out those diverse perspectives, whether they’re in your classroom or following the training itself?
Lodrina: One way I’ve been trying to encourage diverse perspectives is what we’re doing right here, which is sharing my own journey. I’m somebody who, you know, has had a decent forensic career, and I’ve been working in the industry for over a decade. It’s really only recently that I’ve started teaching, and started getting more into the community, that I’ve been sharing my experiences like I did at this year’s DFIR summit. And that’s both my experiences in cases I’ve solved, but also my experiences with, you know, how important fitness is to my life. And by the way, maybe I can’t take that meeting late in the day because I need to go to the gym and meet my training partners. You know, and just being very public about how all of the things that I do when I’m outside of the keyboard or away from the computer… really sharing how those things are important to me and help me recharge and be a better investigator when I come back to the keyboard.
So, you know, being mentally rested and fresh, and just personally hoping that I can lead by example, you know, sharing books that I love to read and how you don’t need to go to bed at night, hugging the Pen Tester’s Bible, you know. I think the more that I can share about my own experiences, the more I hope that other people can see themselves and their own experiences and their interests as part of the forensic field.
Christa: And on that note, I also wanted to ask, in terms of management and leadership, as you’re describing, tell me more about Cybereason’s #YouBeYou value. What has been most valuable about that both to you personally, and to the people that you work with on your team?
Lodrina: Yeah, I am so glad you asked about this. So, in my work in the forensic world I’ve decided to try and amplify all of my technical expertise by moving into a more business focused role, which I’ve been able to find at Cybereason. So as a startup, I’ve been able to do some really cool projects, and really do some self-directed projects that I think are important to the business and important to our industry, that nobody has asked me to work on.
So this ties a little bit into one of our company values called ‘You Be You’. And out of that value, which by the way, in our company, we spell out with the letters U-B-U, and those three letters simply mean: “Hey, bring all of yourself to work.” You know, you don’t need to look like this person sitting at the desk next to you in the office, back in the days when we were in the office all the time.
You know, you don’t need to look like this person. You don’t need to have the experience of somebody who’s sitting across the table from you. Bring yourself to work. And whatever those experiences are, whether you went to college or you didn’t; whether you’re a parent; whether you’re not; bring yourself to the table. So something that I care about, and that I took the initiative to pursue at Cybereason, and was fully backed by the company when I brought up how important this was to me, was thinking about the language that we use in our workplace, and how can we use more inclusive language?
And especially in this past year, we are experiencing one of the biggest social justice movements in history. When I talk about social justice, I’m thinking about George Floyd and all of the reaction that we’ve seen, with people in the streets fighting for social justice and equality.
Now I’ve been able to take that into the workplace and, you know, in the workplace, maybe it’s not a social justice issue. Maybe it’s a corporate social responsibility issue, to make sure that we’re seeing the people that we work with and the people who are our customers as respected peers. And in support of that respect — in support of trying to build an industry where we value everybody that we work with, no matter what their background is — I was able to bring to light that there’s this discussion in our industry about the language we use and removing terms that might be less inclusive.
By that, I mean less inclusive of people who are traditionally othered in society. So I’m talking about terms like ‘master/slave,’ I’m talking about application control terms like ‘blacklisting’ and ‘whitelisting.’ And if this is a topic that’s new to anybody listening, there’s really been a huge groundswell, especially this year, though it’s not new; it’s something that has been quietly happening over the past few years, where large parts of our industry acknowledge that maybe instead of ‘master/slave,’ we could use something that’s even more descriptive without relying on language that has roots in oppression.
And I think ‘master/slave’ is the clearest one when we’re talking about slavery and enslaved people. So maybe instead of those terms, we can use things like ‘primary’ and ‘secondary,’ and maybe instead of ‘blacklist’ and ‘whitelist’, that rely on outdated assumptions about black being bad and white being good, maybe we can talk about ‘block lists’ and ‘allow lists.’ And certainly depending on the sub industry that you’re in, instead of ‘primary’ and ‘secondary’, we might be talking about ‘parent’ and ‘child,’ we might be talking about ‘controller’ and ‘subcontroller.’ There are definitely variants here.
I think the important thing is to acknowledge that there are many ways that we can welcome people who are diverse in being, or in thought, or in their person. And I think it’s important to acknowledge that just as our culture is always evolving, our field is always evolving, that language is evolving with it. And it’s incredibly important for me to promote inclusive language. The U-B-U effort at Cybereason is one way I was able to bring that in to our company and really have it not just be accepted, but celebrated.
Christa: That’s wonderful. On a related note, I wanted to ask about… your current role is as principal security advocate. In December at the RSA conference, you delivered a presentation titled: ‘Tracking every move: From location-based apps to stalkerware and advanced attackers.’ So kind of in a similar vein as the social justice and developing inclusive language, talk to us about how the homogeneous approach to app development has created these vulnerabilities, not just for the apps and the devices, but also for the people using them.
Lodrina: Absolutely. So let’s start with one that’s fairly ubiquitous, probably for many people listening, which is social media apps. And I will mention, I am based in the United States, so at the time of this recording, we’re gearing up for a big election season at the end of 2020. What we’re seeing in social media, whether it’s politics, or the experiences of women online, or vulnerable populations, is a lot of reporting: the ability for users to report either things that make them unsafe or, when we talk about politics, things that might be untrue.
And I think if you look at the early founders of these popular social media companies or applications, you see a fairly homogenous founder culture, and maybe homogenous boards or programmers. And we’re really seeing this groundswell right now: people who are vulnerable online, and misinformation online, are two huge topics that the architecture of social media on the whole is not really ready to address in meaningful, broad ways. And I think it’s something where we’re starting to see some apps come up now that are aimed at helping people manage their social media. And this is, you know, a topic that is very near and dear to my heart, as I wrote about for RSA: things like stalkerware, things like keeping people safe online. If you had more people in the room when a social media app or a stalkerware app is being created, would you have people in that room with the ability to raise their hand because of their different experiences and say, “Hey, maybe this isn’t such a good feature. Maybe we shouldn’t implement this tracking ability in our program.”
Or maybe, “When we roll out this new feature, how are we going to think about ways people abuse this technology?” I don’t know that we have enough people in R&D roles, in management roles, who are willing to raise their hands and say, Hey, how could our technology be used in a Black Mirror episode? And how can we just consider that, as we think about this really cool techie thing that we’re working on? I think it’s really important to get those other experiences in there, so that we have those people who raised their hands and say: “You know, it’s really cool that we can cryptocurrency block chain this authentication thing, but how do we make sure that this technology is benefiting everybody in our society? How can we make sure this technology isn’t going to be abused?”
And I think when we get more people with, again, diverse experiences in their lives, then we can start asking questions that maybe weren’t asked early on at a lot of social media companies, or some of these applications where they’re developing tracking software. Can it be used for stalking? Is it really stalkerware?
I don’t know how to get more people to do that, but I think bringing awareness to it is one place that I can help that effort.
Christa: From a digital forensics perspective on this — and this might be one way that examiners in the community could also help — from the community’s perspective, digital forensic investigators benefit from the data left behind, although that may be after an incident that didn’t benefit the user. So where do you think the balance is between being able to access more data to investigate, versus advocating for less data to protect the user to begin with?
Lodrina: So this is a great question, because… I want to frame it in the context of something else that came up constantly during the latest digital forensics summit. This is a concept that’s been dubbed DFIR For Good. So hashtag #DFIRForGood. It’s something that was echoed in a lot of talks. So we have this incredibly amazing set of… when you think about it, really, superpowers as forensic investigators. And we can look into the past, we can see what was going on, even when people using their computers, using their cell phones, doing things online, didn’t intend to be tracked. You know, and we put together this digital crime scene after the fact. That’s our whole job.
And we’re starting to open this conversation about: How can we do digital forensics for good? How can we make sure that all of the data that’s left behind is being used for crime fighting? Is being used to help people in need? How do we balance this world where we have an increasing amount of data and an increasing need for privacy? That is also a huge question.
For that issue, we’re starting again with awareness. And I’ve been talking a lot to different user groups, whether they’re technical user groups or college computer clubs, and just sharing with them: “Hey, I work in forensics. This is the kind of stuff that I can uncover as a forensic examiner. And here’s how, as a user, you might be able to tell that your devices are holding onto some kind of information for you.”
So here’s a really easy example: I have a very unique name. I don’t know any other Lodrinas. And every time I get a new cell phone, or every time I use a new computer and fire up my email client, I need to teach my computer my name. I need to teach it: “Hey, when I type ‘L-O-D-R-I-N-A,’ I really mean it. That’s not a mistake. I don’t want the red squiggly line or the autocorrect. Like, I’m very protective about my name.” But the mere fact that my computer now remembers how my name is spelled, or my cell phone now remembers to not autocorrect my name to something else, well, if your computer’s remembering it, or your phone’s remembering it, that means that forensics can uncover it. There’s data being stored somewhere. And yes, it’s very helpful to me as a user to not have to, you know, un-autocorrect my name every time I type it. But that my computer remembers it means that forensics can recover it.
So yeah, we do have to acknowledge there is more and more data in this world, period. And even if we’re good about limiting our own data that we save in applications, you know, there’s data out there in the world. Whether it’s banks and credit scores, or just other places collecting data about us. One of the best things that we can do as individual users and as forensic professionals is to acknowledge, yes, there is a need for privacy, however, just… getting some education out there.
And like I said, I’m talking to college computer clubs and user groups, but maybe a place that we can bring this beyond the digital forensics field and beyond computers is: can you talk to your local scouting group? Or, you know, your local group of young people in whatever community you might be connected to? Or some community group, and share with them, “Hey, I work in technology. This is the kind of data I’m exposed to, and these are ways I keep myself safe.”
So again, I think, yes, there is increasingly more data in this world. Yes, particularly in the world that I live in, in the United States, there’s a lot of talk about an increased need for privacy, which I also personally believe in. I think the way to balance that is awareness. So awareness and education, and bringing that information and awareness into different communities outside of our digital forensics world, is how I’ve decided to help tackle that balance and that struggle.
Christa: What would you recommend for listeners who might want to either get involved in their own way, or might want to find something else to contribute to the DFIR For Good initiative? What are some ways that you can suggest for them to discover really what their passions are in the space and turn them to doing some good above and beyond working cases?
Lodrina: Yeah. I have loved seeing some of the conversations online, and particularly on Twitter, around DFIR For Good. Some of the early conversations that we’ve been having online have been reaching out to those local community groups. And again, whether it is a group of young people in your community who you have access to; people who might be interested in securing their own lives, whatever group that is; if you can share your technical skills with them, that is a great start.
Specifically, many of us in forensics work in the world of either law enforcement or supporting cases that might go to trial. If you’re willing to contribute your technical skills, there are going to be people… let’s say in the United States, we have federal public defenders’ offices, and maybe you’re able to show, using an example that’s very personal to me, thinking about people who might be experiencing some kind of tracking or cyber-stalking. Maybe you can look at their devices and confirm: “Yes, I do see that’s what’s going on in this case. And here are some technical artifacts that you can take to court with you.” And maybe you’re willing to do that work pro bono: for free.
So whether it’s education in community groups, or reaching out to people in the legal system who might not have the monetary resources to afford forensic services, those are great places to start. I have been hooked into an organization called Operation Safe Escape. And this is a group of people across the tech community who work with people experiencing technical tracking and abuse. There are also other resources out there: the Electronic Frontier Foundation has a mailing list called Cooperating Technologists. If you’re somebody who is in the US in particular, a lot of the work I see comes through that list.
There are organizations who send out calls for, “Hey, we need people with this kind of digital expertise or forensics expertise.” Would you be willing, maybe pro bono or for a reduced rate, to volunteer your expertise to somebody who otherwise couldn’t afford digital forensics? And I think reaching out to groups that you might already have relationships with, or looking at some of these organizations that ask for technical volunteers, are great ways to get involved.
Christa: Lodrina, those are fantastic observations and insights. Thank you so much for joining us today on the Forensic Focus podcast. For our listeners, you can find more articles, information and forums at www.forensicfocus.com. If there are any topics you would like us to cover, or if you would like to suggest someone for us to interview, please let us know.