Si Biles: Okay, friends and enemies, welcome to the Forensic Focus podcast. Today, we are very privileged to have with us Eva Galperin and Emma Pickering, who are joining us to talk about some of the aspects of DFIR in domestic abuse and how technology plays into it.
Ladies, if it’s quite all right, would you be so kind as to introduce yourselves and tell us who you’re working for currently, and what your areas of interest are where they cross over with ours, which is digital forensics and incident response?
Eva Galperin: Hi, my name is Eva Galperin. I’m the director of cybersecurity at the Electronic Frontier Foundation. I’m also a co-founder of the Coalition Against Stalkerware, and I spend an alarming amount of time doing abuse cases for victims of tech-enabled abuse. Prior to specializing in tech-enabled abuse, I spent a lot of time trying to protect journalists and activists from government spying, and it turns out that these are people with extremely similar problems. So there’s a lot of stuff that carries over from that experience to this experience very easily.
Emma Pickering: Hi, I’m Emma Pickering, and I work at Refuge. We’re the UK’s largest single provider of domestic abuse services, and my role is head of Tech and Economic Abuse. I oversee the technology-facilitated abuse and economic empowerment team, and we directly support survivors of tech-facilitated abuse. My team works on campaigns and legislation. We look at making sure that survivors can enter into a refuge and safe accommodation safely, because we now see a number of ways in which they can be tracked and monitored. We also work in partnership alongside the survivor, and with children as well, because we recognize that children are very much victims of tech-facilitated abuse too.
As for my areas of interest, I’m also a working member of the Coalition Against Stalkerware. Unfortunately, we see an increase in complex tech abuse concerns and stalkerware in the UK, and I’m currently researching my thesis as well, which looks at tech-facilitated abuse in domestic homicide reviews: identifying the patterns in cases where somebody has, unfortunately, been murdered and tech-facilitated abuse hasn’t been identified by agencies.
Si Biles: Okay. You are currently in Australia, I understand, and you said that you were there for work purposes. Is that facilitating your current research goals? Or is that related to something else?
Emma Pickering: Yes, I’m here because I’ve secured a Churchill Fellowship. With that, I’m looking at digital forensics and best practice across different states, different countries and continents. So, I’m currently in Australia speaking to police, legal firms and tech researchers, and then I’ll go to Estonia at the end of October. I’m looking at how they support survivors in their country, because it’s obviously a completely digitalized society, and then I’ll complete hybrid learning in November.
Si Biles: Okay, amazing. How did the two of you come to be talking to us together today? What’s your relationship with each other?
Eva Galperin: We have worked in the same circles for many years on various campaigns and also on tackling the broader problem of tech-enabled abuse, because one of the interesting things about stalkerware as a topic is that people get really interested in stalkerware, because stalkerware is very scary. It completely takes over your device. Every time that a victim of tech-enabled abuse feels that they cannot trust their devices, they feel, “Oh, it must be stalkerware.”
And also, people who do news programs and who write science fiction and things like that find that these are very sexy topics. But for the most part, when we look at tech-enabled abuse, it’s usually not stalkerware. And even when it is, stalkerware is only a single component of a much broader campaign of tech-enabled abuse.
So, Emma and I have worked together for many years, on and off, on various things, looking at the different things that abusers do in order to control their victims. That goes beyond stalkerware. It includes Bluetooth-enabled physical trackers. It includes account compromise. It includes all kinds of IoT devices. And it also includes bringing in friends and family members, and group stalking, and often exploiting children who are going back and forth when the parents in the abusive relationship are separated but still have to co-parent a child. Often, the children’s devices are used as a way of continuing the abuse.
We’re looking at a very broad range of tech-enabled abuse and how to fight that because just focusing on stalkerware is really looking at the problem in a very myopic way.
Si Biles: And in that regard, obviously, like you say, stalkerware has a massive public-interest story to it. But Apple iPhones and Find My are a very useful thing, and lots of people are joining shared accounts and ticking the box to share all of that information. Roughly what is the breakdown, percentage-wise, if you know, between actual purpose-built stalkerware malware and just misapplication of legitimate tech?
Eva Galperin: It’s hard to know. I could tell you the percentage of people who come to me. I could tell you the percentage based on the number of detections of stalkerware that companies that are in our coalition have detected. But it is different across different geographic regions and also across different socioeconomic classes and strata. So I think it would be really difficult to generalize. I think the only generalization that I would be comfortable making is that, in cases where you find stalkerware, it is almost never just stalkerware that is being used as a tool of abuse.
Si Biles: So it’s a more advanced level of manipulation of an individual, perhaps you would say? Because obviously they’re using the lower levels.
Eva Galperin: Sometimes.
Si Biles: Yeah. Okay.
Eva Galperin: Sometimes. One of the things that I think a lot of people really miss about stalkerware is that commercial stalkerware is stupidly easy. It is meant to be usable by a non-technical person. People are frequently very scared of stalkerware, because it gives you essentially root access to the phone, and it is extremely powerful. And they associate that kind of power with people who are very technically skilled. What stalkerware does is it puts that power into the hands of non-technically skilled people.
Alex Desmond: When you were explaining the bigger gamut of techniques that people use to facilitate this tech abuse and stalk their victims, MITRE ATT&CK popped to mind, in terms of there obviously being stages people go through to stalk their victims. Is there any consolidated list, from the foundation or anywhere else, that explains the different types of abuse victims might experience, whether it’s evidence on their phone or evidence otherwise, that would alert them, if they weren’t aware, that they’re potentially being stalked by their partner or family member or someone else?
Emma Pickering: I don’t mind answering. From our perspective in the UK, when we’re supporting survivors, the first thing that we do is try and establish a safe way to make contact with them. What we’ve identified is that a number of agencies will just go ahead and make contact. They’re becoming more aware that they can’t have conversations over the phone, because survivors are quite alert to the fact that the phone could be monitored. But what we’re finding is that agencies are then contacting them via email, which is just as concerning, because the perpetrator is still going to be able to access emails.
What we’ll do is make safe contact. We hand out a number of burner phones to survivors every week, and then, after we’ve set up safe contact methods with them, we’ll complete a tech assessment. We’ve designed these ourselves at Refuge, and they enable us to unpick everything that’s happening and try to find patterns of abuse: trying to find devices and accounts that the person has in their name or that are associated with the perpetrator, or the children’s devices and accounts. And we have to go through things in exact detail as well.
And then what we do is create a bespoke safety plan. I think that’s the key there, is bespoke, because sometimes people try to have a very generalized approach to it, and the same can’t apply to everybody, because every person’s situation is very individual. One person, for instance, may still want to remain with the abusive partner for a period of time until it’s safe to flee, so we need to work with them for a number of months to put safety measures in place so they can leave safely. Or they may have left already and they need to ensure that that person can’t track and monitor them to the next location. The advice that you would give each individual is very different because their situation’s different. It’s really about having that bespoke information with a specialist agency that understands the nuances around tech facilitated abuse.
Alex Desmond: So even though it’s unique to each person, and depends on what they want to do, are the underlying patterns of the tech abuse that you find with the abusers similar? Is there a pattern there? Like, do they escalate from eavesdropping on conversations first, to account control and looking at emails, with stalkerware as the thing it escalates to in the end? Or does it drop in at the start?
Yeah, I’m seeing some shaking heads. So-
Si Biles: I was going to say, unfortunately the camera doesn’t track movement, it only tracks your voice, but Eva is shaking her head quite emphatically, for the people who can’t see that.
Eva Galperin: Yeah, there’s no escalator of abuse, where it starts with one thing and then it moves onto a bunch of other stuff and that each step is more invasive than the last. That’s simply not the case. You cannot predict what direction it’s going to go in. You can’t predict what kind of tracking they’re going to do. The only thing that you can predict is that the abuser is lazy. They will not go through more trouble than they absolutely have to in order to maintain control of their target, their victim.
Having said that, there are abusers that will go through a tremendous amount of trouble in order to continue abusing, controlling, or harassing their victim. And if you are providing support to a survivor, you don’t immediately know how far the abuser is willing to go, which is a really big problem in trying to put together a safety plan for them.
That’s one of those times when you really have to trust the survivor. You need to leave the decision up to them. “Do you want to take the stalkerware off the phone? Do you want to confront the abuser? Do you want to take away their ability to track your location, and therefore tip them off that you are onto them? Do you want to leave your abuser? Do you want to go back to your abuser? Do you want to lock them out of your account? Do you want to give them your password again?”
One of the most frustrating aspects of working with survivors of abuse is that leaving an abuser is not easy. On average, a survivor will try to leave their abuser at least seven times before it actually sticks. So, if you are doing this kind of work, and you are not willing to see the survivor go back to their abuser repeatedly, this is not the job for you.
Emma Pickering: I think that’s interesting, because looking at the domestic homicide reviews, and this is only very much a UK perspective, what I’m identifying as well is survivors having to go back because of the level of monitoring and tracking and coercion and blackmail. I was reading one report where a victim unfortunately took her own life, because every time she tried to leave, he’d monitored everything, so he knew her plan.
Every time she went out the front door to carry out that escape plan, and she’d been involved with numerous agencies, the police, domestic abuse organizations, he could rumble her plan and turn back up again. He worked abroad. He would divert planes, he’d change his business plans, and he’d come back home. She was never able to safely leave, and her only option at the end, she thought, was to escape by taking her own life, because the monitoring was so severe. So I think Eva’s right. It’s about reminding people of the lengths that a perpetrator will go to in order to monitor and surveil and blackmail and coerce somebody.
Si Biles: You’ve suggested the option of not doing something, of not taking the stalkerware off or closing the accounts or anything like that. Is there any concept of misinformation, of feeding false information to enable an escape? Does that exist within your security plans and your strategies?
Eva Galperin: Oh yeah, that’s incredibly common.
Si Biles: Okay.
Eva Galperin: Yeah. In fact, one of the things that I sometimes do when helping to create security plans with survivors of abuse is I introduce them to the concept of a barium meal.
When you have information that’s leaking and you don’t know the source of the leak, what you do is you feed all of your potential leaks slightly different information and you wait to see what information the attacker acts on. I’ve definitely gone through this process with some survivors before. Spycraft turns out to be surprisingly useful in this setting.
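(For readers who want to see the barium meal idea concretely, here is a minimal illustrative sketch in Python. The channel names and the planted detail are entirely hypothetical; the point is simply that each suspected leak channel receives a slightly different version of the same story, and a private record is kept of which variant went where.)

```python
# Minimal sketch of a "barium meal" / canary-trap test, assuming you can
# vary one harmless detail per suspected leak channel. Channel names and
# the planted detail below are hypothetical examples, not a real protocol.
import json

CHANNELS = ["shared_calendar", "email_draft", "text_to_friend_a", "text_to_friend_b"]

def seed_variants(base_detail: str, channels: list[str]) -> dict[str, str]:
    """Give each channel a slightly different version of the same story,
    keeping a record of which variant went where."""
    return {ch: f"{base_detail} at {10 + i}:00" for i, ch in enumerate(channels)}

ledger = seed_variants("dentist appointment on Friday", CHANNELS)
print(json.dumps(ledger, indent=2))
# If the abuser later acts on the 12:00 version, the leak is whichever
# channel received it (here, text_to_friend_a). The ledger itself must be
# kept somewhere the abuser cannot read.
```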
Si Biles: That’s fascinating. Do you extend that to the ultimate degree, of feeding in false data, in the sense of we are able to manipulate things like GPS tracking and stuff like that? Is that a length that you are able to go to? Or is that…
Eva Galperin: I have rarely suggested feeding false GPS data, but I have suggested simply handing a tracker off to someone else, waiting to see whether or not the abuser shows up in the other place, making false calendar entries to see whether or not this causes the abuser to show up in a specific location. So yeah, there are all kinds of ways to feed false information to abusers. But again, you absolutely have to center the survivor and their understanding of how much risk they’re willing to take and how far they are willing to push their abuser. I don’t ever want to do anything that makes the survivor uncomfortable, that they think that they should not be doing.
Si Biles: Yes. Obviously, you deal with a risk profile that is considerably higher than the one the average forensic analyst or incident responder deals with. I mean, I’ve done domestic abuse cases, but only after an arrest. From my side of things, it is purely a forensic exercise of looking to see what has happened previously, as opposed to actually attempting to do anything actively.
Eva Galperin: The technical aspect is, in some ways, very different. Because when we do technical work, when you’re trying to lock an attacker out of a system, the assumption is that once you have locked them out, they’re out. I mean, maybe they will try to get back in, but there is nothing to be gained from letting them stay in there.
Si Biles: There’s certainly nothing to be gained from letting them stay in there. But I mean, [inaudible 00:18:38] going to say, Alex works in incident response, and I was talking with a colleague earlier today about an incident response whereby a company has rebuilt their IT systems five times, and they got locked out of their own building this morning because the attacker is still inside their systems.
And you must be facing the same sort of information security problems with persistence, the advanced persistent threat stuff, as we do. You do an account reset or password reset, and it goes to an email address that is already compromised, and immediately you are into huge problems. It must be a very broad field that you’re operating in, in that regard.
Eva Galperin: An APT that is motivated by a psychotic need to control you or gain revenge on you is exceptionally scary.
Si Biles: Yeah.
Alex Desmond: How common is that, that you would come across? You said that it differs across socioeconomic lines, and you’re probably dealing with a wide range of abilities, with some people being very non-technical but trying to control, and then that end of the spectrum where you’ve got quite an intense, intelligent person stalking someone. Is that common? Or, probably not common, but are there many cases where you come across someone who’s very intelligent and technically savvy, stalking someone and abusing tech that way?
Eva Galperin: I’ve come across people who are extremely persistent and who are sometimes technically savvy, but I would not describe them as particularly smart.
Alex Desmond: Okay.
Emma Pickering: We’ve started to create a spreadsheet of particular individuals we need to be mindful of, because obviously we associate with tech developers and different platforms, and we have a number of perpetrators who are in high-profile roles. I don’t think it’s just a level of intelligence. I think it’s also their networks, their ability, their contacts, their money. It’s a number of things that make them quite powerful and give them access to a victim in a way that other perpetrators may not necessarily have. So I think it’s a number of things. I don’t think it’s just their level of tech knowledge that makes them such a threat. It’s sometimes their connections and their money.
Sometimes they just pay somebody. And also, what we are seeing in the UK is a lot of proxy stalking and harassment. There are particular websites, blogs and pages that perpetrators have created where they will tag-team, so they’ll have a rota for harassing and abusing their partners and each other’s partners. That way, they can take a step back, concentrate on a piece of work, do something with their family and friends. They’ve got Tuesday free, because Saturday’s dedicated to harassing multiple people across different platforms. But that also makes it difficult for police to try and track down who the perpetrator is, because there are multiple perpetrators. Multiple accounts.
Si Biles: That is absolutely terrifying, and the first time I’ve ever heard of that. Obviously that’s deeply shocking to any sane, normal human being, that people collaborate like that to do something like that. I’m a bit at a loss for words. I wasn’t expecting that.
Eva Galperin: Oh, yeah. There’s entire forums devoted to this.
Alex Desmond: Yeah. I saw the last interview that you did with Christa, Eva. I think it would’ve been a while back. Oh no, 2022, so not too long ago. Christa quoted some stats on female versus male victims, and it was more balanced than I expected. Are there any updated stats on male, female and undisclosed victims? And then also, one of you mentioned right at the beginning that children are also targeted, and I assume there’s a difference between children with adult abusers and children being stalked by other children. Is there any picture of what it’s like out there at the moment?
Emma Pickering: For us, because we’re a charity dedicated to working with children and women, we really only see female survivors come to us. With children, it’s obviously very mixed; there is no gender bias there. Both male and female children are abused by the perpetrator, the abusive parent, handing over gifts of technology, tracking devices and monitoring equipment.
So I can only give you the perspective of what we’re seeing in the UK, which is that there very much seems to be a gendered angle. What we’re also identifying is that females seem to be disadvantaged because they haven’t set the tech up. They’re very unconfident about navigating their tech safely. Most contracts and most information, most tech, seems to be in the male’s name and under his responsibility, so he has most of the access. That’s just the perspective that I can give.
Alex Desmond: Emma, with the kids that you’re seeing, do you see a split between adults and their peers stalking them?
Emma Pickering: Predominantly, what we’re seeing is around child contact and conflict. When there are court orders in place, perpetrators are giving children GPS trackers to then try and monitor and find where they’ve moved to. But we’ve also got youth tech leads, so we’re supporting young people, because there’s a normalization around monitoring and tracking.
And also, in terms of stalkerware, what we’re seeing is a lot of parental apps on the market under the guise of, “Monitor your child, keep them safe.” If you look at the features, they all have hidden stalkerware features within them. So I think that’s something we’re unpicking as a real issue here as well.
Alex Desmond: Okay. Yeah. And Eva?
Eva Galperin: I get a wider variety of people who come to me, but again, this is a self-selecting sample of mostly English-speaking people, mostly located in the United States. And of the people who come to me, I would say about two thirds are women and about one third are men. I see all kinds of stalking. I see men stalking women, I see women stalking men, I see men stalking men, women stalking women, non-binary people stalking each other.
The whole gamut happens, but the majority of the abuse is male perpetrators and female victims. I would say, more than 50% of what I see is that. I try really hard not to frame this as a women’s issue, because I think it makes it a lot easier to dismiss, and also it silences the male and non-binary survivors of abuse and really creates this stigma that keeps them from coming forward.
Si Biles: One of the things that you’ve mentioned several times is the concept of tracking devices. Now, I mean, I’ve been kicking around the industry long enough that you used to be able to buy things and stick a SIM card in them and throw them into the boot of a car with a magnetic thing and a couple of batteries and it would track for two weeks before it went flat.
Now we have Apple AirTags, which have year-long batteries, tiny little things that are not using 3G or a mobile phone network for their location. They’re pinging off other devices. I guess there are two questions. One: am I right in assuming that these, and tracking devices generally, are being used more frequently now than ever before? And secondly, and this is a slightly contentious question, I suppose: are the tech companies doing enough to prevent misuse?
This applies to the tags and to the technology implemented elsewhere in the mobile phone ecosystem, largely, I imagine, but also all the other areas.
Emma Pickering: No, no, they’re not doing enough. They’ve got a long, long way to go. I was just looking at the iOS updates for this year, just to see if there’s any features on there that we need to be concerned about. And again, there are a number of features that we are concerned about, like every year.
I think, when people design tech… And I understand, you want to make your life more accessible, easier. People want this; there’s a driver there. But sometimes there are certain features that are designed where you’d think they’ve gone into a prison and said to perpetrators, “Design this,” and then they’ve put it on the market. It’s that horrific. Ring-fencing features, for instance, are a real bugbear of ours. And everyone talks about Apple AirTags, and I understand that, but that takes the emphasis away from the other forms of tracking and monitoring that we’re seeing as well.
I think, then, that we need to be mindful that we’re only focusing on one element where there are multiple elements of tracking and monitoring. Smart cars are a huge concern for us, and they will be in the coming years. They’re all GPS-enabled. They have apps. You don’t necessarily know who’s connected to that app. You can ring-fence a location. It gives you real-time movements. You can make changes. With mine, for instance, my husband, if he wanted to, could make changes through my app and make me stop the car while I’m driving it. It’s just being mindful that there are different ways now in which somebody can track and monitor.
Si Biles: Yeah, no, I do appreciate that AirTags are the easy commercial thing to wave around. And yes, indeed, the Find My iPhone thing has been kicking around a heck of a lot longer, lurking in the background of your iCloud account, and has always been available to track your phone.
Eva Galperin: Well, we’ve done a lot of work with Apple on helping people to clarify what information they’re sharing and who they’re sharing it with. Not a lot of people know about Safety Check, but if you go down into your menus, you can actually run a safety check on your phone, which is geared towards these close-access domestic abuse data sharing and data leaking problems. That’s really helpful. And I want to give Apple credit where credit is due, because I spend a lot of time yelling at them.
In fact, I will now yell at them about iOS 17, because one of the features that they have introduced allows you to share the tracking of an AirTag. So now we can have several people stalking you with just a single AirTag, and you will not discover who those other people are as a result of your security alert. Your AirTag alert will only show you information about the person who is the primary owner of that tag. I find that very disturbing. I haven’t had a chance to test it out yet, and I try not to go crazy over a feature that I have not personally tested myself, but that’s next week’s project.
Si Biles: Yes. Yes, indeed. And obviously coupled with the conversation we just had about these online forums for this sort of thing, that’s a deeply scary concept. You mentioned that there is a particular thing, a menu item that has that. If you have a guide to how to get to that or can point us at that, we’ll put that in the show notes so that anybody who is listening and is curious and wants to double check is able to do that. I’d be very grateful for that. Thank you.
And I really interrupted you in the middle of saying something else, so I apologize.
Eva Galperin: No worries. The other thing that I was going to say is that Apple gets a lot of flack, but the entire industry of people making Bluetooth-enabled physical trackers is crap. Apple actually has more anti-stalking protections built into the AirTag than, say, Tile, which released a detection app and then, months later, released a way to defeat the detection app by putting your Tile into an anti-theft mode. I reserve a great deal of my ire for Life360, which is the owner of Tile.
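(An aside for technically minded listeners: the trackers discussed here are findable because they broadcast Bluetooth Low Energy advertisements. Below is a rough, illustrative scan using the open-source bleak Python library. It flags advertisements carrying Apple’s manufacturer data with the 0x12 “offline finding” payload type that Find My accessories are reported to broadcast in published reverse-engineering work. It is a sketch, not a vetted detector: real detection apps correlate repeated sightings of the same device over time and distance, which this does not do.)

```python
# Rough BLE scan sketch (not a vetted stalkerware/tracker detector).
# Assumes the open-source 'bleak' library and a host with a BLE adapter.
import asyncio
from bleak import BleakScanner

APPLE_COMPANY_ID = 0x004C    # Apple's Bluetooth SIG company identifier
OFFLINE_FINDING_TYPE = 0x12  # payload type reported for Find My beacons

def on_advertisement(device, adv):
    payload = adv.manufacturer_data.get(APPLE_COMPANY_ID)
    if payload and payload[0] == OFFLINE_FINDING_TYPE:
        # A single sighting means little; persistent sightings while you
        # move between locations are the signal worth investigating.
        print(f"Possible Find My-style beacon: {device.address} RSSI={adv.rssi}")

async def main():
    scanner = BleakScanner(on_advertisement)
    await scanner.start()
    await asyncio.sleep(30)  # listen for 30 seconds
    await scanner.stop()

asyncio.run(main())
```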
Alex Desmond: This is an interesting point, I guess. For people who make technology and software, there’s an ever-growing list of compliance issues and considerations. For developers, this is niche, but a very important issue. How can they better engage? And thinking from Australia, with you both based in the UK and US respectively, in each of those three countries, where would developers go to liaise with someone who knows about this and can help them incorporate these considerations into their tech early on? Then, obviously, there are going to be trade-offs, right? People will abuse technology no matter what. It’s about making it as hard as possible for that to happen.
Emma Pickering: Yeah.
Eva Galperin: I’ve included a link to some design principles.
Alex Desmond: Okay, awesome.
Si Biles: Amazing. Okay, well, that’s fantastic. We’ll share that with the group as well. I think this is possibly more for Emma: how is UK legislation managing to keep up at the moment? My daughter used to work for Rights of Women in the UK, providing legal aid to survivors of domestic abuse and helping out in that regard, and she pointed out that economic abuse has only recently been included as a form of domestic abuse, since 2021.
We’ve also recently had, and I can’t remember exactly what the law is, but the one that’s gone through that says service providers are now supposed to provide decryption and various assorted rubbish like that, which isn’t going to work, but we’ll worry about that later. How well is UK legislation actually managing to keep up? And is it getting better, or is it just sitting stagnant where it isn’t good?
Emma Pickering: It’s not just the legislation. You’ve got the criminal justice system, the police response, training. The whole structural system needs to work with the legislation. The Computer Misuse Act, that’s from 1990. So when offenses fall within that, the police will say that it only really applies to terrorism and big corporate organizations, not intimate partner violence. In terms of economic abuse, there is effectively no support, because if someone’s taken out a number of loans in somebody’s name without their consent, that is fraud. If they report that to the police, they’ll just get told to report it to Action Fraud. You get a crime reference number, and it’s then your responsibility to negotiate with those creditors to try and get debt write-offs, which is an incredibly complicated and very time-consuming process.
Obviously we’ve just had the Online Safety Bill pass, but that’s very much in the early stages, and we’re feeding into Ofcom, the regulator. I imagine there’ll be a number of different versions of that, because tech changes rapidly. I just think that the way tech is evolving and the way the legislation is set, they’re not evolving at the same pace. Also, the police response and their resources are very, very limited. They don’t have the right resources to be able to respond to this kind of crime at all.
Si Biles: Are you working with the police as a group? Obviously, I’m well aware of the work that Refuge does in the UK. You’re a large charity, which again is not a good thing that you need to be, but you’re a large charity doing great work. Are you able to engage with the police? Do they engage with you? Do you provide training to them to help them get it right, and do they liaise with you to help in specific cases?
Emma Pickering: We’ve just designed a training package for the police, and we’ve started to roll that out, specifically with the Met Police; other forces are taking their time to come on board with it. What we really need is for the College of Policing to roll this out, to take it seriously, and to support the police so it becomes a mandatory training component.
Eva Galperin: I actually worked with a state senator in Maryland in the United States to write and pass a law requiring all law enforcement officers to get a mandatory training in how tech-enabled abuse works and how to investigate it. This law passed last year, and so we’ve been seeing the trainings happen this year. And next year, we will look into how effective they have been. If it turns out that they have been effective, then I’m going to try to use this as a model law in other states.
Si Biles: Just in context, for those of us that have very little knowledge about America, how big is Maryland in comparison to other states? Is it a good model for a trial in that regard?
Eva Galperin: Maryland is a good model because Maryland is small. If you’re going to run an experiment, you want to run it in a relatively small state, at a small scale, before you start rolling it out in, say, Texas or California or New York. I think it’s a good place to start, and we will see how it goes.
I am very skeptical of the power of legislation to fix these kinds of problems. People frequently ask me, “What kind of laws do we need to pass in order to stop tech-enabled abuse?” And what I tell them is that there are already laws on the books, and they simply don’t get enforced because of the problems that we have with the justice system and with our system of law enforcement. And this is not going to get fixed with more legislation.
Si Biles: I assume that’s the same for us in the UK: essentially, we have laws that cover various assorted things, like fraud, for example. And the Computer Misuse Act is a bit sucky, but it’s been manipulated into doing a great many things in the past. It’s about the enforcement of it, not about the actual law itself?
Emma Pickering: Exactly, exactly. You’ve got survivors calling the police and stating to them, “This has happened to me, and this falls under this legislation, so can you please go and arrest my perpetrator under this legislation,” because they have no confidence that when they speak to the police, the police are going to understand the legislation. And interestingly, in the UK, only 4% of frontline police officers have had any training related to the so-called revenge porn legislation, so you’ve got 96% of frontline police officers with no clue of what that legislation actually entails. They’re not given the time or the resources to actually go and attend any of the training, or any of the information or tools to be able to do their job properly.
Alex Desmond: Yeah, it’s a balance. I guess they’re struggling with everything that has changed. Technology has exploded, and then all this legislation’s come out, but they still need to do the day-to-day. Do either of you see a way for that to happen? Police, at least in Australia I know, are now having to train on how to handle different types of mental conditions when they respond to a scene. That’s another thing that they have to respond to and get detailed training on. Is there a way you can see this working with police, or maybe with other support organizations that could alleviate it a little? Do you see a solution that is possible? Or is it just a case of chipping away at the very large problem that we have across a broad range of issues?
It’s a very big question. It’s like, “Solve the issue.”
Emma Pickering: I’m trying to think of a way to answer it that isn’t completely negative. I’m really struggling.
Alex Desmond: I think it’s okay to be. It could just be, “Hey, we just need to keep chipping away at it.” It is a big issue amongst many other issues that we are facing, particularly dealing with individuals.
Eva Galperin: There’s also just not one solution. If there was a button that we could press, if there was a lever that we could pull, that would solve this problem, we would’ve done it already. It’s a matter of pushing in a lot of different places at once. And certainly, one of the elements where we feel that we can make a big difference is in the training of law enforcement officers.
Having said that, I think that there are limits to the effectiveness of that training. And one of those limits is the extent to which law enforcement officers are frequently perpetrators of abuse.
Si Biles: Yes.
Eva Galperin: So when they see abuse cases, they’re more likely to empathize with the abuser than they are with the survivor, and I think that that is a very serious problem that I am not able to solve from here.
Emma Pickering: Agreed. We have a similar problem as well. You can see the most recent report around the Met Police in particular, on how many frontline police officers are still serving when they’ve got very serious reports against them of sexual and domestic abuse. There is that challenge as well, of misidentification. I’m obviously in Australia at the moment doing research, and that seems to be a theme both here in Australia and in the UK: misidentification of who the victim is and who the perpetrator is. Huge, huge concerns there as well.
Alex Desmond: Are there many support organizations similar to EFF and Refuge in Australia?
Emma Pickering: Yes. You have WESNET. They’re based in Bendigo, in Australia, and they support survivors of tech-facilitated abuse as well.
Alex Desmond: Okay.
Si Biles: This is an interesting, potentially controversial question, and I apologize in advance if it is. Well, it is. I know it is.
One of the issues that I’ve certainly seen come to the fore in the UK is that the police are seeking to take digital forensic images of both the offender’s and the victim’s devices. And quite often, the victims are feeling very much worse for wear at the end of everything and don’t wish their privacy to be invaded any further. But in the case of tech-enabled abuse, it must be pretty critical to a prosecution to have a full image of a victim’s device. Would you agree, disagree, have a viewpoint on it?
Emma Pickering: I personally, and Eva may have a different view, but I personally disagree, because I don’t think that the information proving that the perpetrator monitored, tracked and surveilled that person sits on the victim’s device. It sits on the perpetrator’s device. That’s where you will find the evidence. It needs to be obtained from his device, not the victim’s. Because, quite often, by the time they hand that over, he’s going to have removed everything.
He’s going to have wiped everything. The evidence isn’t there. It sits somewhere for six months, and then, by the time they go to analyze it, any evidence that he was perpetrating against her is removed, so it looks like it didn’t exist. They need to take his devices. And also, perpetrators are very crafty. They will hand over their work devices, and then work will become involved and say, “We need those devices back.”
Si Biles: Eva?
Eva Galperin: It really depends on the case, but for the most part, yes, the evidence is on the perpetrator’s device, and not necessarily on the survivor’s device. There may be additional evidence on the survivor’s device, but if as part of the investigation, law enforcement feels that that is necessary in order to build the case, they can go and make that argument to a judge and get a warrant.
Si Biles: No, that’s fair. I think there’s a couple of things that I have left to ask, one of which is, as a forensic examiner, in cases that are being put to me, what would you recommend that I look for as my first and perhaps subsequent ports of call in order to find good solid evidence of tech-based abuse, tech-based control? What sort of things should I be looking out for?
Eva Galperin: It depends on what kind of device you’re looking at.
Si Biles: Yep.
Eva Galperin: And what kind of tech-based abuse you’re looking at. But certainly, you’re going to look for misconfigurations, and you’re going to look for exfiltration of data. Any application that is designed to hide from the user, to just not show up in the dock or on a list of applications. Anything that is exfiltrating data about location, messages, contents of messages, contacts, or passwords that shouldn’t be doing so. Those are generally the kinds of things that I look for.
And in that sense, it’s not that different from tracking an APT, because these are people who are looking for pretty much the same kind of stuff that an APT finds useful once they have gotten onto a device or into a network.
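(To make that checklist slightly more concrete for examiners triaging an Android handset, here is a minimal sketch. It assumes a device reachable over adb with USB debugging enabled; the package watchlist is a hypothetical stand-in, and a real investigation would rely on curated stalkerware indicator lists from vetted sources rather than this illustrative set.)

```python
# Minimal Android triage sketch (illustrative only, not a complete scanner).
# Assumes 'adb' is on PATH and a device is connected with USB debugging on.
import subprocess

# Hypothetical package names standing in for a curated indicator feed.
WATCHLIST = {"com.example.hiddenmonitor", "com.example.parentspy"}

def adb(*args: str) -> str:
    """Run an adb command and return its stdout as text."""
    return subprocess.run(["adb", *args], capture_output=True, text=True).stdout

def installed_packages() -> set[str]:
    # 'pm list packages' prints lines like 'package:com.android.chrome'
    lines = adb("shell", "pm", "list", "packages").splitlines()
    return {line.removeprefix("package:").strip() for line in lines if line}

if __name__ == "__main__":
    print("Watchlist matches:", installed_packages() & WATCHLIST or "none")
    # Stalkerware frequently abuses accessibility services to read screens
    # and messages; anything unexpected here deserves scrutiny.
    print("Accessibility services:",
          adb("shell", "settings", "get", "secure",
              "enabled_accessibility_services").strip())
    # Device/profile owners: stalkerware sometimes registers as a device
    # admin to resist uninstallation.
    print(adb("shell", "dpm", "list-owners").strip())
```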
Si Biles: And in a slightly broader sense, what can we do as decent human beings to try and improve the situation with regards to domestic abuse?
I’m going to say, you’ve opened my eyes to a couple of things that I wasn’t aware of. And as I said, my daughter worked for a domestic abuse charity in the UK, so I’m not entirely oblivious to some of the concepts, and yet I’m still having my eyes opened, which is deeply worrying. What more can we do as a society, as individuals? Obviously we don’t want this to continue. What is it? Can we write to Apple and ask them to stop being quite so frivolous with their technology? Can we appeal to our politicians and donate money to Refuge? What is the best way that we can start, as a society or as individuals, to help address these issues?
Emma Pickering: For us in the UK, it’s not really taken particularly seriously. We’re the only UK charity directly supporting survivors where there are tech concerns. We’re a small team of 11 working nationally across the UK, and we get calls from outside the UK because people are really desperate for support, and we have no government funding. A lot of my time goes on trying to find partnerships and funding opportunities, when really we should be focusing our time and resources on survivors. Our time is sometimes taken away from that priority.
And then there’s putting the pressure on tech developers as well to make sure that they’re designing products with safety in mind: contacting and speaking to agencies such as the Coalition Against Stalkerware, Refuge, WESNET and Safety Net, asking about trends and themes, and looking at the reports, including annual reports, that we provide as well, to see what’s happening where tech-enabled abuse is concerned.
Eva Galperin: And I work for a digital civil liberties nonprofit. I would be remiss if I did not point out that you can give money to the Electronic Frontier Foundation at www.eff.org. We are an international organization. We work on problems all over the world, because the internet is global.
Having said that, I think that the most useful thing that men can do is to, well, for one thing, to believe survivors when survivors come to them. To take their concerns seriously.
And also not to let your peers get away with this kind of abusive behavior, to make it clear that this kind of behavior is not all right when you see it. Because, honestly, it’s not uncommon for people to track their partners and to tell their friends about it, to tell strangers about it, to tell their workmates about it. They tell me about it. And this is a thing that I specialize in opposing.
The most frequent kind of conversation that I have is usually with someone who tells me that they needed to install stalkerware on their partner’s device because, they claim, the partner was abusing them: they needed to get evidence that the partner was cheating, or wasn’t where they said they were, or was sending messages to somebody they said they weren’t talking to. And that, in and of itself, is abuse. It’s really important for us to step in among our peers when we see this kind of thing and to speak up and say that that’s not okay.
Si Biles: Absolutely, absolutely. I’m going to say, normally we would close out with something a little more lighthearted, but I feel slightly off doing it today. I think I’ll go with a slightly more serious, but still ending question of what are you going to be doing next in terms of research or in terms of, I mean, technology or social? What’s on the cards next for each of you?
Emma Pickering: Eva?
Eva Galperin: Well, the next things on my plate mostly have to do with research around Bluetooth-enabled location trackers. I’ve got some research that will be coming out in the next month or two, and some additional research that I need to work on this week: mostly how these things work, how the detection environment is doing, and what constitutes an effective mitigation for Bluetooth-enabled tracking, because it’s a problem that’s really growing exponentially. So far, the mitigations that we have pushed forward don’t seem to have been very effective.
Emma Pickering: For me, the Churchill Fellowship: looking at digital forensics across different countries and continents, different responses, best practice, and the challenges and barriers for survivors reporting digital offenses. And then the thesis. So, coding and analysis at the moment, and looking at trends within domestic homicides and serious case reviews. Then we’re doing a lot of work in the team as well around neurodiverse clients and survivors and tech-facilitated abuse, and around Black women’s experiences of tech abuse as well.
Si Biles: Sorry, you’ve interestingly opened a little door there that I am very curious about. You said neurodiverse survivors are a particular area of study. Is that purely because there’s not a well-defined body of research around it? Or is there a particularly significant statistical issue, that more neurodiverse people are victims, that is the cause for that?
Emma Pickering: Well, we don’t know. That’s the problem. There is a lack of research. Everyone in my team has a particular specialism, based on their lived experience or areas of interest; no one’s given a specialism they have no association with. On neurodiversity, we have people within the team looking at assistive technology who are disabled users themselves, so we are looking at different forms of tech and how the intersectionality of technology plays a part.
Si Biles: Okay, that sounds really interesting. I’ll be very… And in terms of how you’re publishing these, are they going to be published as academic papers?
Emma Pickering: The Churchill Fellowship? No, it’ll just be put out there in the public domain when I’ve completed it in the spring of next year. Then the thesis, yes, will be an academic paper.
Si Biles: All right. Well, fantastic. I look forward to seeing both the research and the paper at the end of it.
Emma Pickering: Thank you.
Si Biles: Thank you very much. Well, I’d just like to say thank you very much for coming on and talking to us. I appreciate… Emma, thank you for getting up in the morning. Eva, you’ve had it easy. I’m grateful that you came, but it’s one o’clock in the afternoon. I’m not going to get too grateful.
Just to say, again, thank you very much. It’s obviously something that is an incredibly important topic that we need to discuss and we need to address as an issue, and to have two such experts on is actually a real privilege for us. We do hope that we’ll have you on again in future to discuss the outcomes of your research and the outcomes of your experiments in US states to see whether that does actually make a difference or not. I am fascinated to find out, I must admit.
Just to let the listeners know that the Forensic Focus podcast is available on YouTube, Spotify. Desi will remind me of all of the other places.
Alex Desmond: Our website.
Si Biles: Our website, yeah, that’s one of them. And Apple Podcasts, and anywhere you can find a reasonably competent podcasting platform, we will probably be there.
Transcripts are available. We will make efforts to make sure that there are links in the show notes to everything that we’ve talked about: Refuge, EFF, and any of the other charities that are dealing with domestic abuse, including Rights of Women and any others that we can come up with. So if you do need to get in touch with someone who can help you out, please do contact one of the charities that we will put in the links.
But then, that just leaves me to say, again, thank you very much for your time. It’s very much appreciated, and we will close there. Thank you.
Emma Pickering: Thank you.
Alex Desmond: Thanks all.
Eva Galperin: Thank you.