Christa: Stalking and domestic violence affect huge proportions of people. The Centers for Disease Control and Prevention estimate that about 7.5 million people are stalked in the United States each year, with about 61% of female victims and 44% of male victims experiencing stalking by current or former intimate partners. Much of that occurs via technology, whose ubiquity, ease of use and access, and relative cost effectiveness make it easy for abusers to spy on, track, and harass their targets. With us today to talk about that is Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation and a co-founder of the organization Coalition Against Stalkerware. I’m your podcast host, Christa Miller. Welcome, Eva.
Eva: Hi there.
Christa: It’s good to have you.
Eva: It’s lovely to be here.
Christa: Good. I’m going to start off with a question about the Coalition Against Stalkerware. It was formed in November 2019, just a few months before the hashtag #DFIRforGood was coined, to facilitate communication between advocacy groups and the security community. What are some of the organization’s biggest accomplishments since then?
Eva: Well, since then we have accomplished most of our original goals, which we’re really proud of. We set out to increase awareness about the existence of stalkerware and how it works, to put out some educational and training materials, and also to create a sort of working definition for stalkerware that could be used by professionals across the industry, and also to facilitate the sharing of data including stalkerware samples across AV companies.
Christa: So in your TED talk, you described starting this project by way of a tweet in which you offered to assess victims’ devices for signs of compromise, and then your mailbox, I think you said, ended up filled. For how long after that tweet?
Eva: I still get requests.
Christa: Okay. You described also in that talk various forms of compromise: physical, account, remote access. How many devices and accounts have you ended up assessing over this period of time?
Eva: I have no idea. I stopped keeping track years ago.
Christa: That’s — I’m laughing, but it’s not funny, is it? It’s actually a really scary sign of the times. Another disturbing aspect of your talk was the companies that market their product to abusers. I think that says something about those developers and the fact that tech is not neutral. When we in the community talk about balancing hard technical skills with soft people skills, especially when it comes to career mapping, does this also factor into advocacy? Into recognizing and talking about abuse dynamics where technology is a factor?
Eva: Well, certainly I think it’s very important when you are building a product or building security into a product, or even deciding what it is you’re going to do, that you should take the abuse case into account. When you are building something, you should think about how it is going to work in order to harass people, and then how it is going to be used in the context of an abusive relationship.
Because frequently when security people think about a secure account or a secure object, they think about something which is in the person’s possession. And they think: if you have the object and you also have the password, then clearly you have legitimate access to the platform, or you have legitimate access to the account.
And it’s very common in abusive relationships for the abuser to both have physical access to the device, and also, in some way, have access to the passwords to an account or a device — either because they have shoulder surfed them, or because they have coerced their victim into giving them up.
Christa: So on that note, I have a two part question. To what extent does understanding those dynamics, those abuse dynamics, help digital forensics professionals to know what kinds of artifacts they should be looking for? But then also when we’re talking about bringing abusers to justice, how do those professionals also guard against their understanding of dynamics leading potentially to bias in an examination?
Eva: Well, to be honest, most cases of abuse very rarely even make it in front of a digital forensics professional. It almost never gets there. For one thing, in order to have a digital forensics professional examine your device, you need a lot of money. People don’t usually do this work for free. You also need a certain amount of freedom and free time, which are things that victims of really prolonged and intense abuse do not have. So the chances that you will see this stuff come in front of you are not zero, but you will only see a very, very tiny fraction of the abuse.
Furthermore, most of the abuse that surfaces does not ever end up in a court case. It usually doesn’t get reported to the police. Even if it does get reported to the police, the police tend to gaslight whoever is doing the reporting; they don’t know how to recognize the signs that stalkerware is present, and much of the time they don’t understand how account compromise works.
And it’s essentially a crapshoot whether or not you get someone in your local police department who has any idea of what you’re talking about or who simply dismisses you and says, you know, “Come back when your partner has hit you so I can take some pictures,” which is absolutely a thing that I have heard from victims of abuse. Which is, essentially: “There is nothing we can do until there is violence and you can show us evidence of violence.”
Sometimes they will say, “Well, this is clearly a problem for our computer crimes department.” And then the computer crimes department says, “We really don’t do anything which is not financial. We are primarily… we have a lot of crime, and we really focus on crimes in which there’s been some sort of a financial component.” So victims of abuse get a tremendous runaround. And often the court system and the law enforcement system fail them so badly that it is not worth trying. And it does not come up in the vast majority of the cases.
Most of the cases that come to me are cases in which someone is essentially trying to untangle the ways in which they are being abused and the ways in which they are being surveilled. They know that they have an abuser. They know who the abuser is. They’re not even necessarily looking for proof that they are being abused because what are they going to do? Take it to a court of law? No. Usually they’re just looking for safety and peace of mind.
And so often what I do is I will simply go through their device and talk to them about their operational security and the things that they do every day and try to get some idea of what information is leaking and where it is leaking from and how they can lock that down without completely abandoning the idea of living their lives. And that is a very involved process that involves surprisingly little digital forensics.
Christa: Interesting. Through my prior career as a marketer, I’m used to dealing with forensics examiners that are “in the know” about, if not abuse dynamics, then certainly in a position to be able to train law enforcement. But it sounds like there’s a much bigger, deeper issue in play here, which is that not everybody has those capabilities or that understanding.
Eva: Not everybody has good experiences with law enforcement either.
Christa: Sure, sure. Yep. Absolutely. So in your TED talk, also, you mentioned having to convince antivirus companies to recognize and flag stalkerware as malicious. And then Apple introduced AirTags, nearly a year ago at this point, to help users track small personal items. AirTags were not the first tracking technology, and Apple did start to introduce some safeguards into its products to mitigate the threat from abusers treating their targets like property. How do you see the field of personal surveillance technology continuing to evolve from here?
Eva: Well, as usual, it’s a cat and mouse game. So you get some products that are doing a better job, that are locking things down better, that are taking the abuse case into account. And then you see abusers finding ways to circumvent it. And you see companies debuting products where they have clearly not given enough thought to the abuse case.
I will credit Apple with one thing, which is that when the criticism surfaced, they worked very quickly to implement as many of the mitigations as they could, as quickly as possible. We still don’t have mitigations for Android that are equal to the kind of mitigations that you get on an iPhone, but that is a thing that requires a level of cooperation between the Android team and Apple that is tricky to get politically.
Christa: I can only imagine.
Eva: So I don’t get to look surprised when that doesn’t happen overnight. You know, I get what I want by being very careful about what I ask for. Having said that, I think that we are going to see more personal trackers and more physical trackers. And we are going to see them attempting to use larger and larger networks in order to increase their effectiveness.
And as they become more effective, I am more likely to be very angry at them for not having any kind of mitigations against stalking. The reason why Apple was really the target of my ire was because they had turned every device with “Find My” on it, every iPhone across the world, automatically into part of the network which tracks the location of your AirTags, without any kind of opt-in. And that means it’s a much more effective device for finding your item when it goes missing, but also for finding people when you are attempting to stalk them.
And, you know, with greater power comes greater responsibility, and other products such as Chipolo and Tile did not look at this situation and say “Whew, we really dodged a bullet there.” What they said was, “Oh, man, we’re gonna need a bigger network,” with no indication that they’re building any kind of mitigations against the use of their products as methods of stalking and abuse. So I’m pretty concerned about that. I definitely think that that’s the way that things are going, and that I’m going to have to be doing a lot more yelling in the future.
Christa: Why do you think that so many tech companies are resistant, I guess, to this idea that there are actual people being harmed by this technology? Is it just something that is… is it the profit that’s more important to them, or is there something else going on there?
Eva: Yeah. There’s no profit in protecting people. And they tend to hand wave the extent of the harms. Companies don’t like to think of themselves as being responsible for harms. And often, if you cannot argue that they are legally responsible, suddenly they will stop listening because, you know, it’s not costing them money to be responsible for these harms.
But I think that it is perhaps unfair to attribute this behavior to malice or to a pure profit motive. This is also partially because we have done such a terrible job in this society of talking about abuse and how prevalent it is, and how prevalent tech-enabled abuse and stalking are. The number of people who have experienced tech-enabled abuse in their abusive relationships is definitely in the hundreds of thousands to millions.
It is, in fact, an extremely common dynamic, and the problem with companies that are making products right now is that they would rather think of this as sort of an exception to the rule, as some sort of edge case. And it’s not an edge case; it’s actually much closer to the center than anybody wants to admit.
Christa: Yeah, it’s striking me as you’re talking about the gap, for instance, between the patrol officers that are taking an initial report and the computer crimes investigators that are interested only in, or mostly in, financial crimes cases. Again, in my experience, I’m used to working with internet crimes against children investigators and prosecutors. And so it seems like there’s a gap (I keep using the word “gap”) in between the child crimes investigators and the financial crimes investigators where this whole host of adult abuse victims is falling through.
Eva: Absolutely. And we also face a culture in which this kind of abuse is normalized and not even recognized as abuse. The Coalition Against Stalkerware funded a poll late last year of like 20,000 respondents in I think something like 18 countries. And what they found was that while 70 percent of respondents said that installing software covertly on their partner’s device was never okay, 30 percent of them thought, “Oh, no, there are times when this is fine.” And of those 30 percent, 64 percent said it was absolutely justified if you felt that the partner was cheating on you.
Stalkers will just, you know, straight up tell people that they’re stalkers, if you ask them the right way.
Christa: Well, yeah. Because if they don’t think that there’s anything wrong with it, then what have they got to hide, right?
Eva: Exactly. And it’s the same with rapists. It’s been found in poll after poll, if you just ask, “Have you ever had sex with a person when they were unable to consent or when they were drunk or when they were asleep or have you ever gotten somebody drunk in order to make them more likely to consent to sex?” The rapists will just straight up tell you, “Sure. Yeah. This is fine.”
Christa: Yeah. It’s that sense of entitlement, I guess, that goes along with the abuse mindset.
Eva: Absolutely. And because there are so many people who don’t even think that this is wrong, it’s very difficult to mobilize people against it.
Christa: Yeah. Yes, yes. Yeah. So in this landscape, I’m going back to the #DFIRforGood hashtag and the digital forensics examiners that — they may or may not be a hundred percent cognizant of this particular aspect, but they want to do some good, right? And it also seems that with data volumes and caseloads being what they are, digital forensics examiners really have a lot of skin in this, and it’s not a game.
So arguably it would benefit them, not just to know how these dynamics work and what to look for, but also to add their skills and voices to existing advocacy efforts as we’re discussing. How can they get involved either locally or more broadly, and what specific skills are needed?
Eva: Well, there are a couple of things that I recommend. For someone whose technical skills are already quite advanced, I recommend starting with reading a couple of books about trauma and about the dynamics of abuse. I think that one of the biggest problems that we have is that so many of us have technical skills and we want to help, but if you don’t understand the dynamics of abuse and you don’t understand how trauma works, there is a very good chance that you will end up doing more harm than good.
Christa: In what way?
Eva: In the sense that you will become very frustrated trying to deal with this person that you are ostensibly trying to help. You may end up gaslighting them, you may end up not taking their concerns seriously enough. And essentially frustrating them, causing them to go away and remain in danger.
So what I recommend starting with is a book called The Body Keeps the Score, which is a book about a social history of trauma. And the other book that I recommend is Why Does He Do That? Inside the Minds of Angry Men.
One caveat about Why Does He Do That? is that it’s very heteronormative. And it really paints a picture of abuse as a thing that men do to women in heterosexual relationships. And I don’t want the people who are doing this work to limit themselves to that vision of what abuse looks like, because of the people who come to me with their stories of abuse, I would say approximately two thirds are women, but a third are men.
And there’s a tremendous stigma against men coming forward with concerns about abuse or standing up to abuse, or even admitting that they’re being abused. And so we need to help everybody, not just women in heterosexual relationships, even though they are the strict majority of the kinds of cases that we see.
So those are two books that I very strongly recommend reading. There’s also a book called Helping Her Get Free that is, again, very heteronormative, very “men are the abusers and women are the victims,” but if you can bring yourself to see past that framing, the content is extremely useful on how to be that source of safety for a person who is going through a very difficult time.
One of the things that I mentioned in my TED talk that is especially hard about doing this kind of work is you need to be willing to see the person who is coming to you for help go back to their abuser. Sometimes more than once. Sometimes just handing over information to them. Sometimes just admitting everything they’ve been doing in order to try to get away. Apologizing, reconciling, and this may happen more than once, and you need to have patience for it. And if you don’t have patience for that, this is possibly not the right work for you. It is extremely emotionally difficult and trying work.
I feel like all I’m doing is painting it in the worst possible light. Like, “Don’t. Don’t do this, don’t do this.” But really, yes, we need all the help we can get, precisely because it’s so hard. And many hands make light work. When we have more people who are doing this work, we burn fewer people out.
And prior to this, I had mostly been doing work supporting journalists and activists largely throughout North Africa and the Middle East during the Arab Spring. And in some ways it was a very similar experience. You had to be willing to watch very, very bad things happen to the people that you’re trying to help. And often in circumstances where it is entirely outside of your control. And that can be really traumatic. So watch out for that. But it’s also very satisfying.
The most satisfying aspects of my work are when I can bring a victim or survivor of abuse the peace of mind that they need in order to go on with their lives. The knowledge that their abuser is not omniscient. They’re not omnipotent. They’re not a god, they’re just a sad little cowardly bully, and you are safe from them now because you have done these things. And that is a really powerful feeling. And I encourage people to pursue that. It’s definitely the thing that keeps me going.
Christa: That’s hard to hear. It’s hard to listen to that, but I can only imagine how much more difficult it is to witness, but also the rewards that go along with it. So Eva, thank you again for joining us on the Forensic Focus podcast.
Eva: Thank you so much for having me.
Christa: Thanks also to our listeners. You’ll be able to find this recording and transcription along with more articles, information and forums at www.forensicfocus.com. Stay safe and well.