Sophie Mortimer, Revenge Porn Helpline Manager, SWGfL

The Revenge Porn Helpline is a UK service supporting victims of intimate image abuse—to find out more, visit revengepornhelpline.org.uk.

What led you to join the Revenge Porn Helpline, and what does a typical working day look like for you?

I’ve worked at the Helpline for nearly nine years. When I first saw a job available at something called the Revenge Porn Helpline, I was instantly intrigued. As I looked into it, I was fascinated by this evolution of abuse online and the accompanying criminal law response.

Not sure there is such a thing as a typical day here! But my first priority is always to ensure that the Helpline is up and running smoothly to offer advice and support to people affected by intimate image abuse. That depends on the amazing team of practitioners here, so ensuring that everything is good with them is really important.

Beyond that, StopNCII.org (our preventative hashing tool) is a big part of my working day: working with our NGO partners around the world, supporting them as they support people in their own communities, as well as planning further development of the tool. I speak at conferences and events, engage with governments and policy makers, speak to press and media contacts…anything and everything!
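To illustrate the idea behind a preventative hashing tool: as StopNCII.org describes publicly, the intimate image never leaves the person's own device; only a hash (a digital fingerprint) is generated locally and shared with participating platforms, which can then match and block attempts to upload the same content. The sketch below is purely conceptual, with hypothetical function and case names, and uses a plain cryptographic hash to stay self-contained; the production system relies on perceptual hashing so that resized or lightly edited copies still match.

```python
# Conceptual sketch of hash-based NCII prevention (illustrative only).
# The image is hashed on the person's own device; only the hash is shared.
# Real systems use perceptual hashing so near-duplicates still match;
# SHA-256 is used here purely to keep the example runnable.

import hashlib
from pathlib import Path


def hash_image_locally(image_path: str) -> str:
    """Compute a fingerprint of the image without it leaving the device."""
    data = Path(image_path).read_bytes()
    return hashlib.sha256(data).hexdigest()


def submit_hash(case_id: str, image_hash: str) -> dict:
    """Hypothetical submission: only the hash is sent to the matching service."""
    return {"case": case_id, "hash": image_hash, "image_uploaded": False}


def platform_check(upload_bytes: bytes, known_hashes: set[str]) -> bool:
    """A participating platform compares an incoming upload against known hashes."""
    return hashlib.sha256(upload_bytes).hexdigest() in known_hashes


if __name__ == "__main__":
    # On the person's device: create the fingerprint and submit only that.
    fingerprint = hash_image_locally("private_photo.jpg")
    record = submit_hash("case-001", fingerprint)

    # On a platform: block re-uploads whose hash matches a submitted fingerprint.
    blocked = platform_check(Path("private_photo.jpg").read_bytes(), {record["hash"]})
    print("upload blocked:", blocked)
```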

What are the biggest challenges victims face when they report image-based abuse to law enforcement?

There is a lot of confusion within the police when it comes to investigating and prosecuting online offending. Clients report to us that the officers and call handlers they have spoken to are unclear about what the law actually says and don’t understand how to collect digital evidence of these sorts of crimes.

There is also an understandable barrier that stops many people affected by this form of abuse from reporting it to the police: it requires a victim to talk about incredibly personal experiences, and to show incredibly personal and intimate images, to a stranger, quite possibly a male police officer, with no knowledge of who else might see them. The majority of people who come to us simply don’t want to report to the police: they want their online images removed and to forget that it ever happened.

Can you describe the emotional and psychological trauma caused by intimate image abuse?

Victims and survivors report a huge range of impacts from intimate image abuse, reaching into every corner of their lives. Primarily, there is fear of people they know seeing the images and the impacts that flow from that: feeling shamed and humiliated in front of family, friends and peers, work colleagues and acquaintances.

People are convinced that if their images are online and people can see them, then everyone has seen them: that any person who looks at them in the street or in a shop has seen their pictures online. This is an incredibly debilitating burden for someone to carry. It causes depression, anxiety, withdrawal from daily life and, in extremis, suicidal ideation.

Messaging apps and closed forums are increasingly being used to trade explicit images. What challenges does this present for digital forensic investigators?

Abusers will continually find ways and means to evade detection and accountability. In practice, this means that we have seen many forums where we know NCII content is frequently shared disappear behind logins and paywalls, making it much harder to locate and report for removal. At the same time, there is a rise in encrypted messaging services and an emphasis on user privacy that presents a conflict between people’s rights to private spaces and the imperative of keeping people safe from abuse and harm. We are nowhere near squaring this circle, but have to hope that as the nature of the abuse evolves, services like ours can evolve too.

What are the consequences of offenders being allowed to keep or regain access to their devices?

The return of devices to offenders is a huge injustice to victims of intimate image abuse. We have had multiple reports from clients that, even after legal proceedings have concluded with a conviction, devices still retaining intimate content have been returned. This cannot be right, and it reflects a lack of understanding throughout the criminal justice system about how this content can continue to spread online and the harm it causes to victims.

Courts have the power, via deprivation orders, to take devices away from offenders, but these orders are simply not being used. And, in many cases, the device is only part of the problem: if images have been shared online, then removing a device doesn’t prevent an offender from re-accessing the content. The harm that recirculating content causes victims is ongoing, debilitating and traumatising: there is no end.

How is AI changing the landscape of image-based abuse cases?

AI technologies are moving so quickly that we are not so much seeing individual changes as experiencing a continuum of change. Most of us are familiar with the rising tide of synthetic sexual images (often called “deepfakes”), but we don’t see that many reports coming to the Helpline, mainly because, we think, the vast majority of victims are simply unaware that these images exist.

The behavioural drivers seem to be different: more about being able to create bespoke images, consume them and share them peer-to-peer than about causing direct harm (though that does happen). But just because a woman might not know that such images were created doesn’t mean that they don’t cause harm or that it is OK to create them. And these abuses are evolving fast, so it is vital that our support, regulatory and legislative responses evolve at pace too.

We should also bear in mind that we hope AI technologies will assist in searching for and removing such content. With so many millions of pieces of content uploaded to the internet every minute of every day, we will need these technologies to become ever more sophisticated if we hope to retain any vestige of control over what is available in a couple of clicks.

How does inconsistent data collection across police forces affect outcomes for victims, and what types of data are most urgently needed to improve responses?

The UK has 43 separate police forces, all of which collect data and develop training separately. While the College of Policing provides resources that are intended to bring consistency, it doesn’t seem to work like that in practice. We depend on data to understand trends, changes in perpetrator behaviour, victim engagement with the criminal justice system and where the points of failure are in the system. But without consistent data collection, it is impossible to get a clear picture. Without that, we cannot develop appropriate responses, and that fails every victim.

You’ve described this as a “really dangerous time for women.” Should tech companies be doing more to tackle intimate image-based abuse?

I absolutely do think this is a very dangerous time for women, and of course platforms should do more. But not just platforms; the issue isn’t solely theirs to solve. The internet operates within an ecosystem of platforms, hosts, registrars, multiple regulators and even more legislative frameworks. All of these elements need to work together to create a safer online environment for everyone, but particularly for women, who are disproportionately affected, and even more so for Black and minoritised women, who fare the worst online.

The demonising of online platforms makes me uncomfortable. They can’t fix this alone: tech companies can’t stop men from wanting to be abusive to women on their platforms. Yes, they have a part to play, but what is needed is a collaborative effort across all stakeholders and jurisdictions to build consistent responses. Our platform StopNCII.org is a great example of what can be achieved when platform stakeholders work together. Coupled with awareness-raising and educational initiatives, perhaps we can build the culture change that is not just needed but absolutely necessary.
