Congress ‘Moving Too Slowly’ on Fake Child Sexual Abuse Material

Fake images have been in the news recently due to the Taylor Swift case, in which tools such as Microsoft’s text-to-image generator Designer were exploited by a Telegram group known for posting abusive images of women online. Fake images of the pop star engaged in sexual activity proliferated online, particularly on X. The platform has attempted to crack down on this kind of activity, blocking searches for the star’s name, but appears to have difficulty identifying fake imagery. Digital forensic experts have warned that pornography is one of the most common uses of deepfake technology, and that this sort of abuse is likely to ramp up as part of political disinformation campaigns in the coming year. But it is not only non-consensual imagery of adults that poses problems; fake child sexual abuse material (CSAM) is also increasing in volume.

Appeals to Congress

Despite appeals from Attorneys General for Congress to take action on the ‘flood’ of fake CSAM images proliferating across the internet, the US government appears to have been slow to respond. Only a handful of states have addressed the problem – Georgia, Hawaii, Texas and Virginia have criminalised non-consensual deepfake porn, while California and Illinois have given victims the right to sue. But the legal grey area in which much of AI still operates is hampering efforts to bring in effective measures, and the problem is growing.

Challenges Associated With Deepfake Pornography

The sheer volume of fake pornographic material has multiple effects. For one thing, it consumes human and digital resources as investigators try to sift real images from fake ones. Such material also normalises child pornography and can be used to lure children into Child Sexual Exploitation (CSE). Social media platforms say that they cannot monitor CSE or CSAM adequately, while law enforcement agencies report that AI methods of detecting CSAM don’t always produce ‘viable’ results. End-to-end encryption makes the investigative task harder still.
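
To see why a flood of newly generated images strains investigators, consider the highly simplified triage sketch in Python below. Production tooling relies on curated hash lists and perceptual hashing (PhotoDNA-style matching) rather than the exact cryptographic hashes shown here, and the names KNOWN_HASHES and triage are purely illustrative. The point is that AI-generated images are novel: they never match an existing hash list, so every one of them lands in the manual review queue.

```python
import hashlib
from pathlib import Path

# Placeholder hash list standing in for a curated database of previously
# identified material. The value below is a dummy, not real data.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def triage(paths: list[Path]) -> dict[str, list[Path]]:
    """Split seized files into known matches and unknowns needing human review."""
    result = {"known": [], "needs_review": []}
    for p in paths:
        bucket = "known" if sha256_of(p) in KNOWN_HASHES else "needs_review"
        result[bucket].append(p)
    return result
```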

‘Big Tech and the Online Child Sexual Exploitation Crisis’

The CEOs of the main social media platforms were called to account in late January 2024, when they were summoned to appear before the US Senate Judiciary Committee hearing on ‘Big Tech and the Online Child Sexual Exploitation Crisis’. Linda Yaccarino (X), Mark Zuckerberg (Meta), Shou Chew (TikTok) and Jason Citron (Discord) were among the witnesses invited to comment on the appearance of CSAM on their platforms, although some had to be subpoenaed.

Sen. Dick Durbin (D-IL) described online CSE as ‘a crisis in America,’ commenting that the National Center for Missing and Exploited Children (NCMEC) received over 100,000 CyberTipline reports a day relating to CSAM in 2023. Durbin also pointed out that Congress itself had to shoulder some of the blame, having passed Section 230 of the Communications Decency Act in 1996, which immunised emerging internet platforms from liability for user-generated content.


He told the hearing that legislation such as the proposed STOP CSAM Act was necessary, and Sen. Lindsey Graham (R-SC) responded that Section 230 of the Communications Decency Act should also be repealed. Yet legislation moves slowly, and in the meantime underfunded law enforcement agencies say they are struggling with a deluge of toxic fake content.
