Why are we not doing more about deepfakes and the online abuse of women and girls?

“Nudify” apps are turning underage girls, as well as celebrities, into victims of fake porn images. Not enough is being done to hold tech giants accountable.

With a single good image of a person’s face, it is now possible to make a 60-second sex video of that person in just half an hour. ILLUSTRATION: NYTIMES

Alarms are blaring about artificial intelligence (AI) deepfakes that manipulate voters, like the robocall sounding like President Joe Biden that went to New Hampshire households, or the fake video of Taylor Swift endorsing Donald Trump.

Yet there’s a far bigger problem with deepfakes that we haven’t paid enough attention to: deepfake nude videos and photos that humiliate celebrities and unknown children alike. One recent study found that 98 per cent of deepfake videos online were pornographic, and that 99 per cent of those targeted were women or girls.

