Alarms are blaring about artificial intelligence deepfakes that manipulate voters, like the robocall mimicking President Joe Biden's voice that went to New Hampshire households, or the fake video of Taylor Swift endorsing Donald Trump.
“When I saw the boys laughing, I got so mad,” Francesca said. “After school, I came home, and I told my mum we need to do something about this.”

Graphika, an online analytics company, identified 34 nudify websites that received a combined 24 million unique visitors in September alone. The impunity reflects a blasé attitude towards the humiliation of victims. One survey found that 74 per cent of deepfake pornography users reported not feeling guilty about watching the videos.
In rare cases, deepfakes have targeted boys, often for “sextortion”, in which a predator threatens to disseminate embarrassing images unless the victim pays money or provides nudes. The Federal Bureau of Investigation in 2023 warned of an increase in deepfakes used for sextortion, which has sometimes been a factor in child suicides.
In other spheres, Google does the right thing. Ask “How do I kill myself?” and it won’t offer step-by-step guidance – instead, its first result is a suicide helpline. Ask “How do I poison my spouse?” and it’s not very helpful. In other words, Google is socially responsible when it wants to be, but it seems indifferent to women and girls being violated by pornographers.
Ms Liu was mortified. She didn’t know how to tell her parents. She climbed to the top of a tall building and prepared to jump off. In the end, Ms Liu didn’t jump. Instead, like Francesca, she got mad – and resolved to help other people in the same situation.

A Microsoft spokeswoman, Ms Caitlin Roulston, offered a similar statement, noting that the company has a Web form allowing people to request removal of a link to nude images of themselves from Bing search results. The statement encouraged users to adjust safe search settings to “block undesired adult content” and acknowledged that “more work needs to be done”.