Imagine waking up one morning, ready to start your day, when calls and messages pour in from colleagues, friends, near-perfect strangers. All of them ask you the same question: "Is that really you in this video?" A video or image you know nothing about is circulating online. Some have doubts; others are already convinced it is you, the content looks so realistic. Confusion quickly gives way to fear and shame.

This is what Elliston Berry, 15, experienced when she discovered that fake photos of her were circulating on Snapchat.

"I had PSAT tests and a volleyball match. The last thing I needed was to worry about fake nudes of me going around the school," she says.

According to her account, a classmate downloaded a photo from her Instagram account, then used an artificial intelligence tool to generate a nude version of it. The altered image quickly spread across social networks. Her mother, Anna McAdams, remembers:
"She came into our room in tears, saying: 'Mom, you're not going to believe what just happened.'"

For Yvonne Meré, a deputy city attorney, the case goes beyond a simple technological question: "This is not about AI. It is sexual abuse."

Digital and invisible exploitation

According to a Security Hero study, 98% of deepfake videos online are pornographic, and 99% of the victims are women. The consequences can be tragic. In 2022, Basant Khaled, 17, took her own life in Egypt after being threatened with the release of fake pornographic images generated by AI. A man had harassed her to obtain a relationship, then threatened her with doctored images. These quickly spread through her village, plunging her into deep distress. Before ending her life, she left a letter for her mother:
"Mom, it's not me in these photos. They have been modified. I beg you, believe me."

[Photo: Basant Khaled. Credit: Egyptian Streets]

Her sister testified: "We tried to support her, but she could no longer bear the looks and the rumors."

In 2023, more than 95,000 deepfake videos were online, an increase of 550% compared to 2019. Today, a clear image of a person's face and a few seconds are all it takes to generate a pornographic video with free applications. In 2024, more than 200 tools of this type were freely accessible on the internet. The videos created this way have been viewed more than 4.2 billion times, according to a Channel 4 News investigation - almost three times the audience of the last World Cup final.

Ordinary and famous victims reduced to silence

The Channel 4 investigation team examined the five largest platforms hosting doctored pornographic content, where manipulated videos of famous women circulate. But few victims are willing to testify, for fear of drawing more attention to the content.

Cathy Newman, a Channel 4 News presenter, made an exception after discovering that she appeared in a sexual deepfake video. "I have covered wars, disasters, human tragedies... But nothing had prepared me for this. Even knowing it was not real, I felt violated," she said after watching the video.

[Photo: Cathy Newman]

Sophie Parrish, a 31-year-old florist, was also a victim. Images from her social networks were used by a family acquaintance to create pornographic deepfakes. "I was physically sick. I no longer trust anyone. Even other people's looks make me uncomfortable," she says. The man behind the act was arrested but never charged, in the absence of a clear law on the subject. "I don't feel protected by the law," she adds.

Famous women like Taylor Swift, Jenna Ortega, Alexandria Ocasio-Cortez and Giorgia Meloni are also among the victims. And despite their fame and their legal and media resources, they too struggle to have this content taken down.

[Photo: Sophie Parrish]

Laws still insufficient in the face of the scale of the phenomenon

A study by ESET, a cybersecurity company, reveals that 50% of Britons fear becoming victims of this type of content, and 9% say they have already been victims or know a victim. While some men are targeted in videos manipulated for political ends, women are targeted mainly for sexual purposes, humiliation or revenge.

At the same time, the phenomenon brings in considerable sums for certain platforms and technology companies.

Deepfake porn is not only digitized sexual abuse: it is a serious assault on dignity. It is now enough for a woman to post a simple photo in a dress to risk seeing it diverted and transformed into non-consensual pornographic content. Artificial intelligence then becomes a weapon against women and girls, dehumanizing and exploiting them.

The repercussions are heavy. Victims often suffer from post-traumatic stress, anxiety, depression, even suicidal thoughts. Their reputation and their career - especially in public fields such as journalism, politics or education - can be irreparably damaged. Even if the videos are fake, they continue to circulate and sow doubt.

Justice still out of reach

Legally, the fight is complex, long and expensive. Some victims, such as Breeze Liu, spend years trying to have videos of themselves removed, in vain. "I feel like I'm fighting a ghost," she told Wired.

While some countries, such as the United Kingdom, and certain American states have adopted laws criminalizing these practices, their enforcement remains limited. Regulators are demanding more accountability from platforms, while technology companies work to develop better detection tools.

Organizations such as the Cyber Civil Rights Initiative, Paradigm Initiative, Kuram, Lawbrella and TechSocietal campaign for stronger legal protections.

But as long as legislation is not strengthened, mentalities do not evolve, and society does not seriously defend the victims, deepfake porn will continue to spread. And for many women, justice will remain a lonely fight. The question remains: how do you fight something that is not even real?

Article written by Charity Ani Kosisochukwu