Women suffer most from deepfake abuses, and as cases rise, victims need better ways to regain control

Being deepfaked can feel like sexual assault, stripping victims of control, identity, and dignity. With the rise of AI-generated content, experts tell CNA Women that digital sexual violence against women is becoming normalised, and why it’s crucial for victims to regain control.

Women make up a staggering majority of deepfake pornography victims, and society risks normalising online sexual violence against women, say experts. (Photo: iStock/RyanKing999)

When lawyer Stefanie Yuen Thio’s colleague told her about suggestive videos and photos of her circulating on TikTok, an intense dread filled her.

“I have been deepfaked,” the joint managing partner of TSMP Law Corporation and chairperson of SG Her Empowerment (SHE) wrote in an Oct 19 LinkedIn post.

“When I saw those racy videos and photos of me, I felt shocked and confused – the images were fake yet disturbingly real,” she said. 

Seeing that there was no nudity, her shock turned into “strange relief” and then, “a violent, palpable sense of violation”.

Deepfakes are realistic but fabricated videos, audio, or images generated with artificial intelligence (AI), making someone appear to say or do something they didn’t. While they can be used for humour or creative experimentation, most deepfakes today are non-consensual AI pornography.

Fuelled by increasingly accessible AI tools, the number of deepfake videos online has grown exponentially.

A report by Sumsub, a UK-based tech company specialising in online fraud, showed that the Asia-Pacific region recorded a 1,530 per cent rise in deepfake cases between 2022 and 2023, ranking second globally after North America.

A 2019 report by Sensity AI, a Netherlands-based AI threat detection platform, found that about 96 per cent of deepfakes were non-consensual sexual content, more than 90 per cent of which featured women.

Reports of AI-generated child sexual abuse material – including deepfakes of mostly young girls – have also surged. According to the US-based National Center for Missing & Exploited Children, such cases jumped by 1,325 per cent, from 4,700 in 2023 to more than 67,000 in 2024.

In Singapore, SHECARES, a support centre run by SHE in collaboration with the Singapore Council of Women’s Organisations, has handled over 440 cases of online harm since its launch in 2023, including deepfake and AI-generated pornography.

Yet How Kay Lii, chief executive officer of SHE, noted that reported figures likely represent only a small fraction of actual deepfake abuse cases, as many incidents go unreported.

A 2023 SHE survey found that young women aged 15 to 34 were twice as likely to experience online sexual harassment, including being deepfaked or sent intimate images without consent. In the same survey, more than 70 per cent of women aged 15 to 24 knew a female friend who had faced some form of online sexual harassment.

Although the Sensity AI report observed that most deepfake pornography targets female celebrities and politicians, anyone can be a victim.

Last year, South Koreans staged public protests after several women were subjected to AI-generated pornography created by their peers, in what was called the country’s “deepfake porn crisis”. In Singapore, male students at the Singapore Sports School created and shared deepfake nude images of their female classmates. The father of one victim told CNA it wasn’t “just one or two” boys, but “a huge group of boys”.

Sugidha Nithiananthan, AWARE’s advocacy and research director, said: “Online sexual harms and deepfake abuse are growing more insidious and widespread, and women are more vulnerable than ever – both the law and society need to do more to keep up.”

THE CONTENT IS FAKE, BUT THE VIOLATION IS REAL

When Yuen Thio first saw the photos and videos of herself online, she wasn’t sure what to believe – they looked so real. Despite having worked with SHE for over three years and knowing that online sexual violence is never the victim’s fault, she still felt some self-blame and found herself wondering if she had invited it by being visible on social media.

Women who were deepfaked report feeling shame, guilt, shock, and other emotions similar to being physically assaulted. (Photo: iStock/PeopleImages)

“My friends, whom I opened up to, had to attack this mentality head-on and tell me, ‘It’s not your fault, Stef!’” she said. “But it still takes a while to really internalise that.”

Mahima Didwania, a clinical psychologist at The Other Clinic, explained that even though deepfakes involve no physical contact, the emotional toll can be just as severe.

“Knowing that a digital version of you is in someone else’s hands, that they can do whatever they want to your image, can be terrifying and provoke helplessness,” Didwania told CNA Women.

“Even when we know something is fake, our minds struggle to distinguish what’s real. It’s why we fear the dark, or heights, or snakes, even without direct experience – the emotional response is still real,” she said.

AWARE’s Sugidha added: “The fact that it is not a physical assault, or that it is not even your body depicted in the deepfake, does not diminish the trauma experienced by survivors.”

According to SHE’s 2025 report on the lived experiences of survivors of online sexual violence, victims described feeling panic, shame, and fear that the video would keep resurfacing. Several victims also reported sleeplessness, anxiety and even thoughts of self-harm, feeling powerless to stop the content from spreading.

How said: “Something ‘not real’ can still feel viscerally violating. Beyond the psychological harm, it damages reputations and strains social relationships. It’s the dissonance between ‘it’s fake’ and ‘everyone thinks it’s real’ that intensifies the humiliation and helplessness.”

WHY REGAINING CONTROL HELPS

In the aftermath of such violations, many women describe an urgent need to reclaim agency and take back ownership of their image, even symbolically, Didwania said.

Yuen Thio echoed this. After recovering from the initial shock of seeing the deepfakes, she immediately thought of what she could do, including reporting them to TikTok under “misinformation”.

Her colleague, who had first seen the videos, did the same. When Yuen Thio couldn’t bear to look at the content anymore, her friends helped monitor the anonymous account and informed her when the content was removed days later.

“Getting the deepfakes removed is only the beginning of regaining control,” she said. “Work doesn’t stop when the report is made. Allowing women the space to process what happened and to speak freely is just as, if not more, important.”

Regaining control can look different for everyone, said Didwania. “For some, it comes from reporting, pressing charges, speaking out, setting boundaries, or simply acknowledging what happened. For others, it’s journaling or confronting the perpetrator.”

For Singaporean multimedia artist and photographer Charmaine Poh, 35, reclaiming control took a different form.

She wasn’t deepfaked, but when she was 12, she experienced a similar loss of agency after discovering dozens of online comments sexualising her pre-teen photos and videos on unregulated forums. At the time, she was acting in We Are REM, a Mediacorp show from the early 2000s.

Her 2023 multimedia work, Good Morning Young Body, created two decades later, was a response to that digital violation. Using deepfake technology, she recreated and recontextualised the same pre-teen footage, once the subject of digital harassment, this time under her own direction.

A still from Charmaine Poh’s artwork, Good Morning Young Body, in which the artist used deepfake technology to regain the control she lost after reading obscene comments about her pre-teen body. (Photo: Charmaine Poh)

“I wasn’t deepfaked, yet I still felt violated and cried all night after reading those online comments about what male strangers said they’d do to me – and I was just 12,” Poh said. “The project was a way to give my 12-year-old self a voice from my 33-year-old self.”

She added: “Deepfakes are often used to exploit women, distorting our sense of truth and reality. To subvert that form and fill it with my own voice and agency was illuminating and liberating.”

Other victims find solace in talking about their experience.

“Writing about it online – and knowing that helps other women who are going through the same thing feel less alone – was healing for me,” Yuen Thio told CNA Women.

“More women have come forward since reading about my experience. I’m grateful to be much more attuned to what other survivors feel, and I want other women to not feel as if they need to go through the horror themselves,” she added.

Still, for victims, the digital permanence of deepfakes can make recovery feel impossible.

“They may believe the video will always exist, that they can never undo what was done or control who’s seen it,” Didwania said. “That sense of helplessness is part of the trauma – the feeling that you can’t function or move forward. It’s a normal response to something deeply unjust.

“Healing rarely happens overnight. It can take years, even decades. But small actions – reclaiming a narrative, speaking out, or creating art – are powerful steps. The trauma doesn’t have to define them; they can grow beyond it,” Didwania added.

CORPORATIONS AND THE LAW NEED TO STEP UP

Both SHE and AWARE stress that the first and most urgent step for deepfake victims is to stop the spread of the video. 

One of the most crucial actions for victims of deepfakes is to stop or minimise the spread of the content. (Photo: iStock/MTStock Studio)

How said: “Most victims start by reporting directly to the platforms, but face long waits and inconsistent outcomes. Some go to the police, but investigations slow when perpetrators hide behind anonymity or are overseas.”

TikTok’s community guidelines state that if content violates its rules, the platform may remove it, ban the account, or report incidents of youth sexual exploitation to the authorities.

Similarly, Meta – which owns Facebook, Instagram, and WhatsApp – allows users to report non-consensual sexual content or threats to share such material. Reports are reviewed around the clock in more than 70 languages, and if deemed inappropriate, the content is removed and the offender’s account may be disabled.

If victims wish to press charges, they must file a police report. The Singapore Police Force told CNA Women that victims needing additional support can request a Victim Care Officer – volunteers trained by police psychologists to provide emotional and practical assistance throughout the criminal justice process, from investigation to case closure.

In practice, however, enforcement remains inconsistent, SHE’s How said.

In Yuen Thio’s case, the removal of the deepfakes took four days. How added that some cases can take up to a week, while others see no action at all – often because platforms decide the account didn’t explicitly violate their rules or the content wasn’t deemed “severe” enough. Such inconsistencies can be deeply distressing for victims.

“Victims often face confusing, fragmented processes when trying to report deepfakes or get them taken down,” How added. “Even after reporting, they wait helplessly for platforms to act, and every second is psychological torture as they wonder who else is watching.”

The permanence of digital content adds to the despair.

Even after TikTok removed Yuen Thio’s deepfakes and disabled the account, the content still appeared in Google search results, forcing her to file a removal request with Google.

“Once uploaded, a deepfake can be copied, forwarded and reshared infinitely,” said How. “Many victims, especially young women, find the process of reporting across multiple channels intimidating and bureaucratic. They need stronger tools to tackle anonymity and responders who understand the trauma of online sexual abuse.”

Hence, both AWARE and SHE welcome the creation of the Online Safety Commission (OSC) under the Online Safety (Relief and Accountability) Bill, which parliament passed in November 2025. The OSC is expected to be operational by early 2026.

Once it is operational, victims will be able to report harmful content directly to the commission if social media platforms fail to remove it within 24 hours. The OSC can then order platforms to act immediately: remove the content, restrict accounts, or disclose identifying information about anonymous perpetrators so that victims can pursue claims or protect themselves.

Initially, the OSC will handle five types of harm: online harassment, doxxing, online stalking, intimate image abuse and image-based child abuse. Sexual deepfakes fall under the latter two. Over time, the OSC will expand to cover other harms, including other types of deepfakes, such as online impersonation.

“Fast action is critical to stop images before they spread out of control,” said AWARE’s Sugidha. “These measures will strengthen recourse for survivors, and we hope OSC will do so swiftly.”

START BY TALKING TO KIDS, ESPECIALLY BOYS

To many victims and organisations like SHE and AWARE, tackling tech-facilitated sexual violence must go beyond content removal and what victims can do after the harm is done.

One of the ways to address online sexual violence is to start by teaching young children, especially boys, comprehensive sex education and imposing strict consequences. (Photo: iStock/CG Tan)

“More must be done to stop the violence from happening in the first place,” said Sugidha. “The issue must be tackled dynamically, through a coordinated effort across all levels of society.”

EveryChild.SG, a non-profit organisation advocating for child well-being, emphasised that addressing deepfakes requires a multi-layered approach grounded in education, empathy, safety, and legal support – particularly by teaching young boys about consent and digital responsibility, and introducing strict consequences where necessary.

“Difficult conversations on sex and sexual violence must start young,” said Dr Hana Alhadad, EveryChild.SG’s research and advocacy advisor. “Even if your boy doesn’t use AI directly, in this day and age, he will very likely see AI-generated content. Adults must then talk about what boys are consuming and the implications of it.”

Sugidha added: “As long as the technology exists, people will continue to misuse deepfake tools to commit sexual violence against women they know.

“The reason someone commits such violence often stems from a lack of respect for women’s consent and bodily autonomy. Perpetrators must understand that what they’re doing is wrong.”

She added that prevention must start early. “It’s crucial to teach comprehensive sex education – including consent, gender stereotypes, gender-based violence, and gender inequality – so that future generations are less likely to commit assault, in whatever form it takes.”

How added: “The OSC may serve as a one-stop avenue with clearer pathways and faster escalation, but for it to succeed, it must work closely with counsellors, social service agencies, and community partners to keep its processes empathetic and grounded in real experiences.”

CNA Women is a section on CNA Lifestyle that seeks to inform, empower and inspire the modern woman. If you have women-related news, issues and ideas to share with us, email CNAWomen [at] mediacorp.com.sg.

Source: CNA/iz