Deepfake Image-Based Sexual Violence Inflicts Profound, Real Harm and Long-Term Trauma, Says Victim-Survivor

Association Concerning Sexual Violence Against Women
on 'Deepfake' Image-Based Sexual Violence and Related Figures

The recent public disclosure by victim-survivors of 'deepfake' image-based sexual violence has brought the profound impact and explicit gender hostility of such acts into the public discourse in Hong Kong. While some may dismiss deepfake content as unreal and harmless, the lived experiences of victim-survivors clearly demonstrate the tangible and lasting consequences of 'deepfake' image-based sexual violence (IBSV).

Janice, a victim-survivor of 'deepfake' IBSV, shared how the incident years ago caused her prolonged fear, highlighting the ease with which individuals can now be harmed through image theft: "Even if it's not real, it's incredibly difficult to explain to the whole world, because not everyone will believe you. I constantly worried about being recognised, followed, or ridiculed, and I just wanted to hide at home every day. At the time, I shut myself off, crying secretly late at night, terrified each day that people around me would receive the fake images and judge me. I deactivated all my social media accounts, avoided friends, and even skipped work. At work, I was scared my company would find out and I’d lose my job. The nightmares and fear never stopped; even after a considerable time has passed, I'm still so worried it might happen again."

"Over the past few years, we have occasionally received requests for assistance related to synthetic or 'deepfake' intimate images. We understand the public's difficulty in comprehending and acknowledging such experiences, which also affects victim-survivors' willingness to seek help. The Association believes that beyond the criminalisation of 'non-consensual creation of synthetic intimate images' at an individual behavioural level, the discussion should also focus on culture and education, enhancing public digital literacy, and including discussions on the use of relevant platforms and artificial intelligence tools. Promoting responsible technology use is crucial to effectively mitigate gender inequality in online spaces and reduce technology-facilitated sexual violence," said Doris CHONG, Executive Director of the Association Concerning Sexual Violence Against Women.

Digital literacy encompasses not only technical skills but also critical thinking about digital content. This incident has shed light on how technologies like artificial intelligence can lead to anxieties and fears among women and other social groups about their photos being misused, thereby eroding their right to safely use online spaces and deepening gender inequality. It also highlights the negative impacts, responsibilities, and potential harm that 'deepfake' technology can inflict on others. Authorities must proactively invest resources to promote education on respect, affirmative consent, and boundaries, while simultaneously working at various levels to protect citizens from infringements caused by 'deepfake' synthesis of personal features (including faces, bodies, and voices).

Many social media and online platforms currently lack effective regulation and strategies for handling 'deepfake' synthetic intimate images, which facilitates the widespread sharing of IBSV content. The Association believes that online platforms should take greater responsibility in preventing the creation and sharing of such content and implement robust measures to prevent the misuse of related technologies, thereby safeguarding user safety.

In the 2024-25 service year, RainLily's Image-Based Sexual Violence Take-Down Support service recorded 11 requests for assistance involving synthetic or 'deepfake' intimate images (see Appendix). If you are experiencing IBSV and need emotional support, takedown assistance, or legal consultation, please contact the RainLily Sexual Violence Helpline at 2375 5322 via phone or WhatsApp.


Appendix: Requests for assistance concerning synthetic intimate images handled by RainLily's Image-Based Sexual Violence Take-Down Support service over the past three service years.

Service Year    Synthetic intimate image cases received
2022-23         7
2023-24         8
2024-25         11