China Daily: A shelter in the rain

Online image-based sexual violence has wrought havoc on scores of people in Hong Kong. Oasis Hu reports from the city’s first victim crisis center where she finds that tackling the problem is tough and complicated. 

Blue Li Yeuk-lam recalled the repeated shocks she had to endure while going about the harrowing task of removing nonconsensual intimate images from the internet.

The images, depicting victims’ private moments, were uploaded to pornographic websites without their consent, interspersed among other explicit content and often given degrading titles that fabricated pornographic narratives.

The videos were usually accompanied by repulsive comments or sexually charged remarks taunting the victims.

Li, in her 20s, is among a handful of people in Hong Kong whose job is to take down these sordid online images. She has been working for three years at RainLily — the city’s first one-stop crisis center, set up in 2000, which introduced an assistance service in 2021 to help victims remove nonconsensually distributed images.

Image-based sexual violence has become more prevalent in the past decade, encompassing a range of behaviors like clandestine photography, nonconsensual sharing of intimate images, extortion through the dissemination of private photos, and the manipulation of victims’ likenesses using tools like Photoshop or deepfake technology to fabricate pornographic materials.

These actions form a continuum of violence in which filming, distribution and coercion are intertwined, resulting in substantial physical and psychological harm to victims.

The Hong Kong Special Administrative Region enacted the Crimes (Amendment) Ordinance in 2021, criminalizing four image-based sexual violence offenses — voyeurism; unauthorized recording of intimate parts of victims; publication of images originating from voyeurism or unlawful recording; and publication or threatened publication of intimate images without consent. The offenses carry a maximum penalty of five years’ imprisonment.

Despite the tough new law, crimes involving image-based sexual violence have continued to climb. Last year, the police logged 1,040 such cases, with the data indicating an average of more than one case of illegal sex photography or voyeurism a day in Hong Kong.

In August, local media uncovered a website that openly solicited money through crowdfunding. Once the target amount was reached, contributors could access a “virtual room” with videos and images of women who had been surreptitiously filmed. The website boasted of having more than 20 such rooms, with an “unlocking fee” as high as HK$200,000 ($25,650). More than 20 women had been victimized.

Doris Chong Tsz-wai, executive director of RainLily, says victims of image-based sexual violence require psychological counseling and legal recourse, and their most pressing need is to get their photos removed from the internet immediately. But it isn’t a walk in the park.

Legal loopholes

According to Chong, nonconsensual images circulate across a range of online platforms, such as pornographic websites, social media applications, search engines and content farms. Content farms are websites that produce large amounts of low-quality, duplicated content to achieve high rankings on search engine results pages.

If these platforms publish nonconsensual intimate images of individuals under the age of 16, they are illegal. Under the Prevention of Child Pornography Ordinance, activities relating to child sexual exploitation materials, such as producing, duplicating, possessing and filming pornographic videos or pictures involving individuals below 16, are criminal offenses regardless of whether consent has been given. Offenders publishing such materials face a maximum of eight years’ imprisonment and a fine of HK$2 million.

Child sexual exploitation laws, in Hong Kong as elsewhere, are relatively comprehensive, placing heightened responsibility on internet service providers to address and remove child sexual exploitation materials.

By contrast, under the Crimes (Amendment) Ordinance 2021, only the people who leak nonconsensual intimate images are deemed to have committed a criminal act; the internet service providers that host such images may get off the hook.

Websites hosting nonconsensual intimate images often claim to be mere platforms, not content producers, senior RainLily officer Vince Chan Wun-sing says.

The 2021 ordinance, however, does include a provision for disposal orders, allowing the court to order a defendant to remove nonconsensual images. Such an order can only be obtained once the case has been prosecuted, and the process from police involvement to court proceedings could take six months to a year. During this period, the intimate content could have been circulated countless times, says Chan.

Responding to an inquiry from China Daily Hong Kong, police said that, when handling such cases, they promptly request the online service provider to remove the content, and that they will continue to refine procedures when appropriate.

Most victims of online sexual harassment cannot afford to wait for law enforcement to run its course, and instead seek help from the Office of the Privacy Commissioner for Personal Data (PCPD).

Under the existing Personal Data (Privacy) Ordinance, the act of distributing intimate images without consent, in the absence of additional personal data such as an address or ID number, is not deemed a breach, even if the images disclose facial features. This legal stance presents a hurdle for victim-survivors seeking the removal of such images on privacy grounds, Chan says.

In response to an inquiry from China Daily Hong Kong, the PCPD rejected the accusation.

The classification of an intimate image as “personal data” depends on the image’s potential to reveal a person’s identity. If identifiable features like facial characteristics allow for identification, the image may qualify as “personal data” under the ordinance, even without additional personal information, the office says.

In practice, however, victims often find that both the police and the PCPD fail to get such images removed. For instance, Jess (not her real name), a Hong Kong resident, had her intimate images posted online by her ex-partner. Despite her attempts to involve the police, no action was taken.

Li said that she knows of several victims whose intimate videos were uploaded onto Telegram. The victims approached the police and the PCPD to help them remove the videos, but their requests were ignored.

Ultimately, many victims are forced to take matters into their own hands, but will find the process of taking down the images complicated and time-consuming.

Ongoing challenges

Nick (not his real name) and his partner’s private images were nonconsensually shared on a Telegram channel with 30,000 to 40,000 members. When Nick contacted the channel administrators himself, they dismissed his concerns, with some demanding more photos for identity verification and others extorting him for payment to delete the images. Feeling powerless, the couple turned to RainLily.

“This case allowed me to deeply understand the retraumatization victims experienced. I began to realize the significance of my job,” says Li. On receiving a plea for help, she would immediately try to contact the online platforms involved and identify herself as a representative of a sexual violence support service, informing them about the existence of nonconsensual photos and urging them to remove the content.

Because the emails she sends carry no legal authority, it is entirely at the discretion of the websites concerned whether, or when, to respond.

Among online service providers, pornographic sites are the most significant sources of nonconsensual intimate images, with more than 40 percent of reports received by RainLily involving these sites. But very few of them have dedicated reporting systems for handling such content.

Normally, porn websites prohibit materials that infringe the copyright of content owners. However, in nonconsensual intimate imagery cases, the copyright is usually deemed to be held by the “uploaders” or “photographers” rather than the victims who were filmed. Hence, victims often cannot rely on copyright claims to have the content removed.

Among social media platforms, Meta, Telegram and LIHKG have been identified as the most widely used. Meta, which encompasses Instagram and Facebook, has community guidelines addressing sexual exploitation, as well as mechanisms banning the sharing of nonconsensual intimate imagery, and it generally responds to complaints swiftly.

On the other hand, Meta sets a high threshold for proving nonconsent. Victims must meet specific criteria, such as a visible match between the individual depicted in the image and the person reporting the content. These criteria can be hard to fulfill, as the materials may lack identifiable faces.

Telegram, LIHKG and content farm websites exhibit a notably low response rate. Sometimes, they completely ignore reports.

In dealing with intimate photos posted online, Li would try to identify all the websites on which they had been circulated, occasionally processing hundreds of links in a single case. If a pornographic website failed to respond, she would contact its associated web-hosting provider. If an online forum did not respond, she would approach the platform’s channel administrators.

There have been occasions when, even after Li had successfully resolved a case, the same victim sought her help again weeks or months later after discovering that the content had been reposted. She would then have to repeat the reporting process all over again.

Empowering victims

Li says she has overcome her unease at the nonconsensual materials she has encountered over the years, but she still cannot stomach the online comments targeting victims.

“I can’t understand why those who’ve been hurt have to bear further scrutiny and condemnation,” she says.

Jess recalled that after her ex-boyfriend posted her private images online, others made the situation worse. “These people lack knowledge of my story or character, yet they continue to criticize me based on these one-sided details,” Jess says. “I wish the city could exhibit more empathy toward individuals in similar situations.”

According to Chong, anonymity is often exploited in tech-facilitated gender-based violence, enabling abusers to evade responsibility and accountability for their actions. In cases of image-based sexual violence, victims are always blamed for agreeing to take or record the images.

However, a person agreeing to being photographed does not imply consent to the distribution of such photos, Chong says. Whatever the reasons for taking the photographs, they do not justify the dissemination of those images without the depicted person’s consent. Blaming victims not only inflicts secondary harm on them, but also diverts attention from holding the perpetrator accountable for violating consent and bodily autonomy.

Tackling the problem at its root calls for more wide-ranging sex education, Chong says, adding that it’s crucial to build a society that does not watch, share, or mock nonconsensual intimate content, and that does not blame the victim.

RainLily says that between April 2021 and March last year, it provided “take-down” services in 171 cases, involving 1,342 nonconsensual private videos reported to have been posted on online platforms. Nearly 90 of them were eventually deleted.

About 70 percent of those who sought help were women and 20 percent were men. In about half of those cases, the victims knew the perpetrators, with 27.5 percent of them being intimate partners and 15.8 percent online acquaintances. Some 17 percent of the victims had been threatened by the perpetrators when the images were taken or distributed.

Li says she has been contacted by a growing number of help-seekers in recent years from places outside Hong Kong, including the Chinese mainland, where the advice and support they need are lacking. “Although it’s difficult to support help-seekers outside Hong Kong, we continue to offer as much assistance as possible,” she says.

In cases where photos and videos cannot be taken down for whatever reason, she says she goes all out to get in touch with the victims, explaining the situation, apologizing for being unable to help and asking whether they need further support.

Upon receiving such emails, some victims have gone so far as to console Li in return, telling her, “You do not need to feel sorry, as you are not the one at fault.”