Adam Au and Felicia Feiran Chen

Opinion | Deepfake porn scandal in Hong Kong exposes need to update laws

Not since the 2008 Edison Chen sex photo leak has the city faced such a crisis in privacy protection as laws struggle to keep up with AI

A man takes a photograph on the University of Hong Kong campus on June 4. The case of a student who created hundreds of deepfake photographs of fellow students has sparked discussion about whether Hong Kong’s law can adequately deal with such cases. Photo: Dickson Lee
The deepfake scandal at the University of Hong Kong – hundreds of non-consensual, sexually explicit composites reportedly found on a student’s laptop – feels like a sequel to a much older story from 2008, when intimate photographs copied from actor Edison Chen’s computer were leaked on the internet.

Then, the harm was privacy invasion: real images, created in private, thrust into public view. Now, the harm arrives through fabricated images: generated by artificial intelligence (AI) with enough likeness to stain reputations. The difference? Anyone today with a social media footprint can be targeted.

The Edison Chen episode marked a turning point in the city’s internet culture and gender discussion. Police pursued distributors of the stolen photos under computer misuse laws. Reflecting gender double standards, the women celebrities among the victims bore the brunt of the fallout. Framed as a question of morality rather than theft, the women’s consent to having their photos taken in private was conflated with consent to their public display.


Deepfakes invert that logic. When synthetic images leak, the knee-jerk dismissal is that they are not real. Yet their mere existence is enough to tar reputations. All it takes is a single selfie scraped from social media, which can be repurposed offline.

Hong Kong’s legal architecture, including a privacy statute drafted in the 1990s, does not map perfectly onto this shift. Some argue that the Personal Data (Privacy) Ordinance, designed for conventional data-processing models, may not neatly address hyperrealistic fabrications produced by machine learning. Harvesting publicly available photos may not breach its collection rules, and where the resulting images are never distributed, a legal grey zone opens up.
Advertisement

Hong Kong does not recognise a general right to control one’s likeness, and the traditional legal tools against harassment and defamation were developed for conduct predating AI. Therefore, harm-inflicting actions could escape sanction until they spill into distribution, by which time the damage is often irreparable.

The city’s privacy watchdog has launched a criminal investigation into the deepfake case, but declined to comment further. Its ultimate stance will clarify whether creation or possession of such images prior to distribution is covered by existing law, and what remedies are available.