Last May, the Cybercrime Investigation Unit of the Seoul Metropolitan Police Agency announced the arrest of five individuals for illegally creating and distributing doctored pornographic images using photos of female acquaintances. Two of the offenders, both graduates of Seoul National University (SNU), took photos of 61 victims, including SNU alumni, from their personal social media accounts without consent. They manipulated these photos by combining them with explicit content to create over 400 doctored images, distributing them through a private Telegram channel.
As digital technology advances rapidly, the scale of digital sexual crimes is expanding across online platforms. Deepfake sex crimes, where victims are easily targeted through social media or the internet and their images are manipulated into explicit photos or videos, are becoming increasingly serious. The Readable has conducted multiple interviews with law enforcement agencies and civil organizations that are closely monitoring these crimes.
The Women’s Human Rights Institute of Korea (WHRIK), an organization dedicated to preventing violence against women and supporting victims, published the “2023 Digital Sexual Crimes Victimization Report.” According to the report, the number of victims of digital sexual crimes rose from 1,315 in 2018 to 8,983 in 2023, marking an increase of 7,668 over five years. Additionally, the number of victims of illegally manipulated content increased by 359 during the same period.
[Chart: Number of Victims and Cases Related to Illegally Manipulated Content]
“Generative Artificial Intelligence (AI) technology is advancing rapidly, making it easier to create illegally manipulated content,” said a representative of the Women’s Human Rights Institute of Korea (WHRIK) in an email interview with The Readable. “As a result, crimes involving the creation of fake pornography are on the rise.”
The crime of creating fake pornography is becoming more prevalent among teenagers. On July 30, Jeju Seogwipo Police Station announced that it had referred a 14-year-old student to prosecutors. The student, who attends an international school on Jeju Island, created sexually abusive images of 11 victims by combining their faces with explicit photos of women that had already been posted on the internet. The victims are all female students who attend the same school as the offender. Three of his classmates who viewed the abusive images were also referred to prosecutors.
On July 18, the Korea Communications Standards Commission (KOCSC) released statistics highlighting the rise in requests to remove abusive sexual content from the internet, particularly exploitative deepfakes that merge people’s faces with pornographic images or videos. From January to June this year, such requests surged by 84 percent compared to the total number of requests received in all of 2023.
[Chart: Number of Correction Requests]
[Chart: Number of Requests for Removal of Abusive Sexual Content on the Internet]
Additionally, ‘illegally manipulated content’ ranked second on the list of digital sexual crimes for which the KOCSC received requests.
Meanwhile, the Cybercrime Investigation Division of the Korean National Police Agency stated that the term ‘deepfake’ is now used to describe all forms of illegal synthetic image compositing. This broad definition is intended to help the public understand crimes involving generative AI in the creation of abusive sexual content, even when some cases are not explicitly covered under current laws.
Fake pornography crimes using deepfake technology are also a significant issue abroad. In January, multiple news outlets in the United States reported that a photo of Taylor Swift, the renowned American singer, had been manipulated with deepfake technology and combined with sexual imagery to create a pornographic image of the celebrity. The modified image garnered over 45 million views in approximately 17 hours on X (formerly Twitter). Although X suspended several accounts and removed the fake image, it had already been disseminated across other social media platforms and continued to spread.
On March 19, the Italian news outlet ANSA reported that Italian Prime Minister Giorgia Meloni was seeking 100,000 euros in damages in a civil trial against two individuals. The defendants had used deepfake technology to superimpose her face onto footage from pornographic videos. The altered video was posted on a U.S. pornographic website and received millions of views, according to ANSA.
The Readable contacted the Busan Metropolitan Police to discuss the severity of deepfake crimes. A spokesperson stated, “Deepfake sexual content is relatively easy to create, leading some offenders to not take the crime seriously. However, victims endure significant mental trauma.”
“As the internet and deepfake technology evolve, abusive content can spread easily,” said Bae Sang-kyun, a Research Fellow in the Criminal Policy Research Division at the Korean Institute of Criminology and Justice. “Thus, prosecutors and police prioritize cases of sexual victimization to prevent secondary trauma.” He further emphasized, “While the creation of fake pornography is a grave issue, the continued spread of illegal videos due to inadequate removal efforts is also a serious societal problem that needs to be addressed.”