An International Perspective: Addressing Challenges to Protect Children’s Rights from Online Sexual Exploitation in the Digital Era

The rapid evolution of technology in the digital era has brought to light the profound severity and prevalence of online child sexual exploitation. Recent news reports revealed that more than a thousand images of child sexual abuse existed in LAION-5B, an open-source image database originally intended to serve as training data for artificial intelligence (AI) models. This incident, far from isolated, is merely a glimpse into a broader problem. At the same time, even as large platform companies like Meta report millions of cases involving child sexual abuse material, controversies persist over algorithmic recommendation systems and message encryption technologies. Determining the boundary between the protection of children’s rights and personal data privacy has become an unavoidable issue. We should learn from international practical experience and actively engage in relevant legislation and policy-making, so as to collaboratively tackle child sexual exploitation and ensure that every child can enjoy a safe and carefree childhood in the digital era.

1. Current Status of Online Child Sexual Exploitation

On December 20, 2023, The Washington Post reported that researchers at the Stanford Internet Observatory had discovered more than 1,000 images depicting child sexual abuse within LAION-5B, a popular open-source image database. AI image generation models, including Stable Diffusion, rely on this database to create hyper-realistic photos. Representatives of LAION responded that they had temporarily taken down the LAION-5B data set “to ensure it is safe before republishing.” ¹

The rapid development of AI technology in the digital era enables anyone to generate lifelike images by simply inputting a brief description into an AI model. Given these models’ capability to rapidly learn from, train on, and recreate existing materials, the presence of over a thousand images depicting child sexual abuse in the LAION-5B database raises alarming concerns about what AI image generators could produce. Every child deserves a secure childhood, and in the online environment of the digital era, this incident is just the tip of the iceberg of child sexual exploitation.

According to the definitions published by the World Health Organization (WHO)², sexual exploitation refers to any actual or attempted abuse of a position of vulnerability, power, or trust for sexual purposes, including, but not limited to, profiting monetarily, socially, or politically from the sexual exploitation of another. Sexual abuse refers to actual or threatened physical intrusion of a sexual nature, whether by force or under unequal or coercive conditions³. In the digital era, the widespread proliferation of the internet and rapid advancements in science and technology have led to a rampant increase in child sexual crime content, making the protection of children’s rights in online spaces an urgent issue.

Data released by the Internet Watch Foundation (IWF), a UK non-profit organization, indicates a staggering 1,800% increase over the past decade in pictures and videos showing children being raped, tortured, and sexually abused⁴. According to the IWF’s 2022 annual report⁵, the organization assessed a total of 375,230 reports related to child sexual abuse content in 2022, marking a 4% increase from 2021 and a 25% increase from 2020⁶. Of the assessed reports, 255,588 were confirmed as containing child sexual abuse imagery, linking to such imagery, or advertising it. Among these, 199,363 reports were tagged as including self-generated content, constituting 78% of the reports confirmed as containing child sexual abuse content, a 9% increase from 2021. Self-generated content may be intentionally created and shared by minors, but in most cases it results from online grooming or sextortion⁷. According to the International Association of Internet Hotlines (INHOPE) 2022 annual report⁸, out of 587,852 reports of potential child abuse cases received throughout the year, 84% of Child Sexual Abuse Material (CSAM) reports involved never-before-seen material, depicting new as well as known child victims of sexual abuse. These data reveal the severity of child exploitation⁹ in the digital era, prompting the international community to take more robust measures to ensure the safety of children in online spaces.

Data published by the US non-profit organization Child Rescue Coalition (CRC) further underscores the ubiquity of child sexual exploitation. Among the 72.5 million unique IP addresses identified since the organization’s inception as sharing or downloading CSAM, the single most widely shared child abuse file has been linked to over 2.5 million unique IP addresses. This highlights the alarming speed at which child sexual abuse content propagates through online media, while its clandestine nature makes it difficult for non-profit organizations and judicial authorities to promptly identify and address illegal content, even with over 2.5 million flagged IP addresses. Even more distressing, the data indicates that predators typically have between 50 and 150 victims over the course of their lifetimes, and that as many as 1 in 5 girls and 1 in 20 boys will experience some form of sexual abuse before age 18.

2. New Challenges in the Digital Era: Controversies over Meta’s Algorithmic Recommendations and Message Encryption Technologies

In the digital era, the protection of children’s rights has become urgent and complex. While the big data and algorithmic applications of platform companies provide technical support for locating illegal content in a timely and dynamic manner, they may also inadvertently contribute to the proliferation of illegal activities. Take Meta as an example: despite its active collaboration with non-profit organizations to investigate and report child sexual abuse material, data released by the National Center for Missing & Exploited Children (NCMEC) on December 7, 2023 revealed that in 2022 alone, Meta reported more than 20 million incidents of offenders sharing child sexual abuse images via Facebook and Messenger. The technologies Meta employs have nonetheless raised numerous concerns, posing new challenges for children’s online protection in the digital era.

On June 7, 2023, The Wall Street Journal exposed Instagram’s promotion of child sexual exploitation content. The platform’s algorithmic recommendation system is designed to connect people with shared interests, resulting in the de facto facilitation of pedophiles’ searches for accounts posting and selling child pornography. Pedophiles used hashtags to find Instagram accounts publishing and selling illegal content, while the algorithm recorded and learned their behavior patterns, continuously recommending search terms and accounts associated with child sexual abuse. The article also noted that Instagram accounts distributing and selling illegal content typically refrained from openly posting it, instead using a content “menu” to discreetly invite buyers into specific transactions. This, coupled with inadequate platform regulation, allowed an extensive underground trading network to grow rapidly.

Following the publication of these research findings, Meta responded swiftly, claiming to have disabled the relevant hashtags, set up an internal task force to address the algorithmic recommendation issues, and taken measures to remediate illegal accounts on the platform. Worryingly, however, six months after the initial report, The Wall Street Journal released a follow-up on December 1, 2023, disclosing that the measures Meta had taken were less than satisfactory. Tests conducted by The Wall Street Journal and the Canadian Centre for Child Protection showed that Meta’s recommendation system continued to promote child pornography. Although the company took down hashtags related to pedophilia, its systems sometimes recommended new ones with minor variations; even when Meta received reports of suspicious accounts, it was spotty in identifying and removing them; and similar issues extended beyond Instagram to Facebook, where the company claimed to have hidden 190,000 groups in search results and disabled tens of thousands of other accounts, yet the results were deemed unsatisfactory.
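Part of the enforcement difficulty described above is technical: banning an exact hashtag string does nothing against near-identical variants with one character changed. The sketch below illustrates one common mitigation, fuzzy matching against a blocklist by string similarity. It is illustrative only, uses hypothetical benign placeholder terms, and is not a description of Meta’s actual moderation system.

```python
from difflib import SequenceMatcher

# Hypothetical blocklist (benign placeholder terms, not real hashtags).
BANNED = {"badterm", "illegalcontent"}

def is_near_variant(tag: str, banned: set[str], threshold: float = 0.8) -> bool:
    """Flag a hashtag whose similarity to any banned term meets the threshold,
    catching minor spelling variations that defeat exact-match filters."""
    normalized = tag.lower().lstrip("#")
    return any(
        SequenceMatcher(None, normalized, term).ratio() >= threshold
        for term in banned
    )

# An exact-match filter misses a one-character variant; fuzzy matching catches it.
print(is_near_variant("#badterm1", BANNED))  # True  (variant of a banned term)
print(is_near_variant("#holiday", BANNED))   # False (unrelated tag)
```

Real moderation pipelines combine such lexical matching with human review, since a similarity threshold alone produces both false positives and misses.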

Another controversy concerns the collateral damage to children’s online rights triggered by Meta’s upgraded message encryption in its social software. On December 6, 2023, Meta announced on its official website that it would apply end-to-end encryption (E2EE) to private chats and calls across Messenger. This technology ensures that message and call content is encrypted throughout the entire process, from the moment it leaves the sending device to the time it reaches the receiving device, making it inaccessible to any third party during transmission unless users choose to report the information to the platform. From the perspective of individual data privacy, this is undoubtedly a significant technological advancement. However, on December 7 and December 8, 2023, the NCMEC and the National Police Chiefs’ Council (NPCC) of the United Kingdom respectively released articles on their official websites urging attention to the technology’s impact on the protection of children’s online rights. NCMEC emphasized that for more than a decade, Meta had chosen to aggressively detect and remove images and videos of children being sexually abused and exploited. Through those efforts, Meta demonstrated that Messenger and Facebook are misused by individuals sharing illegal child sexual abuse material. End-to-end encryption, however, restricts message access strictly to the two endpoints of a conversation, making it impossible for any third party to detect whether a message contains child exploitation content; consequently, it inadvertently shields criminals. NCMEC urged Meta to halt the rollout of end-to-end encryption until it develops the ability to detect, report, and remove images of child sexual exploitation within encrypted messages.
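Conceptually, end-to-end encryption means the platform’s servers only ever relay ciphertext: only the two endpoint devices hold the shared key needed to recover the plaintext, which is precisely why a third party cannot scan message content in transit. The toy sketch below illustrates this property with a deliberately simplified hash-based XOR keystream; it is not the Signal-protocol-based scheme that production messengers actually use, and the key value is a made-up placeholder.

```python
import hashlib
from itertools import count

def keystream(shared_key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from the shared key (toy construction)."""
    out = b""
    for block in count():
        out += hashlib.sha256(shared_key + block.to_bytes(4, "big")).digest()
        if len(out) >= length:
            return out[:length]

def xor_cipher(data: bytes, shared_key: bytes) -> bytes:
    """XOR data with the keystream; the operation is symmetric, so the same
    function both encrypts plaintext and decrypts ciphertext."""
    ks = keystream(shared_key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

# Only the two endpoint devices hold the shared key; the server relays ciphertext.
key = b"shared-secret-known-only-to-endpoints"  # placeholder key
ciphertext = xor_cipher(b"hello", key)   # all an intermediary ever sees
plaintext = xor_cipher(ciphertext, key)  # only a key-holding endpoint recovers this
print(plaintext)  # b'hello'
```

The policy dilemma in the text follows directly from this design: server-side scanning of the kind NCMEC describes requires access to plaintext, which E2EE by construction denies to everyone but the endpoints.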

3. Reassessing the Issue of Child Sexual Exploitation in the Digital Era

Our discussion of online child sexual exploitation should not be limited to other countries. Judging from cases brought to light by the domestic outlet The Beijing News on September 7, 2023, involving instances of “online harassment” targeting minors, our country faces similar challenges. The dissemination of child sexual abuse materials in the digital era occurs predominantly through the internet and is transnational in nature, so combating such crimes necessitates effective international collaboration and requires us to pay close attention to international developments. On November 8, 2023, 73 member states of the United Nations introduced a revised draft resolution on the promotion and protection of the rights of children, underscoring the urgency of the challenges that information technology poses to the protection of children’s rights in the digital era. China, as a country with leading digital technology and a highly developed platform economy, should actively participate in legislative and policy-making processes during the formulation of relevant international policies, contribute its governance expertise, and explore emerging issues.

Amid the rapid transformation driven by algorithmic and artificial intelligence technologies, responding to the times’ demand for the protection of children’s rights has become an urgent necessity; the current situation is too serious to be taken lightly. In addition to following the efforts of large platform companies, international organizations, and governments worldwide to protect children’s rights, we also need to recognize the brand-new challenges posed by the application of digital technologies. How to dynamically review and remove illegal activities that parasitize the online environment, how to balance the protection of personal data privacy against the protection of children’s rights, and how to delineate the boundaries of personal data scrutiny in the course of safeguarding children’s rights all warrant thoughtful consideration. Combating child sexual exploitation is paramount to ensuring that every child can enjoy an undisturbed childhood in the digital era.
