New rules for the protection of minors online focus on key areas of global concern:
- increasing minors’ internet and media literacy,
- preventing their contact with inappropriate content,
- protecting their personal information,
- and preventing excessive use.
A new chapter on the online protection of minors was added to China’s Law on the Protection of Minors (LPM) in 2020, and these new Regulations build on that chapter. A first draft of these rules was actually released in 2017, even before the LPM revisions, but it was set aside until a second, much-amended draft was released in March 2022.
While one would expect these new Regulations to go into substantially more detail about the implementation of the corresponding LPM chapter, they often remain frustratingly vague, and further rulemaking will be required if they are to be meaningfully and consistently implemented. To this end, several articles in the new Regulations call for the drafting of further rules, including identifying the scope of some restricted content, setting protective software standards, assessing internet literacy, and clarifying the scope of online platforms that have special duties to minors.
Improving Minors’ Online Literacy
The main thrust of Chapter II of the Regulations is education: teaching young people to use the internet in a manner that is both productive and safe. (LPM 64) This means that schools are to include internet literacy in the curriculum, emphasizing not just the use of internet resources, but also online ethics, relevant laws, and self-protection skills. The focus is on security, civility, good habits, and self-defense. (13)
Child protection advocates pushed for self-protection skills to be included together with technical internet usage skills. Their concern was that if these areas were separated, content about mental health, interpersonal relationships online, and even avoiding sexual predators might be sidelined as ‘soft content’ that was less important than job-oriented training. Including them together means that students’ performance in all of these areas will be evaluated together, even if they are not given equal emphasis. The evaluation standards are to be developed by the Education Departments and the Cyberspace Administration. (13)
Fostering literacy also includes increasing access to the internet. Minors in China are generally prohibited from entering internet cafes, and the new regulations charge county governments with creating plans to provide internet access for minors in public spaces. (14) Venues such as schools, cultural centers, and libraries providing internet services are also charged with supervising minors’ usage, including by having professional or volunteer chaperones and software protections. (15)
Protective software for minors must also be pre-installed (or easily accessible) on smart terminals that are specifically for use by minors, and minors’ guardians are encouraged to use it on family devices. (19) The State is to support the development of such protective measures, as well as online resources for minors more generally. (18) The specific standards for protective software (and devices designated for use by minors) are to be released by the CAC and other central authorities. (19) The primary functions of such products are to identify illegal or inappropriate content (content screening) and to prevent addiction (time and permissions controls). (19)
In August 2023, draft guidelines for “minors’ modes” were released by the CAC for public comment, detailing required software that would empower parents to better control and monitor their children’s internet usage, including default age-based time use and content restrictions.
China’s regulation of online content, for both adults and children, is notorious, and it is important to remember the broader context of information controls when considering special rules for minors.
The most recent comprehensive content regulation includes encouraged ‘positive content’, prohibited illegal content, and discouraged ‘negative content’. The new regulations on the protection of minors follow the LPM in addressing two additional categories related to minors: content that is harmful to minors, which is prohibited, and content that might impact minors physically or psychologically, which must be clearly labeled. (23, LPM 50)
The first category of harmful information includes the promotion of obscenity, pornography, gambling, violence, cults, self-harm, terrorism, or extremism. (22) Essentially all of this content is already illegal under the broader regulatory scheme.
The Regulations add that the second category includes content that would lead minors to imitate unsafe behaviors or violate social mores, that would cause extreme feelings, or that would lead to bad habits. (23) Such content is difficult to identify, as it is defined by its impact on the minors rather than by the subject matter. Moreover, as I noted when reviewing an early version of these rules, “unsafe behaviors”, “violating social mores”, “extreme feelings” (“negative” feelings in the earlier draft), and “bad habits” sound like a caricature of ordinary teenage psychology.
Unfortunately, the consequences for mishandling such information can be very real. Where this content is not properly labeled by users, online service providers must police their users and require them to correct the situation or remove the content. Providers must also ensure that such content never appears in promoted areas such as “trending topics”, pop-up windows, or home pages, and nobody may send this content to minors or encourage them to interact with it. Violations will result in orders to make corrections, which if not heeded are enforced through hefty fines, confiscation of profits, and cancellation of business licenses. (55) Various departments are tasked with developing more precise standards for the covered information and the format of warnings (24), but until then, platforms may be led to over-censor in an abundance of caution.
Personal Information Protection
Even before the adoption of China’s landmark Personal Information Protection Law (PIPL) in 2021, China recognized the special risks posed by minors’ personal information, and released the Provisions on the Protection of Children’s Information Online. Many of the protections outlined in that document were ultimately extended to adults in the PIPL—illustrating how advances in protections for minors can become a laboratory for exploring protections with more general applicability.
Personal information, as used in the PIPL, generally refers to any information that can identify a user. All personal information of minors under 14 is further considered ‘sensitive personal information’ – information that if leaked or improperly used could readily cause injury to dignity, health, or property. Such information is only to be collected when truly necessary and requires independent consent for its use.
Facilitating User Control of Personal Information
The new Regulations largely follow the existing law (PIPL 45-47), emphasizing that those working with minors’ personal information must provide the minors and their guardians with easy channels to:
- Make inquiries: into the types and quantities of one’s own personal information that have been collected. No restrictions can be imposed on ‘reasonable requests’.
- Access and Correct: Access, reproduce, correct, supplement, or delete information that has been collected. If requests are denied, an explanation must be provided.
- Transfer: Have collected information transferred to third parties such as other service providers.
- Revoke Consent: To end information collection and close accounts.
Principle of Minimum Necessity
In China’s personal information protection system, the collection of personal information is generally restricted to the smallest scope necessary to achieve the indicated goals of the data handlers. Additional information collection cannot be compelled and requires independent consent. (32)
The rules for minors follow this and further expand it to specify that the scope of staff with access to children’s data should be similarly limited to the smallest possible range. Moreover, access to children’s information by service providers’ employees must be specially granted by their corporate information protection officers or managers, the access is to be logged, and technical measures should be in place to stop any improper downloading or reproduction of children’s materials. (36)
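As a purely hypothetical illustration (the class and method names below are my own, not drawn from the Regulations), article 36’s combination of officer-granted access, audit logging, and technical controls might translate into something like:

```python
from datetime import datetime, timezone

class ChildDataStore:
    """Sketch of article 36's pattern: access to children's data must be
    specially granted by the information protection officer, and every
    access attempt is logged for audit."""

    def __init__(self):
        self._granted = set()   # (officer, employee) grants on record
        self.access_log = []    # audit trail of every access attempt

    def grant_access(self, officer_id, employee_id):
        # Only the designated protection officer/manager may grant access
        self._granted.add((officer_id, employee_id))

    def read_record(self, employee_id, record_id):
        allowed = any(emp == employee_id for _, emp in self._granted)
        self.access_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "employee": employee_id,
            "record": record_id,
            "allowed": allowed,
        })
        if not allowed:
            raise PermissionError("access not granted by protection officer")
        return f"record {record_id}"  # placeholder for the actual data

store = ChildDataStore()
store.grant_access("officer-1", "emp-42")
print(store.read_record("emp-42", "r-7"))  # permitted, and logged
```

The point of the sketch is the shape of the obligation, not the mechanism: denials are logged alongside grants, so the “smallest possible range” of staff access is verifiable after the fact.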
Risk Alerts and Prevention
Those handling minors’ personal information have a duty to notify affected minors and their guardians where personal information has been (or may have been) disclosed, altered, or lost. Generally, this notice is to be individualized, but may also be by publication. (35)
Service providers also have a proactive duty to protect minors’ “private” information that is published online, even by the minors themselves. This includes giving alerts and stopping the further spread of the information. A new provision clarifies that in disposing of such information, platforms must not ignore evidence of violations against minors, and should save and report information that reveals such harms. (38)
As elsewhere, China has been very concerned about the physical and psychological impact of minors’ constant internet use. Efforts to prevent this ‘addiction’ have focused on limiting excessive spending of time and money online, with a particular focus on online games. The regulations largely import existing rules:
- Schools and families are to be alert to potential overuse and work together to guide minors towards healthier usage. (40, 41)
- Online product and service providers are to avoid addictive content. They must review whether specific content, functions, or game rules encourage excessive use, and report annually to the public on their efforts to prevent addiction. (42, 47) They must avoid activities and content models that encourage binge participation, such as online voting for rankings, fundraising calls, traffic competitions, and so forth. (45)
- Social media, livestreaming, and gaming platforms are specifically to implement minor modes that let guardians limit the times at which they are used by minors, the duration of use, and permissions such as those for app and content downloads. (43) There must also be age-specific spending limits, both as to single payments and daily total spending.
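To make the shape of these controls concrete, here is a minimal sketch of how a minor mode might combine guardian-set usage windows and duration caps with age-banded spending limits. All thresholds, age bands, and names here are illustrative assumptions of mine, not figures prescribed by the Regulations:

```python
from datetime import time

# Illustrative guardian-configured settings; the Regulations require that
# such controls exist but do not prescribe these specific values.
USAGE_WINDOW = (time(8, 0), time(21, 0))   # hours when use is allowed
DAILY_MINUTES_CAP = 60                     # maximum daily usage in minutes
SPEND_LIMITS = {                           # (per-payment, daily) caps by age band
    "under_8":  (0, 0),
    "8_to_15":  (50, 100),
    "16_to_17": (100, 200),
}

def may_use(now, minutes_used_today):
    """Usage is allowed only inside the window and under the daily cap."""
    start, end = USAGE_WINDOW
    return start <= now <= end and minutes_used_today < DAILY_MINUTES_CAP

def may_spend(age_band, amount, spent_today):
    """Both the single-payment cap and the daily total cap must hold."""
    per_payment, daily = SPEND_LIMITS[age_band]
    return amount <= per_payment and spent_today + amount <= daily

print(may_use(time(20, 30), 45))    # True: within window, under the cap
print(may_spend("8_to_15", 60, 0))  # False: exceeds the single-payment cap
```

Note that the single-payment and daily-total checks are independent, matching the Regulations’ requirement that limits apply “both as to single payments and daily total spending.”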
Online video games are a special focus in the push against internet addiction. All games must be reviewed and labeled for age appropriateness before they are allowed on the market. (47) Creators are required to consider whether specific rules encourage addiction, as well as the appropriateness of content. (47)
The LPM requires that online games not be available to minors between 10:00 PM and 8:00 AM. Subsequent authority has gone even further, and the current rules ban all online gaming by minors except on weekends and holidays, when a single hour of gameplay is allowed between 8:00 PM and 9:00 PM. The burden is on the service providers to block minors at other times.
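The current gaming rule is simple enough to state as a single predicate. The sketch below is my own illustration (assuming the provider has already verified that the user is a minor and determined whether the day is a weekend or holiday):

```python
from datetime import datetime

def minor_may_game(now: datetime, is_weekend_or_holiday: bool) -> bool:
    # Under the current rules, minors may play online games only on
    # weekends and holidays, and only during the single hour from
    # 8:00 PM to 9:00 PM.
    if not is_weekend_or_holiday:
        return False
    return now.hour == 20  # i.e., 8:00 PM <= now < 9:00 PM

print(minor_may_game(datetime(2024, 6, 8, 20, 30), True))  # True
print(minor_may_game(datetime(2024, 6, 8, 21, 30), True))  # False
```

The enforcement burden on providers means this check must run server-side against verified identities, not merely as a client-side setting.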
Cyberviolence and bullying have recently become major focuses in China’s online governance. The LPM prohibits online abuse of minors, including insults, defamation, threats, or malicious attacks in any form. (LPM 77) In its Provisions on Schools’ Protection of Minors, article 21, the Ministry of Education expanded this to include students’ transmission of false information that belittles other students or the disclosure of personal information. A broader Opinion on Cyberviolence guides the justice sector in addressing abusive conduct online.
The new regulations build on this by adding several obligations for online service and product providers. They must first have mechanisms in place for the early detection, identification, and handling of cyberbullying. (26) This includes the establishment of a trait database, and other information technology tools, to aid in the discovery and identification of cyberbullying, using a combination of automated and manual reviews. (26)
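The “trait database” combined with automated and manual review resembles a familiar two-stage moderation pipeline. The sketch below is my own illustration of that pattern, using invented names and a trivial keyword matcher as a stand-in for real classifiers; it is not an implementation prescribed by the Regulations:

```python
# Illustrative "trait database": patterns associated with bullying content.
# A real system would hold classifier features, image hashes, etc.
TRAIT_DATABASE = {"insult_phrase", "threat_phrase", "doxxing_marker"}

def extract_traits(post: str) -> set:
    # Stand-in for real feature extraction against the trait database
    return {w for w in post.lower().split() if w in TRAIT_DATABASE}

def triage(post: str) -> str:
    """Stage 1: automated screening against the trait database.
    Stage 2: anything flagged is routed to a human reviewer, mirroring
    the 'combination of automated and manual reviews' in article 26."""
    return "manual_review" if extract_traits(post) else "published"

print(triage("hello threat_phrase"))  # flagged: routed to a human moderator
print(triage("hello world"))          # no traits matched: publishes
```

The design choice the rule implies is that automation narrows the queue while humans make the final call, rather than machines removing content outright.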
Tools must also be provided to minors’ accounts to enable them to avoid bullying. These include the ability to block unknown users, to limit the scope of access to one’s own posts, and to limit what information is received. Failure to provide such options or to actively address bullying can result in fines and the cancellation of permits or licenses. (56)
Article 20(3) calls for some platforms to create compliance systems including an independent review body composed of external personnel. The language used tracks that of PIPL article 58, calling for the creation of an independent body for the review of personal information protections on large platforms.
Article 20’s requirements for platforms are limited to those that have a “huge” volume of minor users or that have a notable impact on minors as a group. The CAC is tasked with providing further detail on what is included in those categories.