
No gaming before breakfast. It’s the law.

‘Childhood’ is marked, or even defined, by a lack of access to certain “adult” information. Most modern societies have done their best, with varying degrees of success, to shield children from what they consider to be the darker truths about ourselves: violence, cursing, sexuality, cynicism, and so on. As information has become easier to access, however, this has become an increasingly difficult task, and now, in the internet age, ‘inappropriate information’ has a way of finding even those kids who aren’t actively looking for it. Regulation of children’s use of information technology is thus a complicated issue that requires difficult subjective judgments about what content is appropriate for children, and about which decisions should be made by government rather than by individual families.

China’s new draft regulations on online protections for minors, currently open for public comment, are the latest piece in its attempt to create a legal framework to protect minors online and to prevent them from growing up too fast. The draft emphasizes education and awareness of the problems, but also mobilizes some big guns in the form of blanket prohibitions and stiff penalties. Unfortunately, like the rest of us, the drafters seem unclear about where those guns should be aimed to actually do some good.

The main substantive focuses of the draft are discussed below:

Content Regulation

In terms of content regulation, the draft discusses two types of information: that which violates laws, regulations, and rules (forbidden to everyone) and that which is ‘unsuitable for minors’ (forbidden to those under 18). Seeing these categories so clearly associated, it’s hard not to think more broadly about China’s many content restrictions for adults, and feel a pang of teenage rebellion at the uber-adult who decides what content is unsuitable for most citizens.

The second category, ‘unsuitable for minors’, is defined so broadly as to be entirely unworkable. Rather than listing specific content viewed as harmful to minors’ development, the draft lists traits that are undesirable in minors, and makes rules for anything that ‘might’ encourage them. For example, rather than covering portrayals of persons drinking alcohol, it includes all content that might induce minors to drink.

These overbroad categories will lead to uneven enforcement as both content providers and government authorities struggle to understand what is covered; after all, is there anything that doesn’t lead to self-loathing in teenagers? Perhaps limiting included information to that ‘intended’ or ‘known’ to cause such results would be a first step.

Content is unsuitable for minors under the draft where it might:

  1. lead minors to commit violence, bullying, suicide, self-injury, sexual contact, vagrancy, begging or other negative conduct; [negative behavior]
  2. lead minors to use tobacco, alcohol, or other products not suitable for minors; [substance abuse]
  3. lead minors to have negative emotions such as hating learning, cynicism, self-loathing, or depression; [anti-social attitudes]
  4. have a negative impact on minors’ physical and psychological health. [harm]
[Bracketed editorials are mine; see Article 8]


Designation as ‘unsuitable for minors’ has real consequences. Content that falls within these categories must display a warning before it is shown, and where it does not, fines as high as 300,000 RMB (~$43,314 U.S.) may be imposed. (Article 28).

The draft also emphasizes the development and implementation of software protections for minors, which focus on helping minors avoid unsuitable content. (Article 11). Should the draft take effect as written, all manufacturers and importers of cell phones, tablets, computers, and the like would be required to pre-install such software or facilitate its installation by end users (Article 12), and public internet venues serving minors, such as schools and libraries, would also be required to ensure that such filters are in place. (Article 11). This is reminiscent of the ‘Green Dam Youth Escort’ pre-installation proposal of several years back, which was eventually abandoned after public backlash and allegations of plagiarism.

Content filtering software is widely available and has been used in public spaces in many nations, but given these vague categories of information to block, standard keyword filters and site blacklists will be insufficient to distinguish permissible from unsuitable content.

Internet Addiction

A good portion of the draft is spent discussing the prevention of internet addiction, with a special focus on gaming. Here the draft’s guidance is a combination of the unworkably broad and the strangely precise.

Online gaming services are required to:

  • Implement real name registration and age verification systems to distinguish young gamers (clearer)
  • Change the rules of games to make them less addictive (unclear)
  • Keep minors from unsuitable content (unclear, as discussed above)
  • Limit the amount of time young gamers can play in one sitting, and the total amount of time they can play daily (clearer)
  • Prohibit minors from playing games from midnight to 8:00 A.M. daily (crystal clear)

A clear positive in the sections on addiction is that those treating internet addiction are expressly prohibited from using insulting or abusive methods that harm the subjects. (Article 20). This is, no doubt, a response to the much-publicized use of shock therapy, boot-camp-style facilities, and other dangerous measures in treating internet addiction.

Protection of Minors’ Personal Information

Half of the draft’s chapter on safeguarding minors’ rights and interests online is dedicated to the protection of minors’ personal information. This includes requirements that those collecting such information do so only to the extent necessary, seek the consent of the minors and their guardians, honor their requests to delete or block such information, and not display it in search results. (Articles 16-18). There is no discussion of what happens where the guardians and minors disagree about whether information should be collected or deleted.

The remaining portions of that chapter concern education for minors and their guardians on the safe use of online networks, and increasing free internet access for minors in regulated safe spaces.


Cyberbullying

Not much of the text is actually devoted to cyberbullying, but there is a prohibition on attacking, threatening, berating, or injuring minors. Generally, guardians and schools are to help victims of bullying and report it to the police ‘when necessary’. Administrative and even criminal punishments are already available under existing law where these situations are serious.




Jeremy Daum is a Senior Fellow of the Yale Law School Paul Tsai China Center, based in Beijing, with over a decade of experience working in China on collaborative legal reform projects. His principal research focus is criminal procedure law, with a particular emphasis on protections of vulnerable populations, such as juveniles and the mentally ill, in the criminal justice system; he is also an authority on China’s ‘Social Credit System’. Jeremy has spoken about these issues at universities throughout China and in the U.S., and has co-authored a book on U.S. capital punishment jurisprudence for Chinese readers. He is also the founder and contributing editor of the collaborative translation and commentary site China Law Translate, dedicated to improving mutual understanding between legal professionals in China and abroad.
He translates, writes, edits, does web-design, graphic design, billing, tech support, and social media outreach for China Law Translate.
