
Governing the E-cosystem

China’s latest draft regulations on internet content seek to make cyberspace a nicer place. In promoting ‘governance of the online ecosystem’, they set out not only to enforce existing restrictions on illegal content, but also to discourage broader ‘negative content’ and even to encourage more upbeat content.

Table 1: The non-exclusive lists defining each information type

Encouraged Information (Article 5):

  1. Spreading and explaining Party doctrine
  2. Spreading Party action
  3. Spreading economic and social achievement
  4. Spreading the Core Socialist Values
  5. Guidance to the public on social concerns
  6. Increasing international influence
  7. Other positive and wholesome content

Negative Information (Article 7):

  1. Sexual innuendo, suggestion, or enticement
  2. Gore and horror
  3. Promoting decadence and material excess
  4. Excessive celebrity intrigue
  5. Coarse or vulgar language and behavior
  6. Making light of tragedies
  7. Incitement of discrimination
  8. Sensationalized headlines
  9. Content harmful to minors
  10. Other content harmful to public morality or the online ecosystem

Illegal Information (Article 6):

  1. Content contrary to the basic principles set forth in the Constitution;
  2. Content endangering national security, divulging State secrets, subverting the national regime, and destroying national unity;
  3. Content harming the nation’s honor and interests;
  4. Content demeaning or denying the deeds and spirit of heroes and martyrs;
  5. Content promoting terrorism or extremism, inciting ethnic hatred or ethnic discrimination, or destroying ethnic unity;
  6. Content undermining the nation’s policy on religions, promoting cults and superstitions;
  7. Dissemination of false information, disrupting economic or social order;
  8. Obscenity, eroticism, gambling, violence, murder, terror, or instigating crime;
  9. Content insulting or defaming others, infringing other persons’ honor, privacy, or other lawful rights and interests;
  10. Other content prohibited by laws or administrative regulations.

All sectors of society have a role to play in online governance, but the regulations primarily address three broad groups:

  • Content Producers: Organizations or individuals producing online content.
  • Content Platforms: Online services for the reproduction, publication, or transmission of content.
  • Content Users: Organizations and individuals that use online information services.

The relationship of each group with the various types of content is shown in Table 2:

Table 2:

  • Producers: encouraged to produce encouraged content; must not produce negative content; prohibited from producing illegal content.
  • Platforms: must not reproduce, publish, or transmit negative content; prohibited from the reproduction, publication, or transmission of illegal content.
  • Users: should consciously resist negative information; must not reproduce illegal information.

For Producers and Platforms, the difference in the language used for the restrictions on negative and illegal information — “must not” (不得) and “prohibited” (禁止) respectively — is insufficiently distinct. These terms have been recognized as largely synonymous in Chinese law [See: the Norms on Common Legislative Technical Language, item 15, 《立法技术规范(试行)》], and given that only ‘Illegal’ information is actually forbidden by laws and regulations, limits on the broader category of legal but ‘Negative’ information shouldn’t be so stern.

One way to understand whether this is merely sloppy drafting or an intentional expansion of content prohibitions is to compare the consequences of violating each type of restriction.

Table 3 shows the consequences of violations of content restrictions, discussed below.

Producers

Negative (Article 36): Platforms are to address the producers in accordance with laws and agreements by:

  • giving warnings to make corrections
  • limiting functions
  • suspending updates
  • etc.

Platforms are also to store records for the authorities, who will make dispositions as allowed by law.

Illegal (Article 35): Platforms are to address the producers in accordance with laws and agreements by:

  • giving warnings to make corrections
  • limiting functions
  • suspending updates
  • closing accounts
  • eliminating the illegal content
  • etc.

Platforms are also to store records for the authorities, who will make dispositions as allowed by law.

Platforms

Negative (Article 36): Where reproducing, publishing, or transmitting negative content, platforms are given a talking-to or ordered to make corrections. If they refuse to cooperate or the situation is serious, they may be given warnings or fines, have operations suspended, or even have permits revoked.

Illegal (Article 35): Where reproducing, publishing, or transmitting illegal content, or failing to delete and stop illegal content they find on the platform, platforms are given a warning and have illegal profits confiscated. If they refuse to cooperate or the situation is serious, they may be fined between 100,000 and 500,000 RMB, have operations suspended for corrections, have websites closed, or have permits canceled. Directly responsible management and other personnel may be fined between 10,000 and 100,000 RMB.
Where producers create negative or illegal content online, it is the platforms that are charged with addressing it under the new Provisions. In this way, some enforcement is effectively outsourced to the platforms’ managers, and because users who create or operate groups or forums on a platform are also responsible for the content, they too are recruited into enforcing censorship (Article 21).

For both negative and illegal content, Platforms are to give warnings to the Producers to change their behavior, limit their use of the Platform, and suspend updates. Where the content is illegal, the Platforms are further instructed to close offending accounts and directly eliminate the illegal content. In either situation, the Platforms are also to record the incident and report it to the authorities.

The consequences of producing illegal content are more severe, as authorized by other laws and regulations, but the consequences for even negative information are still meaningful, particularly reporting to the authorities. Where content is truly prohibited by other laws, reporting makes sense, as there may be official enforcement action as well. In the case of merely ‘negative’ information, however, there is a real question about whether platforms (or the authorities) can realistically process all such information. Imagine, for example, trying to report all ‘sexual innuendo’ online.


Platforms that reproduce, publish, or transmit negative information can be given a talking-to or ordered to make corrections by the relevant authorities. Where they refuse to cooperate, they can be fined, have business suspended, or even lose their permits.

Where the content is illegal, the punishments are more serious, and can also be given for failure to delete and prevent illegal content found on the platform. Fines can reach 500,000 RMB, related illegal profits are to be confiscated, and websites may even be closed down. Moreover, directly responsible managers and other personnel can be personally fined up to 100,000 RMB for violations; individual liability for an organization’s violations of laws has become increasingly common in Chinese regulation, but seems particularly jarring in the context of outsourced enforcement of speech restrictions. Regardless of whether it is the organization or individuals held accountable, however, the result is likely to be overzealous censorship, as there is no real penalty for over-enforcement while a failure can have serious consequences.

Platforms also have a number of additional management duties, listed in Table 4 below with their corresponding punishments.

Duties:

  • Establish systems for:
    • review of publications, comments, etc. (Article 9)
    • real-time content inspections and emergency response (Article 9)
    • addressing rumors or leads on illegal business activity (Article 9)
  • Cooperate with police (Article 10, paragraph 2)
  • Manage all content, whether served by automated or manual processes (Article 13)
  • Incorporate mainstream values in personalized content delivery, and allow manual intervention and user choices in personalized content delivery (Article 15)
  • Have service agreements that clearly state user rights and obligations
  • Maintain user ‘credit’ archives and provide services commensurate with credit levels (Article 17)
  • Have conspicuous and convenient complaint and report channels, and publicly disclose complaint outcomes (Article 18)
  • Submit annual reports on governance activity (Article 19)

Punishment for failure: platforms are given a talking-to or ordered to make corrections within a set period of time. Where they refuse to cooperate or the circumstances are serious, authorities may:

  • give warnings or fines,
  • order that operations be suspended for rectification, or
  • revoke permits;

and where a crime is constituted, criminal responsibility is pursued in accordance with law.

Two duties carry different consequences:

  • Review the content and placement of advertisements (Article 16): punished in accordance with other laws and regulations.
  • Platforms are encouraged to create child-friendly models and must prevent minors from encountering illegal information: no consequences are mentioned.


Although several specific obligations are mentioned for users, the Provisions refer entirely to ‘other laws and regulations’ for their enforcement. This suggests that none of the requirements are truly new, and that they restate requirements that exist elsewhere. It is thus interesting to see what the drafters felt needed emphasis. Remember also that online platforms are called upon to create internal ‘credit’ management and penalties against users who violate these provisions.

The specific additional requirements for users are:

  • Founders and managers of groups, forums, and boards must manage the publication and transmission of content (Article 21)
  • Must not commit online torts or cyber-violence (doxing, defamation, etc.) damaging others’ rights in their reputation, person, or property (Article 22)
  • Must not post or delete information for purposes of extortion (not just personal information for blackmail, but potentially also negative reviews in on-site ratings, etc.) (Article 23)
  • Must not use new technologies such as deep learning and virtual reality for illegal activity (Article 24)
  • Must not game traffic through fake accounts, selling accounts, hijacking traffic, etc. (Article 25)
  • Must not use Party and State symbols and activities to carry out commercial activity (Article 26)

Jeremy Daum is a Senior Fellow of the Yale Law School Paul Tsai China Center, based in Beijing, with over a decade of experience working in China on collaborative legal reform projects. His principal research focus is criminal procedure law, with a particular emphasis on protections for vulnerable populations such as juveniles and the mentally ill in the criminal justice system, and he is also an authority on China’s ‘Social Credit System’. Jeremy has spoken about these issues at universities throughout China and in the U.S., and has co-authored a book on U.S. capital punishment jurisprudence for Chinese readers. He is also the founder and contributing editor of China Law Translate, a collaborative translation and commentary site dedicated to improving mutual understanding between legal professionals in China and abroad.
He translates, writes, edits, does web-design, graphic design, billing, tech support, and social media outreach for China Law Translate.
