China’s latest draft regulations on internet content seek to make cyberspace a nicer place. In promoting ‘governance of the online ecosystem’, they set out not only to enforce existing restrictions on illegal content, but also to discourage broader ‘negative’ content and even to encourage more upbeat content.
Table 1 summarizes the non-exclusive lists that define each information type: Positive (Article 5), Negative (Article 7), and Illegal (Article 6).
All sectors of society have a role to play in online governance, but the regulations primarily address three broad groups:
- Content Producers: Organizations or individuals producing online content.
- Content Platforms: Online services for the reproduction, publication, or transmission of content.
- Content Users: Organizations and individuals that use online information services.
The relationship of each group to the various types of content is shown in Table 2:
| | Positive | Negative | Illegal |
|---|---|---|---|
| Producers | Encouraged to produce | Must not produce | Prohibited from producing |
| Platforms | | Must not reproduce, publish, or transmit | Prohibited from reproduction, publication, or transmission |
| Users | | Consciously resist negative information | Must not reproduce illegal information |
For Producers and Platforms, the restrictions on negative information (‘must not’, 不得) and on illegal information (‘prohibited’, 禁止) are insufficiently distinct. The two terms have been recognized as largely synonymous in Chinese law [see item 15 of the Norms on Common Legislative Technical Language 《立法技术规范(试行)》], and given that only ‘Illegal’ information is actually forbidden by laws and regulations, limits on the broader category of legal but ‘Negative’ information should not be phrased so sternly.
One way to tell whether this is merely sloppy drafting or an intentional expansion of content prohibitions is to compare the consequences of violating each type of restriction.
Table 3: Consequences of violating the content prohibitions, discussed below.

| | Negative (Article 36) | Illegal (Article 35) |
|---|---|---|
| Producers | Platforms are to address the producers in accordance with laws and agreements, by warning them to change their behavior, limiting their use of the platform, and suspending updates; platforms are also to store records for the authorities, who will make dispositions as allowed by law. | Platforms are to address the producers in accordance with laws and agreements, by warning them, limiting their use of the platform, suspending updates, closing offending accounts, and eliminating the illegal conduct; platforms are also to store records for the authorities, who will make dispositions as allowed by law. |
| Platforms | Where reproducing, publishing, or transmitting negative content: given a talking-to or ordered to make corrections. Where they refuse to cooperate or the situation is serious, they may be given warnings or fines, have operations suspended, or even have permits revoked. | Where reproducing, publishing, or transmitting illegal content, or failing to delete and stop illegal content they find on the platform: given a warning and have illegal profits confiscated. Where they refuse to cooperate or the situation is serious, they may be fined between 100,000 and 500,000 RMB, have operations suspended for corrections, have websites closed, or have permits canceled. Directly responsible management and other personnel may be fined between 10,000 and 100,000 RMB. |
PRODUCERS
Where producers create negative or illegal content online, it is the platforms that are charged with addressing it under the new Provisions. In this way, some enforcement is effectively outsourced to the platforms’ managers, and because users who create or operate groups or forums on a platform are also responsible for their content, they too are recruited into enforcing censorship (Article 21).
For both negative and illegal content, Platforms are to give warnings to the Producers to change their behavior, limit their use of the Platform, and suspend updates. Where the content is illegal, the Platforms are further instructed to close offending accounts and directly eliminate the illegal conduct. In either situation, the Platforms are also to record the incident and report it to the authorities.
The consequences of producing illegal content are more severe, as authorized by other laws and regulations, but the consequences even for negative information are still meaningful, particularly the reporting to the authorities. Where content is truly prohibited by other laws, reporting makes sense, as there may be official enforcement action as well. In the case of merely ‘negative’ information, however, there is a real question of whether platforms (or the authorities) can truly process all such information. Imagine, for example, trying to report all ‘sexual innuendo’ online.
PLATFORMS
Platforms that reproduce, publish, or transmit negative information can be given a talking-to or ordered to make corrections by the relevant authorities. Where they refuse to cooperate, they can be fined, have business suspended, or even lose their permits.
Where the content is illegal, the punishments are more serious, and can also be imposed for failure to delete and prevent illegal content found on the platform. Fines can reach 500,000 RMB, related illegal profits are to be confiscated, and websites may even be closed down. Moreover, directly responsible managers and other personnel can be personally fined up to 100,000 RMB for violations; individual liability for an organization’s legal violations has become increasingly common in Chinese regulation, but it seems particularly jarring in the context of outsourced enforcement of speech restrictions. Regardless of whether it is the organization or individuals held accountable, however, the result is likely to be overzealous censorship, as there is no real penalty for over-enforcement while a failure to enforce can have serious consequences.
Platforms also have a number of additional management duties, listed in Table 4 below with their corresponding punishments.
| Duty | Punishment for Failure |
|---|---|
| | Given a talking-to or ordered to make corrections within a set period of time. Where they refuse to cooperate or the circumstances are serious, they may face further punishment. |
| Review content and placement of advertisements (16) | Punishments in accordance with other laws and regulations |
| Encouraged to create child-friendly models; prevent minors from encountering illegal information | No consequences mentioned |
USERS
Although several specific obligations are mentioned for users, the Provisions refer entirely to ‘other laws and regulations’ for their enforcement. This suggests that none of the requirements are truly new, and that they restate requirements that exist elsewhere. It is thus interesting to see what the drafters felt needed emphasis. Remember also that online platforms are called upon to create internal ‘credit’ management systems and penalties for users who violate these provisions.
The specific additional requirements for users are:
- Founders and managers of groups, forums, and boards must manage the publication and transmission of content (21)
- Must not commit online torts or cyber-violence (doxing, defamation, etc.) that damage others’ rights in reputation, person, or property (22)
- Must not post or delete information for the purpose of extortion (not just personal information for blackmail, but possibly also negative reviews, on-site ratings, etc.) (23)
- Must not use new technology, such as deep learning and virtual reality, for illegal activity (24)
- Must not game traffic through fake accounts, selling accounts, hijacking traffic, etc. (25)
- Must not use Party and State symbols and activities to carry out commercial activity (26)
The prior draft articulated the specific punishments for violations of these rules, but this document now only refers to other laws and regulations for enforcement.