
Notice on Carrying Out the "Clear and Bright: Summer 2023 Governance of the Online Environment for Minors" Special Action

[Source] http://politics.people.com.cn/n1/2023/0627/c1001-40022337.html
[Promulgating Entity] Secretariat of the Cyberspace Administration of China
[Date of Promulgation] June 21, 2023

To the Party Committee internet information offices of all provinces, autonomous regions, and directly governed municipalities, and the Party Committee internet information office of the Xinjiang Production and Construction Corps:

In order to further strengthen the online protection of minors and to create a healthy and safe online environment, the Cyberspace Administration of China is to carry out a two-month "Clear and Bright: Summer 2023 Governance of the Online Environment for Minors" special action.

I. Work Goals

By carrying out this special action, focus on correcting prominent issues involving minors online, fully reduce the space for harmful information that is hidden or in disguised form, resolutely curb illegal conduct that infringes on the rights and interests of minors, further raise information content security standards for products and services such as study apps and smart equipment for children, effectively resolve the problem of internet addiction, and create an internet environment that is conducive to the healthy and safe growth of minors.

II. Key Corrections

Deeply and comprehensively correct, and strictly address and punish, online platforms and smart equipment for children that have a large number of minor users or that have a notable impact on minors, and continue to purify the online ecology. Key focus is to be given to the following seven issues:

1. The problem of harmful content that is hidden or in disguised form. First is the transmission of content that is vulgar, or about gambling or superstition, by using homophones, variant characters, and emojis; second is using video editing, re-creation of film and television, or modification of cartoons to concentrate scenes such as those showing gore and violence involving children; third is using external links, floating windows, QR codes, account information, and so forth to draw traffic to pornography; fourth is children's smart equipment coming preinstalled with third-party apps, or returning sexual or violent content in voice or text search results.

2. The problem of online bullying. First is using coarse online language to insult and berate minors; second is conducting "human flesh searches" [coordinated doxxing] against minors and maliciously disseminating bullying videos that disclose minors' private matters; third is smearing minors by means such as maliciously photoshopped images and fabricated rumors.

3. The problem of remote indecency. First is enticing or coercing minors to chat naked online or make obscene gestures in the name of "personalized friend matching", "recruiting child stars", or "payment for photos"; second is sending sexual photos or obscene language to minors through channels such as chat groups or private messages; third is publishing vulgar or sexual comments on accounts or article pages involving minors, leading them to make bad friends or seducing them.

4. The problem of online fraud. First is carrying out online fraud through fake sales of performance tickets to minors in the guise of "insiders" or "regrettably letting them go"; second is enticing minors to join group chats in the guise of free gaming equipment, cash-back for voting, mystery box drawings, fan raffles, and so forth, and tricking them into transferring funds; third is enticing minors to provide social media accounts, payment passwords, and so forth, for use in online fraud activities.

5. The problem of negative content. First is publishing videos of the sort that dramatize campus stories, tarnish schools' images in extreme ways, raise a commotion about student-teacher conflicts, or glamorize butting heads with teachers; second is conduct such as painting parents as "oppressing" or "squeezing" children, or maliciously concocting fake family conflicts; third is enticement to smoke, drink, get tattoos, skip classes, or engage in other negative conduct; fourth is staging scenes to play up abnormal aesthetics such as "thug style" or "domestic violence makeup", transmitting negative content such as bullying the weak and using violence to fight violence.

6. The problem of internet addiction. First is enticing minors to watch livestreams and short videos for long periods; second is using algorithms to concentrate the content pushed to minors, leading to addiction and forming information bubbles; third is enticing minors to spend money through entertainment functions such as chat and friend-making, and virtual decorations; fourth is providing minors with services for renting or selling online game accounts in violation of the rules, or instructing minors in how to bypass anti-addiction systems or work around youth modes.

7. The problem of risks from new technologies and applications. First is using technologies such as "AI face-swapping", "AI drawing", or "AI one-click undressing" to produce vulgar sexual pictures and videos involving minors; second is using so-called "burn after reading" secret chat software to trick minors into providing personal information or to entice them into carrying out illegal conduct; third is using generative artificial intelligence technology to create and publish harmful information involving minors.

III. Work Requirements

Each local internet information department should fully recognize the important significance of carrying out the summer special action on governing the online environment for minors, carefully organize and make arrangements, and firmly promote implementation.

(1) Highlight key points of work. Website platforms, product functions, and sections where minors are more active should be the key focus of inspections, concentrating time and force on correcting prominent problems that endanger minors' physical and mental health, and promptly discovering and addressing all types of problems that recur or that are hidden or in disguised form. Oversight and guidance on key challenging issues should continue, staying on them until they are resolved, to ensure correction efforts achieve solid success.

(2) Solidify platform responsibility. Online platforms' primary responsibility for the online protection of minors should be solidified, urging platforms in the area to establish special work teams, follow the requirements of the special action, fully screen their own products and services, draft detailed correction plans, strengthen key stages of content management, strictly regulate the positioning of function sections, innovate and optimize protective measures, and truly increase the capacity for and level of online protections for minors.

(3) Strict dispositions and punishments. Strictly fight against conduct involving minors that violates laws and regulations, employ measures against violative accounts such as banning them from posting, closing them, or blacklisting them, and employ measures against online platforms with prominent problems such as taking them off the market or closing them down. Example cases should be promptly circulated and exposed, to create a forceful deterrent.
