Chinese authorities have approved a new set of comprehensive regulations that expand the scope of online censorship, emphasize the war against “negative” content and make platforms more liable for content violations.

While China previously had numerous separate regulations floating about for everything from live-streaming to news media to chat groups, the new “Provisions on the Governance of the Online Information Content Ecosystem” consolidate them into a more coherent, overarching set of rules covering everything that happens on the country’s Internet. The new rules were approved in mid-December and will take effect in March.

China, one of the world’s most censorious nations, is prone to handling speech issues with a “big stick bludgeon technique that doesn’t give people enough guidance or clarity and leaves people guessing and unsure,” says Jeremy Daum, senior fellow at the Yale Law School’s Paul Tsai China Center, who notes that “the laws for what counts as illegal or now ‘negative content’ are quite vague.”

The new regulations target content producers, platforms and Internet users, describing what kind of content should be banned as illegal, restricted for being “negative,” or actively promoted.

The law begins by laying out what kind of content is in fact encouraged. The top three out of seven listed criteria are ideological. Such “uplifting” fare should “publicize Xi Jinping Thought on socialism with Chinese characteristics,” promote the Communist Party’s major policies and political thinking, as well as push “core socialist values.” It should also “increase the international influence of Chinese culture, presenting the world with the true, three-dimensional, and complete China.” Encouraged content should “respond to social concerns” but also “promote unity and stability” and highlight China’s economic progress.

Such content must be actively displayed in prominent online locations such as home pages, pop-up windows, hot topic or default search lists, and other “key areas that can easily attract attention.”

The document lays out what constitutes illegal content in sweeping terms. Content that “undermines ethnic unity” or “undermines the nation’s policy on religions” is forbidden, as is anything that “disseminates rumors that disrupt economic or social order” or generally “harms the nation’s honor and interests,” among other rules.

Daum points out that, worryingly, the law’s final version modifies an earlier draft to include extra language that explains it is based on China’s tough national security law, which defines what such “national interests” and security might mean very broadly.

The new regulations then go on to dictate that content producers must “employ measures to prevent and resist the making, reproduction or publication of negative information.” This includes the following: the “use of exaggerated titles,” gossip, “improper comments on natural disasters, major accidents, or other disasters,” anything with “sexual innuendo” or that is “readily associated with sex,” gore or horror, or things that would push minors towards behaviors that are unsafe or “violate social mores.” Negative content, it concludes broadly, is actually just anything at all that would have a “negative impact” on the Internet ecosystem.

Platforms are the ones responsible for policing all these restrictions, the rules say, and should establish mechanisms for everything from reviewing content and comments to “real-time inspections” to the “handling of online rumors.” They are to designate a manager for such activities and improve related staffing.

Daum says that for the most part, the law is largely “housekeeping.”

“This isn’t going to suddenly change the playing field, but it’s in line with the general trend of changes we’ve been seeing in recent years towards increasing restrictions and the outsourcing of censorship” — that is, making platforms rather than individual users liable for content monitoring, which tends to make them overzealous for fear of slipping up and losing market access.

“China loves hierarchies, and here, they’re sort of making one for how to fix content, so that there’s a chain of command” for censorship issues, he assessed.

Another key new development is that the regulations define content producers as any individual posting something to the Internet — expanding censorship’s scope, but also making total enforcement essentially impossible. “Imagine trying to round up every sexual innuendo on Weibo — it’s a fool’s errand,” he reasons.

For foreign companies, the takeaway is perhaps that they should be prepared, at any moment, to be told that online content is unacceptable and must be removed. Such regulations are also likely of interest to those concerned about the growing popularity of Chinese-run apps like TikTok, whose parent company Bytedance must comply with such Communist Party strictures.