China’s powerful cyberspace regulator has taken the first step in a pioneering—and uncertain—government effort to rein in the automated systems that shape the internet.

Earlier this month, the Cyberspace Administration of China published summaries of 30 core algorithms belonging to two dozen of the country’s most influential internet companies, including TikTok owner ByteDance Ltd., e-commerce behemoth Alibaba Group Holding Ltd. and Tencent Holdings Ltd., owner of China’s ubiquitous WeChat super app.

The milestone marks the first systematic effort by a regulator to compel internet companies to reveal information about the technologies powering their platforms, which have shown the capacity to radically alter everything from pop culture to politics. It also puts Beijing on a path that some technology experts say few governments, if any, are equipped to handle.

The public versions of the filings explain in plain language what types of data a given algorithm uses and what it does with the data. In many instances, they provide less detail than what Facebook voluntarily discloses to users about how it ranks content in its news feed.

The full filings, which aren’t public, contain more-extensive descriptions of the data and algorithms, some of which is considered confidential business information, people familiar with the submissions said. They also contain a self-assessment of potential security risks, according to public documentation of what the regulator asked companies to provide.

Companies submitted the information in response to a new law that came into effect in March that tasks regulators with cleaning up the negative effects of algorithms such as the amplification of harmful information, infringement of user privacy and abuse of gig workers. The law also requires algorithms to be used to promote “positive energy,” a Xi Jinping-era phrase for content that uplifts public opinion and favorably treats the Communist Party.

Beijing isn’t alone in seeking to restrain the power of algorithms underpinning the internet. Regulators in the U.S. and European Union are grappling with similar issues, such as how to protect teen mental health and stamp out viral misinformation.

The Chinese law, however, represents the most assertive attempt to police algorithms directly. Ultimately, it can be applied to any service in the country that uses algorithmic technology.

“They are doing things that no one else has tried yet, and the rest of the world can learn from what works and doesn’t work,” said Graham Webster, who runs the DigiChina Project at Stanford University, which tracks China’s digital-policy developments.

One important question the effort raises, algorithm experts say, is whether direct government regulation of algorithms is practically possible.

The majority of today’s internet platform algorithms are based on a technology called machine learning, which automates decisions such as ad-targeting by learning to predict user behaviors from vast repositories of data. Unlike traditional algorithms that contain explicit rules coded by engineers, most machine-learning systems are black boxes, making it hard to decipher their logic or anticipate the consequences of their use.
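The contrast the article draws, between a traditional algorithm with rules an engineer writes down and a machine-learning system whose behavior emerges from data, can be sketched in a few lines. Everything below is a toy illustration: the features, data and functions are hypothetical, not drawn from any real platform.

```python
import math

def rule_based_rank(post):
    """Traditional algorithm: the logic is explicit and auditable."""
    score = 0.0
    if post["topic"] == "news":
        score += 1.0
    score += 0.1 * post["likes"]
    return score

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict(weights, features):
    """Probability of a click under the learned model."""
    return sigmoid(sum(w * f for w, f in zip(weights, features)))

def train_click_model(examples, epochs=500, lr=0.1):
    """Learned algorithm: weights emerge from data via gradient descent.

    Each example is (feature_vector, clicked). The resulting numbers,
    not any hand-written rule, determine what gets recommended.
    """
    n = len(examples[0][0])
    weights = [0.0] * n
    for _ in range(epochs):
        for x, y in examples:
            p = predict(weights, x)
            for i in range(n):
                weights[i] += lr * (y - p) * x[i]
    return weights

# Toy training data: [watch_time, is_news, bias] -> clicked?
data = [
    ([0.9, 1.0, 1.0], 1),
    ([0.8, 0.0, 1.0], 1),
    ([0.1, 1.0, 1.0], 0),
    ([0.2, 0.0, 1.0], 0),
]
weights = train_click_model(data)
# The learned weights are just numbers; unlike rule_based_rank, nothing
# in them states *why* a given post scores highly.
```

The opacity the article describes follows from the second function: a regulator reading `rule_based_rank` can see exactly what it rewards, while the trained model's logic lives in numeric weights whose meaning is only recoverable, if at all, by probing the system with data.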

Beijing’s interest in regulating algorithms started in 2020, after TikTok sought an American buyer to avoid being banned in the U.S., according to people familiar with the government’s thinking. When several bidders for the short-video platform lost interest after Chinese regulators announced new export controls on information-recommendation technology, it tipped off Beijing to the importance of algorithms, the people said.

The Cyberspace Administration of China moved swiftly to draft a new law on algorithmic recommendation systems, seeking in particular to understand how the country’s tech companies shape online discourse and how to curb that influence, people familiar with the matter said.

By January of 2022, the law was ready, and it went into force two months later—an impressive pace for a government that sometimes sits on draft legislation for years, Mr. Webster said.

The cyberspace regulator didn’t respond to a request for comment.

China’s law shocked people in U.S. tech policy circles for its scope and aggressiveness, according to Suresh Venkatasubramanian, a computer science professor at Brown University who served as an assistant director of the White House Office of Science and Technology Policy until this month.

Some in the U.S. government were intrigued when Facebook whistleblower Frances Haugen argued in Congress last fall for placing limits on the social-media company’s algorithms, according to Mr. Venkatasubramanian. Regulators worried, however, that it would set a precedent for state control over the flow of information.

“Once you go down that path it’s very hard to go back,” he said.

EU regulators facing the same questions have been more forceful but have still avoided direct government reviews of algorithms.

In July, the European Parliament adopted legislation that requires the largest platforms, such as Google and Facebook, to conduct regular assessments of their systemic risks, such as whether they are spreading illegal content. The companies can choose how they address those risks, including adjusting their algorithms, but must submit to independent audits to prove their solutions actually work.

Implementation and enforcement details in the EU law are vague, policy experts say. “It will take years and years of struggles and maybe even lawsuits” to interpret the law, said Matthias Spielkamp, executive director of AlgorithmWatch, a Berlin-based nonprofit research and advocacy organization.

Beijing’s approach remains vague as well. In theory, the Chinese law could give the government full control over the key mechanisms that orchestrate online spaces and, increasingly, offline life as well. Yet Beijing could very well trip over its own ambitions, tech experts say.

Social-media recommendation engines represent some of the most complicated algorithmic systems, with apps like Facebook and TikTok using hundreds or even thousands of algorithms to determine who sees what information.

Having detailed documentation, or even the code, for these systems isn’t enough to understand how they will affect something as broad as online discourse, according to Cathy O’Neil, an algorithmic auditor who works with U.S. government agencies to scrutinize company algorithms. “What’s actually important is the data that is going through the algorithm,” she said.

Even with full access to that data, which changes with each user post and interaction, a tech company’s own engineers still struggle to precisely tune the behaviors of its systems, according to Ms. O’Neil. Targeted changes like promoting more propaganda are feasible, she said, “but it is actually impossible to control what a recommendation engine does overall.”
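The asymmetry Ms. O’Neil describes, where nudging one category of content is easy but steering the whole system is not, can be sketched with a toy ranking function. The names, topics and boost factor here are illustrative assumptions, not taken from any real platform.

```python
def rank_feed(posts, boosted_topic=None, boost=1.5):
    """Order posts by score, optionally inflating one topic's scores.

    A targeted intervention like this is a one-line change; what it does
    to overall discourse depends on the ever-shifting data flowing
    through the system, which is the part no one fully controls.
    """
    def adjusted(post):
        score = post["score"]
        if boosted_topic is not None and post["topic"] == boosted_topic:
            score *= boost
        return score
    return sorted(posts, key=adjusted, reverse=True)

posts = [
    {"id": 1, "topic": "sports", "score": 0.9},
    {"id": 2, "topic": "politics", "score": 0.7},
]
baseline = rank_feed(posts)                            # post 1 first
boosted = rank_feed(posts, boosted_topic="politics")   # boost flips the order
```

The targeted boost is trivially auditable in isolation; the hard regulatory problem is that millions of such scores interact with live user behavior, so the aggregate effect can’t be read off the code.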

Tech analysts and industry insiders also question whether the Cyberspace Administration, which began as a propaganda division, has the technical expertise to enforce its own rules.


Shortly after the Chinese law came into force, government-relations managers and algorithm engineers at ByteDance met with Cyberspace Administration officials to explain the documents they submitted, people familiar with the matter said. During one of those meetings, officials at the agency displayed little understanding of the technical details and company representatives had to rely on a mix of metaphors and simplified language to explain how the recommendation algorithm worked, one of the people said.

Companies haven’t been required to submit code or user data, the people said.

Chinese government guidelines issued last year called for multiple agencies to expand staff to supervise algorithms.

“They’re trying to build the tools, hire the people and get the technical expertise to tackle this kind of stuff,” said Kendra Schaefer, head of tech policy research at Beijing-based strategic advisory consulting firm Trivium China. “So enforcement of this is going to ramp up slowly over the next five to 10 years.”