While America’s tech giants are pouring billions into proprietary artificial-intelligence (AI) systems and racing to unlock one another’s secrets, China is waging a different fight.
It is what Andrew Ng, the Stanford-based AI expert, recently described as a “Darwinian life-or-death struggle” among developers of China’s relatively open large language models (LLMs).
Their fierce competition should serve as a wake-up call for the West.
DeepSeek’s shockwave
In January, DeepSeek—a little-known Chinese startup—shook global markets by releasing a powerful AI model for free, built on a tight budget.
Since then, Chinese models from Alibaba and other firms have quietly gained ground abroad.
These days, when entrepreneurs pitch at Andreessen Horowitz (a16z), a major Silicon Valley venture-capital firm, there’s a high chance their startups are running on Chinese models.
“I’d say there’s an 80% chance they’re using a Chinese open-source model,” notes Martin Casado, a partner at a16z.
What “open” really means in China
Technically, China’s strength lies in open-weight models.
Unlike traditional open-source software—where the full source code is available—most open LLMs publish only the numerical “weights” learned during training, not the raw data or code.
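To make the distinction concrete, the sketch below shows what an open-weight release typically lets anyone do: download the published weights and tokenizer and run the model on their own hardware, without ever seeing the training data or training code. The details are illustrative assumptions rather than anything reported here; the model name (Qwen/Qwen2.5-7B-Instruct, one of Alibaba's open-weight releases) and the Hugging Face transformers library are simply common examples of how such models are fetched and run.

```python
# Minimal sketch, under assumed tooling (Hugging Face transformers), of using
# an open-weight model: only the trained weights and tokenizer are downloaded,
# not the training data or the code that produced them.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-7B-Instruct"  # illustrative open-weight release from Alibaba

tokenizer = AutoTokenizer.from_pretrained(model_id)   # fetches the published tokenizer
model = AutoModelForCausalLM.from_pretrained(model_id)  # fetches the published weights

prompt = "Explain the difference between open-weight and open-source models."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```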
Still, whatever name you give them, Chinese open models launched this year have outperformed similar American offerings, including those from Meta.
And their abilities are edging closer to the best proprietary systems.
OpenAI feels the heat
OpenAI, the maker of ChatGPT, embodies the growing pressure.
Although it championed openness in the mid-2010s, by 2020 it had gone fully proprietary to make money and control risks.
Yet with customers increasingly adopting open-weight alternatives—many from China—OpenAI has moved to rejoin the trend.
This month it launched gpt-oss, its first open-weight model since 2019.
Small gestures, big contrasts
The lowercase name tells a story: gpt-oss is modest in size.
It was released in the very week OpenAI rolled out its hyped—but underwhelming—GPT-5.
That timing made OpenAI’s renewed “openness” look hesitant.
Other American firms appear equally cautious.
Ali Farhadi of the Allen Institute for AI says that while Chinese companies openly release their best models, American firms keep their cutting-edge work proprietary.
“As painful as it is to admit, I think we’re behind on open weights now,” he observes.
Meta’s shifting stance
Even Meta reinforces this perception.
Its decision to make Llama openly available won it praise in the AI community.
But CEO Mark Zuckerberg is now prioritising the pursuit of “superintelligence” and has warned that Meta will be far more selective about what it makes public in the future.
The business question
From a revenue perspective, the gap looks clear.
American proprietary models generate vastly more money than Chinese open-weight ones.
Valuations tell the same story: OpenAI’s worth is estimated at up to $500bn, while Alibaba’s entire market cap is just $285bn.
Proprietary systems are easier to monetise, and profits can be reinvested in further breakthroughs.
Why open still matters
But open models are not just for second-tier players.
Percy Liang, co-founder of Together AI, notes that open-weight systems can be adopted in ways that proprietary ones cannot match.
They are easier to tailor for governments, companies, and researchers, and they let users run AI locally instead of relying on cloud services.
Revenues can still flow from support, integration, and customisation.
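As a rough illustration of such tailoring, the sketch below assumes the Hugging Face transformers and peft libraries, with Alibaba's Qwen2.5 open-weight model as a stand-in, to show how an organisation might attach small LoRA adapters to a downloaded model and fine-tune it entirely on its own infrastructure; none of these specific choices comes from the article itself.

```python
# Minimal sketch, assuming Hugging Face transformers + peft, of tailoring an
# open-weight model locally with LoRA: the published base weights stay frozen
# and only small adapter layers are trained, on the organisation's own machines.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "Qwen/Qwen2.5-7B-Instruct"  # illustrative open-weight model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Attach LoRA adapters to the attention projections; the base weights are not modified.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only a tiny fraction of parameters will be trained

# Training on in-house data would follow here, running entirely on local
# infrastructure rather than a third-party cloud API.
```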
The bigger picture
In short, while American labs chase massive profits by pushing intelligence to new frontiers, their Chinese rivals are betting on widespread adoption.
If they succeed, the DeepSeek shock may prove to be only the beginning.