
Chinese AI models dominate global derivatives

Chinese AI models capture 30% of global open-source usage as DeepSeek's breakthrough sparks massive investment surge and market dominance across Asia.

Intelligence Desk • 4 min read

AI Snapshot

The TL;DR: what matters, fast.

China released 1,509 LLMs by July 2025, representing 40% of global AI model releases

Chinese open-source models grew from 1.2% to 30% of global usage within 12 months

DeepSeek R1 matched OpenAI's performance at 82% lower training costs, sparking market confidence


Chinese Models Reshape Global AI Landscape Through Derivatives Dominance

A year after DeepSeek's R1 reasoning model sent shockwaves through Silicon Valley, China's AI sector has fundamentally altered the global competitive landscape. New analysis from CICC and the RAND Corporation reveals that whilst American firms maintain their grip on overall AI traffic, Chinese platforms have made extraordinary advances in open-source models and emerging market penetration.

The transformation has been nothing short of remarkable. China released 1,509 large language models by July 2025, accounting for 40% of global releases and surpassing all other countries, including the United States. This prolific output has translated into tangible market gains, with Chinese open-source models growing from just 1.2% to 30% of global usage within 12 months.

The Numbers Tell a Dramatic Story

DeepSeek R1's launch on 20th January 2025, reportedly matching OpenAI's o1 model at a fraction of the training cost, instilled newfound confidence within China's AI community. Wu Chenglin, founder of DeepWisdom, a Beijing-based startup, noted this shift after securing $30 million in funding following three near-collapses before the DeepSeek breakthrough.

The ripple effects extended across Asia's financial markets. Chinese AI firms like Zhipu AI and MiniMax listed on Hong Kong exchanges in late 2025, with stocks soaring on strong demand for their models. Zhipu climbed over 20% after releasing GLM-5, whilst MiniMax rose 15% following its M2.5 launch.

"Chinese AI models achieved 90% of US frontier model performance while spending 82% less on capital expenditure. This demonstrates remarkable cost-efficiency in AI development," according to recent industry analysis.

By The Numbers

  • China released 1,509 LLMs by July 2025, representing 40% of global releases
  • Chinese open-source models grew 2,400% in global usage over 12 months (from 1.2% to 30%)
  • Alibaba's Qwen family spawned over 180,000 derivative models globally
  • Nine of the top 14 global models are Chinese and open-source
  • DeepSeek achieved 89% market share in China and 56% in Belarus

Strategic Investment Fuels Rapid Expansion

China's progress stems from deliberate strategy encompassing massive state investment, open-weight model releases, and expanding talent pools. The third phase of the National Integrated Circuit Industry Investment Fund alone totals 344 billion yuan, with additional government-guided funding pushing direct state investment to roughly $75 billion. This represents approximately seven times the US federal AI research budget.

The investment surge has been palpable since DeepSeek's debut. Shi Yaqiong, vice-president at Beijing-based Jinqiu Capital, observed the dramatic shift: "The kind of projects with an initial valuation in 2024 of $10-20 million were, in 2025, expected to have initial valuations around $20-40 million."

This influx of capital has fuelled innovation and international expansion, particularly in cost-sensitive emerging markets. Chinese models have found particular favour across Africa, the Middle East, and South America, with the AI wave's shift to the Global South becoming increasingly evident.

Region    DeepSeek market share    Growth timeline
China     89%                      12 months
Belarus   56%                      8 months
Cuba      49%                      6 months
Russia    43%                      10 months

Open Source Strategy Drives Global Adoption

The open-source approach has proven particularly effective for Chinese developers. Alibaba's Qwen family has spawned over 180,000 derivative models globally, exceeding Meta's Llama in download velocity. This strategy has enabled Chinese models to capture 17.1% of global open-source downloads, surpassing the US share of 15.8% for the first time.

The momentum has been building steadily, with Chinese AI models now leading global token rankings across multiple metrics. By December 2025, over 62% of model derivatives globally were based on Chinese large models, according to CICC research.

Key factors driving this success include:

  1. Aggressive pricing strategies that undercut Western competitors significantly
  2. Open-weight releases that enable customisation and local deployment
  3. Rapid iteration cycles with frequent model updates and improvements
  4. Strong performance in cost-sensitive markets across developing economies
  5. Government backing that provides sustained funding and strategic direction

Persistent Challenges Remain

Despite impressive momentum, China continues grappling with persistent bottlenecks, primarily revolving around chip shortages. At a recent Beijing conference, Tang Jie, founder of Zhipu AI, warned that "the divide is actually expanding" due to these constraints.

Lin Junyang, who leads Alibaba's Qwen model development, estimated China's chances of surpassing OpenAI and Anthropic within five years at 20% or less, noting that "most of our resources are consumed just by meeting delivery demands." This indicates that, whilst China is making great strides, the infrastructure to support large-scale, cutting-edge AI development remains a critical challenge.

Researcher Wang, explaining why US models such as ChatGPT underperform Chinese models such as DeepSeek at stock prediction, states: "We show that it's not just missing news. It's missing negative Chinese news that drives this bias."

The competitive landscape continues to evolve rapidly, with the DeepSeek-versus-Silicon Valley dynamic highlighting the increasing intensity of global AI competition.

What Lies Ahead

OpenAI's intelligence team recently issued warnings about a potential "seismic shock" from China, possibly as early as the Lunar New Year. DeepSeek is reportedly preparing its next-generation V4 model for a mid-February release.

OpenAI noted: "China now has a broad field of near-frontier models, many of them open-weight and aggressively priced, making them easier to deploy across industries and government systems." This suggests a future where Chinese AI may become even more accessible and prevalent globally.

The implications extend beyond pure competition, with the race for AI supremacy highlighting how China's "war of a hundred models" approach differs fundamentally from Silicon Valley's concentrated efforts.

How do Chinese AI models achieve such cost efficiency?

Chinese models leverage government subsidies, lower operational costs, and aggressive pricing strategies. They focus on achieving 90% of frontier performance whilst spending 82% less on capital expenditure through optimised training methods and infrastructure.

What makes open-source models so popular globally?

Open-source models allow customisation, local deployment, and cost savings. They enable developers to modify models for specific use cases without vendor lock-in, making them particularly attractive for resource-constrained organisations.

Why are Chinese models succeeding in emerging markets?

Cost sensitivity drives adoption in developing economies. Chinese models offer comparable performance at significantly lower prices, making advanced AI accessible to organisations with limited budgets across Africa, Latin America, and Asia.

Can China overcome semiconductor constraints?

Current chip shortages limit scaling capabilities, but China is investing heavily in domestic semiconductor production. Success depends on technological breakthroughs and continued government support, though the timeline remains uncertain given current export restrictions.

What impact will derivatives have on global AI development?

With over 62% of global model derivatives now based on Chinese models, innovation patterns are shifting. This could accelerate AI adoption globally whilst potentially creating dependencies on Chinese foundational technologies.

The AIinASIA View: China's derivatives dominance represents a fundamental shift in global AI dynamics. We're witnessing the emergence of a parallel AI ecosystem that prioritises accessibility over exclusivity. This open approach, combined with aggressive pricing and government backing, has created sustainable competitive advantages that extend far beyond China's borders. The real question isn't whether Chinese models can compete, but whether Western firms can adapt their strategies to counter this systematic approach to global AI deployment.

The transformation of global AI markets through Chinese model derivatives marks a pivotal moment in technology history. As these models continue proliferating across emerging markets and open-source communities, the implications for innovation, competition, and technological sovereignty become increasingly profound. What do you think this shift means for the future of AI development and global technology leadership? Drop your take in the comments below.



This is a developing story

We're tracking this across Asia-Pacific and may update with new developments, follow-ups and regional context.



Latest Comments (4)

Lakshmi Reddy @lakshmi.r · 8 February 2026

It's interesting to see the CICC research on how Chinese large models now form the basis for over 62% of model derivatives globally. This shift, particularly in open-source models, could have a profound impact on how innovation spreads, especially for regions like ours. My concern is whether this dominance translates into better support or accessibility for developing Indic language models, as opposed to just a new centre of gravity for English-centric AI development. We need to ensure that the advancements aren't just about market share but also about fostering diversity in linguistic applications.

Oliver Thompson @olivert · 5 February 2026

right, the CICC research on 62% of model derivatives being Chinese is certainly a striking figure. but what does that actually mean for practical application here in London? we're still seeing the vast majority of useful tooling and robust APIs coming out of the usual suspects. derivatives are one thing, production-ready, enterprise-grade solutions that integrate without a right palaver are another entirely. it feels like a bit of a statistical parlour trick when you're actually trying to deploy something that won't fall over when the data gets a bit messy.

Kavya Nair @kavya · 3 February 2026

hey everyone, so I was looking at this article about the Chinese models and the global derivatives. it says Qwen is taking on Google's Nano Banana? I've been mostly looking at the bigger models so I haven't heard much about Nano Banana. is it similar to like Gemini or something smaller, maybe for specific tasks? just curious what it's used for.

Jake Morrison @jakemorrison · 2 February 2026

pretty wild how fast Qwen is picking up against Nano Banana. I remember when everyone here was only talking about Meta's stuff for open source. now it's like a whole different ballgame.
