A year after DeepSeek's R1 reasoning model shook Silicon Valley, China's AI landscape has undergone a significant transformation. New analysis from CICC and the RAND Corporation indicates that while American firms still command the lion's share of global AI traffic, Chinese platforms have made considerable advances, particularly in open-source models and developing economies.
DeepSeek R1's launch on 20th January 2025, which reportedly matched OpenAI's o1 model at a fraction of the training cost, instilled a newfound confidence within China's AI community. Wu Chenglin, founder of DeepWisdom, a Beijing-based start-up, noted this shift, revealing his company has since secured $30 million in funding after nearly collapsing three times before the DeepSeek breakthrough.
Shifting Global AI Dynamics
American AI models continue to dominate overall usage. ChatGPT, for instance, holds approximately 68% of the global chatbot market, with Google's Gemini at 18% and DeepSeek trailing at about 4%, according to web traffic analytics firm Similarweb. A RAND Corporation study further highlights this, reporting that US models like ChatGPT, Claude, and Copilot accounted for roughly 93% of global large language model website visits as of August 2025.
However, China's growth trajectory is unmistakable. Chinese open-source models saw their global usage surge from around 1.2% in late 2024 to nearly 30% at their peak in 2025, according to OpenRouter's analysis of over 100 trillion real-world interactions. A collaborative study by MIT and Hugging Face found that Chinese models captured 17.1% of global open-source downloads over the past year, surpassing the US share of 15.8% for the first time. By December 2025, over 62% of model derivatives globally were based on Chinese large models, according to CICC research. This mirrors the global interest in models such as Qwen, which launched to take on Google's Nano Banana.
Beijing's Strategic Advantage
China's progress stems from a deliberate strategy encompassing significant state investment, the release of open-weight models, and an expanding talent pool. The third phase of the National Integrated Circuit Industry Investment Fund alone totals 344 billion yuan, with additional government-guided funding pushing direct state investment to roughly $75 billion. This is approximately seven times the US federal AI research budget.
Shi Yaqiong, vice-president at Beijing-based Jinqiu Capital, observed the investment surge since DeepSeek's debut, stating, "The kind of projects with an initial valuation in 2024 of $10-20 million were, in 2025, expected to have initial valuations around $20-40 million." This influx of capital has undoubtedly fuelled innovation and expansion.
Crucially, Chinese models have found particular favour in cost-sensitive emerging markets across Africa, the Middle East, and South America. A Microsoft report revealed DeepSeek's market share reached 89% in China, 56% in Belarus, 49% in Cuba, and 43% in Russia. This suggests a targeted and successful strategy for international adoption in specific regions.
Enduring Challenges and Future Outlook
Despite this impressive momentum, China continues to grapple with persistent bottlenecks, primarily chip shortages. At a recent Beijing conference, Zhipu AI founder Tang Jie warned that "the divide is actually expanding" due to these constraints. Lin Junyang, who leads Alibaba's Qwen model development, estimated China's chances of surpassing OpenAI and Anthropic within five years at 20% or less, noting that "most of our resources are consumed just by meeting delivery demands". This indicates that while Chinese firms are making great strides, the infrastructure to support large-scale, cutting-edge AI development remains a critical challenge, as highlighted in a report on global AI competitiveness by the Center for Strategic and International Studies (CSIS).
OpenAI's intelligence team recently issued a warning about a potential "seismic shock" from China, possibly as early as the Lunar New Year, with DeepSeek reportedly preparing its next-generation V4 model for a mid-February release. OpenAI noted, "China now has a broad field of near-frontier models, many of them open-weight and aggressively priced, making them easier to deploy across industries and government systems." This suggests a future where Chinese AI becomes even more accessible and prevalent globally, intensifying competition across the AI space.
What's your take on China's rapid advancements in AI? Do you think they can overcome the chip barrier? Share your thoughts in the comments below.

Latest Comments (4)
It's interesting to see the CICC research on how Chinese large models now form the basis for over 62% of model derivatives globally. This shift, particularly in open-source models, could have a profound impact on how innovation spreads, especially for regions like ours. My concern is whether this dominance translates into better support or accessibility for developing Indic language models, as opposed to just a new centre of gravity for English-centric AI development. We need to ensure that the advancements aren't just about market share but also about fostering diversity in linguistic applications.
right, the CICC research on 62% of model derivatives being Chinese is certainly a figure. but what does that actually mean for practical application here in London? we're still seeing the vast majority of useful tooling and robust APIs coming out of the usual suspects. derivatives are one thing, production-ready, enterprise-grade solutions that integrate without a right palaver are another entirely. it feels like a bit of a statistical parlour trick when you're actually trying to deploy something that won't fall over when the data gets a bit messy.
hey everyone, so I was looking at this article about the Chinese models and the global derivatives. it says Qwen is taking on Google's Nano Banana? I've been mostly looking at the bigger models so I haven't heard much about Nano Banana. is it similar to like Gemini or something smaller, maybe for specific tasks? just curious what it's used for.
pretty wild how fast Qwen is picking up against Nano Banana. I remember when everyone here was only talking about Meta's stuff for open source. now it's like a whole different ballgame.
Leave a Comment