Industry & Business

China Now Downloads More Open-Source AI Models Than the US — Hugging Face's Spring 2026 Report Reveals a Power Shift

The center of gravity in open-source AI has shifted, and not in the direction most people expected. Hugging Face's State of Open Source report for Spring 2026, published March 17, lays out the numbers in stark detail: Chinese-developed models now account for 41% of all downloads on the platform, surpassing the United States in both monthly and cumulative download volumes. This isn't a blip — it's a structural transformation of the AI ecosystem.

The Numbers That Tell the Story

Hugging Face has grown to 13 million users, over 2 million public models, and more than 500,000 public datasets. Those numbers nearly doubled from the previous year. But the headline isn't growth — it's who's driving it and how.

The platform is top-heavy: the top 200 most-downloaded models (just 0.01% of all models) account for 49.6% of total downloads. Think of it like music streaming — millions of tracks exist, but a tiny fraction generates almost all the plays. The AI ecosystem has its own hit parade, and the chart-toppers are increasingly Chinese.

Alibaba's Qwen family alone has spawned over 113,000 derivative models — fine-tuned versions, merged variants, quantized editions. When you count all models that tag Qwen, that number balloons past 200,000. That's more derivatives than Google and Meta combined. DeepSeek-R1 has become the single most-liked model on the entire platform, dethroning Meta's Llama family from the top spot it held for years.

The DeepSeek Effect

The catalyst is clear: DeepSeek's R1 release in January 2025 triggered a chain reaction across China's tech industry. Companies that had been playing it close to the vest suddenly went all-in on open source. Baidu went from zero Hugging Face releases in 2024 to over 100 in 2025. ByteDance and Tencent each increased their releases by eight to nine times. MiniMax, previously a closed-model shop, pivoted decisively to open weights.

It's like watching a dam break. One company proved you could build world-class AI openly, and suddenly everyone wanted in. The competitive dynamic flipped from 'keep it secret' to 'ship it fast, ship it open, let the community amplify your work.'

The Rise of the Individual Developer

Perhaps the most surprising finding isn't about countries or corporations at all. Independent developers — individuals without corporate affiliation — now account for 39% of all downloads, up from 17% before 2022. At times during 2025, unaffiliated developers generated more than half of total platform usage.

These aren't people training models from scratch. They're quantizers, fine-tuners, mergers, and redistributors — the essential middle layer that turns raw research artifacts into things people can actually run. They're the mechanics who take a race car engine and put it in something you can drive to work.

Meanwhile, industry's share of overall development fell from 70% pre-2022 to just 37% in 2025. Big Tech still builds the foundation models, but the community decides what gets used, how, and where.

The Sovereignty Play

Open-source AI has become a geopolitical tool. South Korea launched a National Sovereign AI Initiative naming five domestic champions (LG AI Research, SK Telecom, Naver Cloud, NC AI, and Upstage) to build competitive models. Three South Korean models trended simultaneously on Hugging Face in February 2026. Switzerland's Swiss AI initiative and various EU-funded projects reflect similar priorities.

The logic is straightforward: if your country's AI stack runs on models controlled by foreign companies through APIs, you're one policy change away from losing access. Open-weight models that can be deployed on domestic hardware reduce that dependency. It's the same argument nations make about energy independence, applied to intelligence.

What the Research Shows

The report also reveals that China's Big Tech companies — ByteDance in particular — are publishing high-impact research papers at an accelerating rate. The most upvoted papers on Hugging Face's Daily Papers skew heavily toward Chinese organizations. Medical AI papers punch above their weight in influence, while Big Tech's footprint in pure research output is thinner than its product dominance would suggest.

Over 30% of Fortune 500 companies now maintain verified accounts on Hugging Face. Airbnb and other established American companies have increased their engagement with the open ecosystem. The 'build vs. buy' calculation has shifted: open models offer comparable quality with dramatically more flexibility in deployment and customization.

The AI industry's center of gravity hasn't just shifted — it's been redistributed. Power is flowing from centralized labs to distributed communities, and from the US to a genuinely global ecosystem.

Key Takeaways

  • Chinese-developed models now account for 41% of Hugging Face downloads, surpassing the US
  • Independent developers account for 39% of all downloads, up from 17% before 2022
  • Alibaba's Qwen family has spawned 113,000+ direct derivatives (200,000+ counting all models that tag Qwen) — more than Google and Meta combined
  • National AI sovereignty initiatives are proliferating, with South Korea, Switzerland, and the EU investing heavily
  • Over 30% of Fortune 500 companies now have verified Hugging Face accounts

Our Take

This report should be required reading for anyone making strategic decisions about AI. The narrative that the US leads in AI and China is catching up needs serious revision — at least in the open-source domain, China has caught up and arguably pulled ahead. The DeepSeek effect showed that a single high-profile open release can reshape an entire national AI strategy overnight.

But the bigger story might be the democratization angle. When individual developers without corporate backing account for nearly 40% of platform activity, the traditional power dynamics of the tech industry start looking obsolete. You don't need a billion-dollar GPU cluster to matter in AI anymore — you need good taste in model selection, efficient fine-tuning skills, and the ability to serve a community's specific needs.

The sovereignty angle is also worth watching. As more nations treat AI capability as critical infrastructure, the demand for open-weight models that can be deployed domestically will only grow. Hugging Face is positioning itself as the neutral platform where this global competition plays out — part GitHub, part United Nations of AI models. Whether that neutrality holds as geopolitical tensions increase is an open question, but for now, they're the closest thing the AI ecosystem has to common ground.
