****** Democratization of Small/Efficient Models, China's Token Explosion & the Open-Source Race (DeepSeek-R2/GLM-5.1/MiniMax/Qwen/Doubao/Gemma/Meta/Fireworks) ******
Key Questions
What is China's AI token usage compared to the US?
China's daily AI token usage reached 140T in March, surpassing the US for five consecutive weeks, with weekly API calls roughly 4x higher. The surge reflects explosive growth in AI infrastructure, driven largely by domestic models.
What achievements does GLM-5.1 have?
GLM-5.1, Z.ai's open-source flagship for agentic coding, achieves SOTA on SWE-Pro, beating Opus 4.6 and GPT-5.4. It excels at autonomous planning and iteration on software tasks, and was released as part of China's open-source push.
What is DeepSeek-R2 and its specs?
DeepSeek-R2 is a 600B-parameter MoE open-source model with 37B active parameters per token, reportedly outperforming GPT-4o. DeepSeek is prioritizing domestic chips such as Huawei's for V4 optimization, and major players are pre-ordering hardware for it.
How does Qwen3.6 contribute to the trend?
Qwen3.6-Plus handles 4.6T tokens and has passed 400M downloads. Alibaba's FIPO algorithm boosts its inference. It exemplifies China's efficient-model push.
What financing news involves Chinese AI firms?
Mianbi Intelligence raised hundreds of millions of RMB, bringing its 2026 total to over 1B RMB and cementing its status as a base-model unicorn, with investors including Shenzhen Innovation. The funding supports MiniCPM and edge models and fuels the open-source competition.
What are other notable efficient models?
Gemma 4 31B topped Hugging Face with 400M+ downloads; Arcee released a 400B open-source model; JoyAI's Flash MoE has 480B parameters while activating only 2.7B (94% sparsity, as reported); Fireworks and PrismML focus on edge deployment. Together these democratize small/efficient models.
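The total-vs-active parameter figures quoted above reduce to simple ratios; a minimal sketch of that arithmetic, using the numbers exactly as reported in this digest (not verified official specs):

```python
def moe_active_stats(total_params_b: float, active_params_b: float) -> dict:
    """Activation ratio and sparsity for a Mixture-of-Experts model.

    Only a small subset of expert parameters fires per token; the rest
    stay idle, which is what makes very large MoE models cheap to serve.
    """
    active_frac = active_params_b / total_params_b
    return {
        "active_pct": round(100 * active_frac, 2),
        "sparsity_pct": round(100 * (1 - active_frac), 2),
    }

# Figures as reported in the text (illustrative, not official specs):
deepseek_r2 = moe_active_stats(600, 37)   # 600B total, 37B active per token
joyai_flash = moe_active_stats(480, 2.7)  # 480B total, 2.7B active per token

print(deepseek_r2)  # → {'active_pct': 6.17, 'sparsity_pct': 93.83}
print(joyai_flash)  # → {'active_pct': 0.56, 'sparsity_pct': 99.44}
```

The per-token serving cost of an MoE model tracks the active-parameter count, not the total, which is why a 600B model with 37B active can be served at roughly the cost of a dense ~37B model.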
What policy support exists in China for AI?
Policies promote AI manufacturing in Shenzhen and quantum-AI research. A shift toward native AI chips is under way, with firms like DeepSeek moving to Huawei and Cambricon hardware, a strategy that differs from US approaches.
How does this impact global AI competition?
China's token explosion and open-source models like DeepSeek-R2 and GLM-5.1 challenge US leaders. Compute estimates show China's rapid buildout despite export controls, and the open-source race is intensifying.
Summary: China's daily average hit 140T in March, surpassing the US; GLM-5.1 OSS agentic coding, SWE-Pro SOTA, beats Opus; DeepSeek-R2 MoE 600B/37B OSS outperforms GPT-4o; Qwen3.6-Plus 4.6T / Gemma 4 31B 400M+ / Arcee 400B OSS / JoyAI Flash MoE / PrismML edge / DeepSeek V4 on Huawei Ascend / Mianbi MiniCPM 24M, funding in the hundreds of millions; policy: manufacturing / Shenzhen / quantum AI.