Stanford HAI's 2027 AI Index reports that 64% of new AI research releases ship with open weights, up from 34% in 2023. Hugging Face hosts 1.2M models and 340K datasets. The Linux Foundation's AI & Data projects include PyTorch, now the leading training framework, used by 71% of new models.
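Going from a 34% to a 64% open-weight share over four years works out to roughly 17% compound annual growth. A quick back-of-envelope check, using only the Stanford HAI figures above:

```python
# Back-of-envelope CAGR of the open-weight share of new AI research
# releases, using the Stanford HAI figures cited above.
share_2023 = 0.34  # open-weight share of new releases, 2023
share_2027 = 0.64  # open-weight share of new releases, 2027
years = 2027 - 2023

cagr = (share_2027 / share_2023) ** (1 / years) - 1
print(f"CAGR of open-weight share: {cagr:.1%}")  # ~17.1% per year
```

At that pace the share would cross 75% around 2028, though share growth necessarily flattens as it approaches 100%.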
| Metric | 2027 Value | Source |
|---|---|---|
| HF models | 1.2M | Hugging Face |
| HF datasets | 340K | Hugging Face |
| HF spaces | 860K | Hugging Face |
| GitHub AI repos | 3.4M | Octoverse |
| PyTorch share | 71% | Stanford HAI |
| Llama 4 downloads | 180M | Meta |
| Llama family cumulative | 1.8B | Meta |
| DeepSeek-V3.5 downloads | 84M | HF |
| Mistral models downloads | 68M | HF |
| Qwen 3 downloads | 72M | Alibaba |

| Model | Developer | Downloads (H1 2027) |
|---|---|---|
| Llama 4 (all variants) | Meta | 180M |
| DeepSeek-V3.5 | DeepSeek | 84M |
| Qwen 3 | Alibaba | 72M |
| Mistral Large 2 | Mistral | 46M |
| Phi-4 | Microsoft | 38M |
| Gemma 3 | Google | 34M |
| Yi 2 | 01.AI | 22M |
| Falcon 3 | TII | 18M |
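Summing the download column, the eight listed models account for roughly 494M downloads between them, with Llama 4 alone taking over a third. A small sketch (figures copied from the table above, not fetched live):

```python
# Downloads in millions for the top open models, copied from the table above.
downloads_m = {
    "Llama 4 (all variants)": 180,
    "DeepSeek-V3.5": 84,
    "Qwen 3": 72,
    "Mistral Large 2": 46,
    "Phi-4": 38,
    "Gemma 3": 34,
    "Yi 2": 22,
    "Falcon 3": 18,
}

total = sum(downloads_m.values())
llama_share = downloads_m["Llama 4 (all variants)"] / total
print(f"Total: {total}M downloads; Llama 4 share: {llama_share:.0%}")
# → Total: 494M downloads; Llama 4 share: 36%
```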
- **How many AI models are on Hugging Face?** 1.2M in 2027.
- **What's the top open LLM?** Llama 4, with 180M downloads in H1 2027.
- **Is PyTorch dominant?** Yes: 71% of new models use PyTorch.
- **How many open-source AI startups are there?** 2,140, per CB Insights.
- **Does open source lead in 2027?** 64% of new research releases publish open weights (Stanford HAI).
Open-source AI in 2027 is the backbone of the global AI ecosystem. More at misar.blog.