
*Image source: YouTube*
---
# **Z Highlights**
- **Open-source resurgence in the U.S.:** Sparked partly by China’s rapid progress, America has started investing heavily in large-scale open-source initiatives again.
- **Chinese models as a starting point:** Many startups find **closed-source models too restrictive** for novel AI use cases, making open-source (often Chinese) models the default.
- **Technology ceiling:** Current methods relying on incremental data labeling are unlikely to achieve “super intelligence” capable of going beyond human abilities.
- **Robotics warning:** The robotics field may follow AI’s early closed-source trajectory, potentially limiting community and researcher participation.
> *Thomas Wolf, Co-founder & Chief Scientist of Hugging Face and a central figure in the global open-source AI movement, focuses on rebuilding the AI stack across models, data, compute, and applications. Speaking to TechCrunch on **Nov 8, 2025**, he highlighted key trends in the 2025 AI competition: the rise of Chinese open-source models, America's open-source revival, and the scaling limits of LLMs.*
---
## **2025–2026 AI Competition Trends**
### **Macro Shifts**
- **Compute consolidation** around a few global leaders.
- **U.S. “open-source revival”** in response to China’s momentum.
- **Strong Chinese participation** with robust open-source models.
- **MiniMax M2** ranked fifth globally, a major open-source performance milestone.
### **Why Open-Source Models Matter**
If you want to build truly new AI scenarios — like **interactive world models** — you need flexibility beyond closed-source constraints.
By 2026, open-source might reclaim center stage in the U.S.
---
## **Scaling Limits of LLMs — Why Bigger ≠ Super Intelligence**
**Thomas Wolf on limitations:**
- LLMs have **weaker generalization** than expected.
- A trend toward training in massive **RL environments** to improve learning (a minimal sketch of such an environment follows this list).
- Current tech likely to **hit a ceiling** before achieving true super intelligence.
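To make "RL environment" concrete, here is a purely illustrative Python sketch: a toy single-turn task environment that scores a text answer and returns a reward, the basic loop that large-scale RL post-training setups scale up. The `ArithmeticEnv` class and the random "agent" are hypothetical examples, not from the interview or any particular library.

```python
import random


class ArithmeticEnv:
    """Toy RL environment: the task is a small addition problem, the action is
    the agent's text answer, and the reward is 1.0 for a correct answer."""

    def reset(self, seed=None):
        rng = random.Random(seed)
        self.a, self.b = rng.randint(0, 99), rng.randint(0, 99)
        # The observation is a natural-language prompt, as in LLM post-training.
        return f"What is {self.a} + {self.b}? Answer with a number only."

    def step(self, action: str):
        try:
            reward = 1.0 if int(action.strip()) == self.a + self.b else 0.0
        except ValueError:
            reward = 0.0   # Malformed answers earn no reward.
        return reward, True  # Single-turn task: the episode ends immediately.


if __name__ == "__main__":
    env = ArithmeticEnv()
    prompt = env.reset(seed=0)
    # A real setup would query a policy (an LLM) here; we just guess randomly.
    answer = str(random.randint(0, 198))
    print(prompt, "->", answer, "| reward:", env.step(answer)[0])
```

Large-scale setups replace this toy with huge numbers of verifiable tasks; the ceiling Wolf points to is that such reward-checkable tasks do not obviously produce the open-ended question-asking discussed below.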
### **Science Example:**
1. **Guaranteed use case:** AI as research assistant — highly capable at supporting existing projects.
2. **Harder breakthrough:** AI defining completely new research questions, challenging accepted truths.
   - Historically, breakthroughs such as Copernicus's heliocentric model came from disproving "indisputable" facts.
**Current models rarely generate truly original problems or conjectures** — crucial for scientific leaps.
---
## **The “Yes-Man” Problem & The AI Bubble**
- AI assistants are improving but **lack capacity to “ask better questions.”**
- Hype around AI “proving theorems” is misleading — discovery is about proposing **new conjectures**, not just proofs.
- Bubble valuations often assume bigger models will lead to AGI — Thomas remains skeptical.
- Capital influx may still drive valuable progress indirectly:
  - Better GPUs → better simulations → scientific and engineering acceleration.
  - Potential “AI–simulation flywheel.”
---
## **Open vs Closed-Source — Talent & Policy Factors**
- Open-sourcing can **attract top talent** — but trends vary by region.
  - In the West: closed-source seen as cutting-edge.
  - In China: open-source more appealing for recruitment.
- Labs like **Reflection AI** could flip the Western trend back to “open-source is cool.”
- U.S. open-source policy support is **strategically important** to maintain tech leadership.
---
## **Business Perspective — Hugging Face**
**Funding & Operations**
- Last round (2023): $200M+, ~$5B valuation.
- Haven’t used most of the funds — operating with **old-school efficiency**.
- Team size: ~250 people, profitable, cautious spending.
**Business Model Shift**
- Moving from consulting to the **Enterprise Hub** (a minimal API sketch follows this list):
  - Internal model hosting
  - Access control & permissions
  - Resource isolation
  - Audit logs
  - Compliance-ready
- **Adopted by thousands** of organizations, including Salesforce.
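To make "internal model hosting" concrete, here is a minimal sketch using the public `huggingface_hub` Python client to create a private organization repo and push a checkpoint into it. The organization name `acme-corp`, the token placeholder, and the file paths are hypothetical; access control, resource isolation, and audit logs are managed through the Hub's Enterprise settings rather than through these calls.

```python
from huggingface_hub import HfApi

# Authenticate with a token that has write access to the organization.
api = HfApi(token="hf_xxx")  # hypothetical token placeholder

# Create a private model repo under the (hypothetical) org namespace.
api.create_repo(
    repo_id="acme-corp/internal-llm",
    repo_type="model",
    private=True,    # visible only to organization members
    exist_ok=True,   # no error if the repo already exists
)

# Push a local checkpoint into the private repo.
api.upload_file(
    path_or_fileobj="checkpoints/model.safetensors",
    path_in_repo="model.safetensors",
    repo_id="acme-corp/internal-llm",
)

# List what is stored in the repo (useful as a quick sanity check).
print(api.list_repo_files("acme-corp/internal-llm"))
```

Private repos keep weights visible only to organization members, which is the baseline the Enterprise Hub's permissioning and audit features build on.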
---
## **Robotics Expansion — Making It Open Source**
**Goal:** Build active open-source ecosystems across AI domains.
**Observation:** Most robotics players are vertically integrated & closed.
### **Key Moves**
- Released the **LeRobot** library, which has drawn tens of thousands of contributors (see the Hub-browsing sketch after this list).
- Built **low-cost experimental hardware**:
  - **SO‑100** robotic arm ($100)
- Acquired an open-source humanoid robotics company.
- Launched **Reachy Mini**:
  - Desktop humanoid for exploring human–robot interaction.
  - $1.5M in preorders, shipping within a month.
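For readers who want to explore this open robotics ecosystem, the sketch below uses the `huggingface_hub` client to list public robotics models and datasets under the LeRobot namespace on the Hub. The filters and result limit are just one reasonable way to browse; tags and counts will vary over time.

```python
from huggingface_hub import list_datasets, list_models

# Robotics models published on the Hub (pipeline tag "robotics").
for model in list_models(pipeline_tag="robotics", limit=5):
    print("model:", model.id)

# Public datasets under the LeRobot organization namespace.
for dataset in list_datasets(author="lerobot", limit=5):
    print("dataset:", dataset.id)
```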
### **Focus Areas**
- Natural communication
- Genuine machine understanding
- Gateway for learning robotics & AI
---
## **Strategic Takeaways**
1. **Open-source is critical** for innovation, flexibility, and talent acquisition.
2. **Scaling LLMs alone won’t achieve AGI** — need breakthroughs in problem formulation & creativity.
3. **Capital can drive indirect breakthroughs** via GPU & simulation improvements.
4. **Robotics risks repeating AI's closed-source mistakes**; open ecosystems can prevent this.
---
## **Additional Resource — AiToEarn**
Platforms like [AiToEarn](https://aitoearn.ai/) provide:
- **AI-powered content generation**
- **Cross-platform publishing** (Douyin, Kwai, WeChat, Bilibili, Xiaohongshu, Facebook, Instagram, LinkedIn, Threads, YouTube, Pinterest, X/Twitter)
- **Analytics** & **model rankings** ([view AI model rankings](https://rank.aitoearn.ai))
**Why relevant?**
Helps open-source contributors and creators **monetize innovation** while maintaining flexibility.
---
## **Conclusion**
Thomas Wolf’s vision:
> The **future AI stack** will be **open**, **modular**, and **responsibly scaled**.
> Collaboration and sustainable business models will determine AI’s societal impact.
---
[📺 Watch full interview on YouTube](https://www.youtube.com/watch?v=SSBjP22ov8Q)
---
### ✅ **Quick Summary (60 sec)**
- **Open-source revival** in America responding to Chinese AI progress.
- Closed-source models limit unique AI applications.
- Scaling LLMs won’t reach super intelligence — new problem-generation abilities needed.
- Robotics could risk closed-source lock-in — Hugging Face is pushing open alternatives.
- Hugging Face is profitable, operating lean, focusing on **Enterprise AI** and **open-source robotics**.
---