
## Another World-Class Award for Seven AI Giants


**Editor | Yun Zhao**

One of the **most prestigious engineering honors** — the *Queen Elizabeth Prize for Engineering*, often called **"the Nobel Prize of Engineering"** — has just announced its winners.
**This year’s winners:**
- **Jensen Huang**
- **Yoshua Bengio**
- **Geoffrey Hinton**
- **John Hopfield**
- **Fei-Fei Li**
- **Yann LeCun**
- **Bill Dally**

---
### Bengio's Comment on X
> This week I was deeply honored to receive the @QEPrize medal in engineering from His Majesty King Charles III, and delighted to hear his thoughts on AI safety — as well as his hope that we may maximize the benefits of AI while minimizing its risks.

---
## Post-Award Roundtable Highlights
Six winners (Hopfield absent) joined a **roundtable** to share their views on AI development. It began with each winner recalling a personal **"eureka moment"** from their work in AI.
---
### Jensen Huang — NVIDIA’s Deep Learning Spark
> Around 2010, researchers from Toronto, New York University, and Stanford approached us with deep learning software. I saw a striking similarity to chip design: both rely on high-level representation and structured methods.
---
### Fei-Fei Li — Missing Data → ImageNet
> Humans grow with abundant sensory information; machines starve for data. We built ImageNet — 15 million labeled images in 22,000 categories. Big data propels machine learning.
---
### Geoffrey Hinton — Predicting the Next Word
> In 1984, I trained a tiny language model to predict the next word. It learned features tied to meaning — the same principle as today’s large models, but on a small scale.
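Hinton's point is that a network trained only to predict the next word ends up learning features tied to meaning. Below is a minimal, hypothetical sketch of that idea in NumPy; the toy corpus, vocabulary, and dimensions are invented for illustration, and it is not a reconstruction of Hinton's 1984 model.

```python
# Toy next-word predictor (illustrative sketch only, not Hinton's 1984 model).
# An embedding + softmax network learns to predict the next word from the
# previous one; the learned embeddings are the "features tied to meaning".
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
word_to_id = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8                      # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
E = rng.normal(0, 0.1, (V, D))            # word embeddings (input features)
W = rng.normal(0, 0.1, (D, V))            # projection to next-word logits

pairs = [(word_to_id[corpus[i]], word_to_id[corpus[i + 1]])
         for i in range(len(corpus) - 1)]

lr = 0.1
for epoch in range(300):
    for x, y in pairs:
        h = E[x]                          # current word's feature vector
        logits = h @ W
        p = np.exp(logits - logits.max())
        p /= p.sum()                      # softmax over candidate next words
        grad = p.copy()
        grad[y] -= 1.0                    # cross-entropy gradient: p - one_hot(y)
        W -= lr * np.outer(h, grad)
        E[x] -= lr * (W @ grad)

h = E[word_to_id["sat"]]
p = np.exp(h @ W); p /= p.sum()
print(vocab[int(p.argmax())])             # most likely continuation: "on"
```

Today's large language models use the same next-token objective, just with transformer architectures, billions of parameters, and web-scale text.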
---
## Key Discussion Questions
1. **Is there an AI bubble? Will it burst?**
2. **How long until AI reaches human-level intelligence?**
---
### Huang & Dally — “Wrong Question”
> **Huang**: That moment is already happening.
> **Dally**: AI may never have true human traits, but it can aid humanity immensely.
### Bengio — “Open Yet Cautious”
> Conceptually, machines could eventually do everything humans can, though robotics and spatial reasoning still lag behind. The timeline is uncertain, so we need caution.
---
## Moments of Epiphany — Six Luminaries
### Yoshua Bengio — From Excitement to Concern
1. Grad school: Inspired by Hinton’s early papers — seeking fundamental laws of intelligence.
2. Post-ChatGPT: Concerned over uncontrollable AI goals → shifted focus to safety.
### Bill Dally — Memory Wall & Finding Cats
- Late 90s: Solved “memory wall” via stream processing → foundation for GPUs.
- 2010: Andrew Ng’s cat experiment → deep learning future for NVIDIA.
### Geoffrey Hinton — Backprop & Language Features
- 1984: Backprop trained a miniature language model, but limited by computing power & data.
### Jensen Huang — Chip Design Meets AI
- 2010: High-level abstractions in deep learning mirrored chip design → scalable GPU systems.
### Fei-Fei Li — Data Scarcity & Human-Centered AI
- 2006–2009: Created ImageNet → big data fuels ML.
- 2018: Founded Stanford’s Human-Centered AI Institute.
### Yann LeCun — From Chips to Self-Supervised Learning
- Early fascination with machine learning → meeting Hinton in 1985.
- Embraced supervised learning temporarily (ImageNet), then returned to self-supervised.
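For readers unfamiliar with the term, here is a minimal, hypothetical sketch of what "self-supervised" means in this context: the training signal comes from the data itself rather than from human labels, for example by reconstructing a masked-out part of the input. The array shapes and masking scheme below are invented for illustration.

```python
# Self-supervised learning sketch (illustrative): the "label" is the data itself.
# A tiny linear autoencoder reconstructs inputs from versions with one feature
# masked out, so no human-provided labels are required.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))             # unlabeled data: 200 samples, 6 features
W1 = rng.normal(0, 0.1, (6, 3))           # encoder: 6 features -> 3-dim code
W2 = rng.normal(0, 0.1, (3, 6))           # decoder: 3-dim code -> 6 features

lr = 0.01
for step in range(500):
    x = X[rng.integers(len(X))]
    x_masked = x.copy()
    x_masked[rng.integers(6)] = 0.0       # hide one feature; model must fill it in
    h = x_masked @ W1                     # compressed representation
    x_hat = h @ W2                        # reconstruction of the full input
    err = x_hat - x                       # supervision comes from the input itself
    grad_h = err @ W2.T                   # backprop the error into the code
    W2 -= lr * np.outer(h, err)           # gradient of 0.5 * ||x_hat - x||^2
    W1 -= lr * np.outer(x_masked, grad_h)
```

Large-scale self-supervised methods follow the same idea, replacing the linear maps with deep networks and the masked feature with masked words, image patches, or video frames.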
---
## AI Bubble? Panel Perspectives
### Huang — “Not a Bubble”
- Every GPU at full capacity.
- AI builds real-time “intelligence,” unlike precompiled traditional software.
- AI industry = “intelligence factories.”
### Bengio — Agents Beyond Language Models
- LLMs becoming **agents**; technology shifting fast.
### Dally — Only 1% Demand Reached
- Model efficiency grows.
- Models improve, GPUs remain valuable.
- Applications will skyrocket.
### Fei-Fei Li — AI Still Young
- AI ~75 years old; many subfields await breakthroughs.
### LeCun — Bubble Mindset Exists
- Paradigm shift needed beyond LLMs; far from animal-level intelligence.
---
## Human-Level Intelligence — Responses
### LeCun — Gradual Expansion
### Fei-Fei Li — Machines Surpass in Some Areas Only
### Huang — Already at Application Stage
### Hinton — Win Debates in <20 Years = AGI
### Dally — Augment Humans, Not Replace
### Bengio — Could Eventually Match Humans
---
## Moderator's Closing
The future of AI has **already begun**, but progress will be gradual. Collaboration between humans and AI will define the road ahead.
---
## Appendix — QE Prize Past Winners (£9M Total)

---
For updates on AI infrastructure, large model performance, and societal impacts:
- [Large Models Word Salad Study](https://mp.weixin.qq.com/s?__biz=MjM5ODI5Njc2MA==&mid=2655930949&idx=1&sn=f7ff3f5010993bf7db4ce0a2dd80cb15&scene=21#wechat_redirect)
- [Hinton’s AI Predictions](https://mp.weixin.qq.com/s?__biz=MjM5ODI5Njc2MA==&mid=2655930712&idx=1&sn=295ad023579bc9d42d3218339c30654d&scene=21#wechat_redirect)
- [LLVM Creator on AI Chips](https://mp.weixin.qq.com/s?__biz=MjM5ODI5Njc2MA==&mid=2655930875&idx=1&sn=fbdcbfbd3a3e9b4b21c16a7405ac6d5f&scene=21#wechat_redirect)
---
### AiToEarn Open-Source
[AiToEarn](https://aitoearn.ai/) enables creators to:
- Generate AI-driven content
- Publish to multiple platforms
- Analyze and optimize
- Monetize creations

Explore: [AiToEarn Blog](https://blog.aitoearn.ai) | [GitHub Repo](https://github.com/yikart/AiToEarn)