OpenAI’s Major Setback: GPT‑5 Is Just a Reskinned GPT‑4o, No Breakthrough After 2.5 Years of Pretraining

After Ilya’s Departure: No Breakthrough in Sight


Report by New Zhiyuan

Editors: KingHZ, Taozi

---

📢 Lead Summary

OpenAI is in urgent need of a comeback.

Today’s biggest leak online: the cornerstone of GPT‑5 is actually GPT‑4o, and since 4o’s release, subsequent pre-training runs have repeatedly hit obstacles, to the point of near abandonment.

---

OpenAI Core Pre-Training: Consecutive Setbacks?

Rumors suggest GPT‑5’s foundation is still GPT‑4o, with all pre-training versions after GPT‑4.5 abandoned.


This isn’t idle speculation — the claim comes from a recent SemiAnalysis report revealing:

  • Since GPT‑4o’s release, OpenAI’s top-tier team has yet to complete a single, large-scale pre-training run for a true next‑gen frontier model.

---

Hardware & Industry Context

  • Google is leveraging TPUv7 in a potential challenge to NVIDIA’s CUDA moat.
  • OpenAI trains its full-stack models entirely on NVIDIA GPUs.

Yet without large-scale model pre-training, inference and post-training hardware debates mean little — pre-training is the most resource-intensive stage in the entire AI chain.
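
One way to see why pre-training dominates cost: a common back-of-the-envelope estimate from the scaling-law literature puts training compute at roughly 6·N·D FLOPs for a model with N parameters trained on D tokens. The figures below are illustrative placeholders, not OpenAI numbers:

```python
# Rough rule of thumb from the scaling literature: pre-training compute
# is approximately 6 * N * D FLOPs (N = parameters, D = training tokens).
# The values below are illustrative, not OpenAI figures.
N = 1e12   # hypothetical 1-trillion-parameter model
D = 1e13   # hypothetical 10-trillion-token corpus
flops = 6 * N * D
print(f"~{flops:.1e} FLOPs")   # ~6.0e+25 FLOPs for a single full run
```

At that scale, a single full pre-training run dwarfs the compute spent on any individual inference or post-training workload, which is why the hardware debate hinges on it.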

Insider facts:

  • ✔ Google’s TPU successfully met pre‑training challenges.
  • ✘ OpenAI has made no progress in pre-training since GPT‑4o’s May 2024 launch.

---

Timeline of Stagnation

  • Last 2.5 years: No real scale-up in OpenAI’s pre-training.
  • Possible “ceiling” was hit after 4o — coinciding with Ilya’s departure.

Even Ilya remarked recently that scaling won’t stop, but “something important” is still missing.


---

Major Pre-Training Collapse: The “Orion” Case

  • Late last year, “Orion” was intended to be GPT‑5.
  • After underperforming, it was downgraded to GPT‑4.5.

Issues with Orion:

  • Training took over 3 months — breaking the industry norm of 1–2 months.
  • Performance gains were mostly in language, with coding quality worse than older models.
  • Costs were higher without proportionate capability boosts.

---

Innovation Pressure & Ecosystem Needs

Frontier model setbacks point to compute limits and ecosystem gaps.

Sustainable AI R&D demands integrated platforms connecting development, deployment, and monetization.

Example: AiToEarn — open-source AI content monetization platform.

It enables creation, publishing, and monetization across:

  • Douyin
  • Kwai
  • WeChat
  • Bilibili
  • Xiaohongshu
  • Facebook
  • Instagram
  • LinkedIn
  • Threads
  • YouTube
  • Pinterest
  • X (Twitter)

With analytics and AI model rankings, AiToEarn bridges AI generation with real-world economic impact.

---

February Origins of GPT‑4.5 (“Orion”)

Focus areas:

  • Advanced language capabilities
  • Stable conversational experience
  • Expanded knowledge base

Keyword for GPT‑4.5: Emotional Intelligence.

Coding ability improved, but less emphasized — supporting rumors that Orion’s leap was modest.


---

GPT‑4o as Scaling Path?

  • The August release of GPT‑5, which Sam Altman framed as “Ph.D.-level AI towards AGI,” was met with mixed reactions.
  • Many saw GPT‑5 as a refined GPT‑4.5, not a disruptive overhaul.

Evidence suggests that no entirely new model has undergone large-scale pretraining, and that GPT‑5 may still be based on GPT‑4o.

OpenAI now focuses on reasoning paradigms + RL rather than scaling pretraining — leaving rivals a speed advantage.


---

Altman Acknowledges Google’s Lead


Leaked memos show Altman conceding Google’s excellence in LLM pretraining.

Pretraining Stage Essentials:

  • Feed the model massive amounts of data (e.g., web pages).
  • The model learns statistical relationships and structure in that data, the precursor to post-training and deployment (a minimal sketch follows below).
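
For readers who want the mechanics, here is a minimal PyTorch sketch of the next-token-prediction objective this stage optimizes. The tiny model and random token batches are toy placeholders, nothing like a frontier-scale run:

```python
# Minimal sketch of the next-token-prediction objective behind LLM
# pretraining (PyTorch). The tiny model and random token batches are
# toy stand-ins for a frontier model and a tokenized web corpus.
import torch
import torch.nn as nn

VOCAB, DIM, CTX = 1000, 64, 32

class TinyLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        layer = nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(DIM, VOCAB)

    def forward(self, tokens):
        # Causal mask: each position may only attend to earlier tokens.
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        return self.head(self.blocks(self.embed(tokens), mask=mask))

model = TinyLM()
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    batch = torch.randint(0, VOCAB, (8, CTX + 1))  # stand-in for web text
    inputs, targets = batch[:, :-1], batch[:, 1:]  # predict token t+1 from <=t
    loss = loss_fn(model(inputs).reshape(-1, VOCAB), targets.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```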

---

Why This Matters

  • Google’s pretraining breakthroughs gave Gemini 3 deeper reasoning.
  • Given OpenAI’s 2023 setbacks, this was surprising.
  • OpenAI pivoted toward heavy-compute “reasoning models” but these failed to scale successfully.

Javier Alba de Alba on GPT‑5:

> Excellent performance and cost-effectiveness — but far from the generational leap expected.

---

Strategic Shift: Scaling → Reasoning

For creators and developers, this shift reinforces the need for adaptability.

Platforms like the AiToEarn official site connect AI outputs to monetization and analytics, allowing reach across multiple channels.


---

GPT‑5: Improvements but Not a Leap

  • Boost in coding
  • Advanced reasoning
  • Hallucination reduction
  • Enhanced medical applications
  • Unified naming (no more turbo/mini/o-series labels)

Javier Alba de Alba warns: GPT‑5 is closer to “GPT‑4.2” than to a new-generation product.

Epoch AI suspects GPT‑5 used less training compute than GPT‑4.5.


---

Future Plans: “Shallotpeat”


Goal: Fix persistent pre-training issues.

For users: GPT‑5 is good but evolutionary, not revolutionary.

---

Platform Strategy Insights (Sherwin Wu Interview)


Shift in consensus:

  • From “one supreme model” → Specialized diversified models (e.g., Codex, Sora).
  • Separate tech stacks for text, image, video.

Fine-Tuning Evolution:

  • Early: adjust tone/instructions
  • Now: Reinforcement Fine-Tuning (RFT) using proprietary enterprise data.
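
For intuition, here is a generic REINFORCE-style sketch of what reward-based fine-tuning optimizes: sample responses, score them with a grader, and push up the log-probability of well-rewarded ones. The linear "policy" and hard-coded grader are illustrative stand-ins, not OpenAI's actual RFT pipeline:

```python
# REINFORCE-style sketch of the idea behind reinforcement fine-tuning:
# sample responses, score them with a grader, and raise the log-prob
# of well-rewarded responses. The policy and grader here are toys, not
# OpenAI's RFT API; a real grader would encode enterprise-specific checks.
import torch
import torch.nn as nn

policy = nn.Linear(8, 4)  # maps a toy "prompt" vector to 4 candidate answers
opt = torch.optim.Adam(policy.parameters(), lr=1e-2)

def grade(answers: torch.Tensor) -> torch.Tensor:
    # Stand-in for a proprietary grader: answer index 2 is "correct".
    return (answers == 2).float()

for step in range(200):
    prompts = torch.randn(16, 8)                       # batch of toy prompts
    dist = torch.distributions.Categorical(logits=policy(prompts))
    answers = dist.sample()                            # model "responds"
    rewards = grade(answers)                           # grader scores responses
    # Policy gradient with a mean-reward baseline to reduce variance.
    loss = -((rewards - rewards.mean()) * dist.log_prob(answers)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```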

---

Agent Development Model

Dual-track strategy: App + API to reach both consumers and developers.


Two agent types:

  • Undirected / Exploratory
  • Procedural / SOP-oriented — where hard-coded logic is essential.

Agent Builder provides controlled, verifiable agent workflows — critical for regulated industries.
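
To make "SOP-oriented" concrete, here is a hypothetical sketch of such a workflow: the control flow is ordinary, auditable code, and the model is confined to individual steps. The `Ticket` class, `call_model`, and step functions are illustrative placeholders, not Agent Builder's actual interface:

```python
# Hypothetical sketch of an SOP-oriented agent: the workflow itself is
# hard-coded, auditable Python, and the model is only called inside
# individual steps. `call_model` is a placeholder, not Agent Builder's API.
from dataclasses import dataclass, field

@dataclass
class Ticket:
    text: str
    log: list = field(default_factory=list)  # audit trail for compliance review

def call_model(prompt: str) -> str:
    # Placeholder for an LLM request (e.g., a chat-completions call).
    return f"<model output for: {prompt[:40]}...>"

def classify(ticket: Ticket) -> str:
    label = call_model(f"Classify this support ticket: {ticket.text}")
    ticket.log.append(("classify", label))
    return label

def draft_reply(ticket: Ticket, label: str) -> str:
    reply = call_model(f"Draft a {label} reply to: {ticket.text}")
    ticket.log.append(("draft", reply))
    return reply

def run_sop(ticket: Ticket) -> str:
    # Fixed, verifiable sequence: every ticket follows the same path,
    # and branching is deterministic code rather than a model decision.
    label = classify(ticket)
    if "refund" in label:
        ticket.log.append(("escalate", "human review required"))
        return "escalated to human agent"
    return draft_reply(ticket, label)

print(run_sop(Ticket("My order arrived damaged.")))
```

Because every step appends to an explicit log, the full decision path can be replayed and audited, which is the property regulated industries need.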


---

One More Thing: December Launch Wave


Rumored releases:

  • Image Gen v2
  • IMO & IOI multimodal models
  • GPT‑5.2 Codex

---

Reference

Leaked memo: Sam Altman sees ‘rough vibes’ and economic headwinds at OpenAI

---


Shallotpeat could be OpenAI’s consolidation before a major leap — aligning talent, compute, and API innovation to match Gemini’s pace.

Meanwhile, creator economy tools like AiToEarn let AI-generated content go global in one click. For developers tracking these shifts, integration of model performance, ecosystems, and monetization will be key in the months ahead.
