Express | OpenAI’s In-House Chip: Partnering with Arm and Broadcom to Build 10 GW of Compute Power; SoftBank May Be the Biggest Beneficiary
OpenAI Partners with Arm, Broadcom, and TSMC on Custom AI Chips
Beijing, October 14, 2025 — The Information
OpenAI is working with Arm to incorporate Arm-designed CPUs into its self-developed AI server chips, and co-designing a dedicated, inference-focused AI chip with Broadcom. These chips will be manufactured by TSMC, with production expected to begin in 2026.


Image credit: Unsplash
---
Strategic Benefits for SoftBank
OpenAI’s in-house AI chip development may benefit SoftBank Group, a major shareholder in OpenAI that is also helping fund its large-scale data center program.
Partnership Highlights
- Arm discussions: OpenAI is negotiating with Arm (owned by SoftBank) to use Arm CPUs in custom AI server chips.
- Broadcom collaboration: OpenAI is co-designing an AI chip with Broadcom, optimized for inference (running AI models).
- CPU role: AI server chips are deployed alongside CPUs that orchestrate their workloads; Arm has recently begun producing its own CPUs rather than only selling designs.
- Market impact: Arm shares jumped 11% and Broadcom rose 1% after the announcement.
- Revenue potential: Arm CPUs for OpenAI could generate billions of dollars in revenue.
SoftBank owns 90% of Arm, has invested in OpenAI since its valuation was under $100B, and has pledged tens of billions of dollars more to support OpenAI’s data center ambitions.
---
Broadcom–OpenAI Inference Chip
The chip announced Monday is specifically optimized for AI inference workloads.
Launch date: Late 2026.
Unknowns: Whether new chips will be deployed in OpenAI’s own data centers or within leased cloud infrastructure.
Scale:
- Targeting production for 10 GW data center capacity — 5× OpenAI’s current usage.
- Total plan across three deals: 26 GW capacity (unspecified timeframe).
- Estimated cost: $1T+ investment required; long-term goal is 250 GW capacity by 2033 ($12.5T cost at current rates).
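As a rough sanity check on these figures, the minimal sketch below (assuming, as the estimates above imply, that cost scales roughly linearly with capacity at current rates) works out the per-gigawatt cost implied by the 250 GW / $12.5T goal and how the 26 GW plan squares with the $1T+ estimate.

```python
# Back-of-the-envelope check of the capacity and cost figures cited above.
# Assumption: dollar cost scales linearly with gigawatts at the quoted "current rates".

current_usage_gw = 10 / 5          # the 10 GW target is described as 5x current usage -> ~2 GW
long_term_gw = 250                 # capacity goal by 2033
long_term_cost_usd = 12.5e12       # $12.5T at current rates

cost_per_gw = long_term_cost_usd / long_term_gw   # ~$50B per GW
three_deal_gw = 26                                # combined capacity across the three deals
three_deal_cost = three_deal_gw * cost_per_gw     # ~$1.3T, in line with the "$1T+" estimate

print(f"Implied current usage: {current_usage_gw:.0f} GW")
print(f"Implied cost per GW:   ${cost_per_gw / 1e9:.0f}B")
print(f"26 GW at that rate:    ${three_deal_cost / 1e12:.2f}T")
```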
---
Funding Outlook
Who will fund this massive expansion remains unclear. Key points:
- Nvidia: Pledged up to $100B, building 10 GW of facilities and supplying GPUs via leasing.
- Broadcom: To supply enough chips for 10 GW of capacity between 2026 and 2029.
- Cloud providers: Oracle and Microsoft may help finance compute resources.
OpenAI’s revenue outlook for 2025: ~$13B.
Planned spending by 2029: $115B in cash to rent additional servers.
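For context, a quick arithmetic sketch (taking the reported figures at face value, with no revenue growth modeled) illustrates the gap between projected 2025 revenue and the planned server spending.

```python
# Rough comparison of the reported revenue and spending figures.
# Assumption: the article's numbers are taken at face value; no growth is modeled.

revenue_2025 = 13e9              # ~$13B projected 2025 revenue
server_spend_by_2029 = 115e9     # $115B in planned cash spending on rented servers by 2029

gap = server_spend_by_2029 - revenue_2025
multiple = server_spend_by_2029 / revenue_2025

print(f"Planned spending is about {multiple:.1f}x 2025 revenue, a gap of ${gap / 1e9:.0f}B")
```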
---
TSMC’s Role and Capacity Constraints
OpenAI and Broadcom’s chips will be produced by Taiwan Semiconductor Manufacturing Company (TSMC) — also a major supplier for Nvidia and AMD.
Development Status
- Testing small prototype batches.
- Project with Broadcom ongoing for 18 months.
- Motivated by earlier setbacks in talks with AI chip startups.
Capacity Negotiations
- Altman has been meeting with TSMC executives since 2023 to request capacity increases.
- TSMC open to expanding production only with large-scale orders.
- New chip could provide leverage in price negotiations with Nvidia.
Reference
Full article on The Information
---

Sam Altman’s Perspective
> A data center capable of 1 GW of computing power is equivalent to a “miniature city.”
---