# MiniMax-M2: High-Efficiency Open-Source MoE Model for Coding & Agent Tasks

*2025-10-27 18:29 Zhejiang*

MiniMax-M2 is a **highly efficient open-source Mixture-of-Experts (MoE) model** with **230B total parameters**, of which **only 10B are activated per token**. It is optimized for **coding** and **agent automation** tasks, delivering leading benchmark performance while enabling **low-cost, low-latency deployment**.
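To make the sparse-activation idea concrete, below is a minimal top-k expert-routing sketch in PyTorch. It is purely illustrative: the layer sizes, expert count, and routing scheme are invented for the demo and are **not** MiniMax-M2's actual architecture or configuration. What it shows is the general MoE mechanism: a router picks a small subset of expert feed-forward networks for each token, so only a fraction of the layer's total parameters participates in any single forward pass.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKMoELayer(nn.Module):
    """Illustrative sparse MoE layer (hypothetical sizes, not MiniMax-M2's).

    Each token is routed to only `top_k` of `num_experts` feed-forward
    experts, so most of the layer's parameters stay inactive per token.
    """

    def __init__(self, d_model: int, d_ff: int, num_experts: int, top_k: int):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts, bias=False)
        # A bank of independent feed-forward experts.
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(d_model, d_ff),
                nn.GELU(),
                nn.Linear(d_ff, d_model),
            )
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, d_model)
        logits = self.router(x)                               # (tokens, experts)
        weights, indices = logits.topk(self.top_k, dim=-1)    # keep top_k experts
        weights = F.softmax(weights, dim=-1)                  # renormalize over the chosen experts
        out = torch.zeros_like(x)
        # Dispatch each token only to its selected experts.
        for slot in range(self.top_k):
            for e in range(len(self.experts)):
                mask = indices[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * self.experts[e](x[mask])
        return out


# Toy demo with made-up dimensions:
layer = TopKMoELayer(d_model=64, d_ff=256, num_experts=8, top_k=2)
tokens = torch.randn(5, 64)
print(layer(tokens).shape)  # torch.Size([5, 64])
```

Scaled up, this routing pattern is what allows a 230B-parameter MoE model to run each token through only about 10B parameters, keeping inference compute and serving cost closer to that of a much smaller dense model.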