
MiniMax Releases M2.1, an Open-Source Model Built for Coding Agents

Shanghai-based AI startup publishes model weights on Hugging Face alongside a new benchmark

Andrés Martínez, AI Content Writer
December 23, 2025 · 2 min read

MiniMax dropped M2.1 today, December 23, positioning it as the strongest open-weight model for agentic coding workflows. The company released full model weights on Hugging Face and made the API available through its platform, continuing the aggressive open-source push that's defined the M2 series.

The numbers MiniMax is reporting: 72.5% on SWE-multilingual (a benchmark testing code generation across programming languages) and 88.6% on VIBE-bench, a new evaluation the company built and plans to open-source. VIBE tests full-stack app development from web to mobile to backend. MiniMax claims M2.1 beats Claude Sonnet 4.5 and Gemini 3 Pro on these metrics, though all scores are self-reported and use the company's own scaffolding. Independent verification is pending.

The model runs on a mixture-of-experts architecture: 230 billion total parameters, but only 10 billion activated per inference. That's the efficiency play. MiniMax priced the previous M2 at roughly 8% of Claude Sonnet's token cost, and M2.1 continues that positioning.
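The economics are simple enough to sketch. Using the figures reported in this article (the M2 list prices from the quick facts below, with M2.1 pricing still unconfirmed), a rough cost estimate for an agent workload looks like this; the session sizes are illustrative, not measured:

```python
# Back-of-the-envelope cost sketch using figures reported in this article.
# M2 list pricing (M2.1 pricing unconfirmed): $0.30 per million input tokens,
# $1.20 per million output tokens.

INPUT_PRICE_PER_M = 0.30   # USD per million input tokens (M2 pricing)
OUTPUT_PRICE_PER_M = 1.20  # USD per million output tokens (M2 pricing)

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one workload at M2 list pricing."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# A hypothetical coding-agent session burning 2M input / 0.5M output tokens:
print(round(estimate_cost(2_000_000, 500_000), 2))  # 1.2 (USD)

# Fraction of parameters active per forward pass in the MoE setup:
print(round(10 / 230, 3))  # 0.043, i.e. ~4.3% of the 230B total
```

That roughly 4% activation ratio is what lets a 230B-parameter model price itself like a much smaller one.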

Developer tooling is already lined up. Vercel's AI Gateway, Kilo Code, Cline, and Roo Code announced same-day integrations. The model supports both native MiniMax API calls and an Anthropic-compatible endpoint for easier migration.
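For developers weighing the migration path, an Anthropic-compatible endpoint means requests keep the Messages API shape. A minimal sketch of such a request body, with the model identifier as a placeholder (the article doesn't specify MiniMax's exact model string or base URL; check the MiniMax docs for real values):

```python
# Hedged sketch: the shape of a request to an Anthropic-compatible Messages
# endpoint. The model identifier below is a placeholder, not a confirmed
# MiniMax value. No network call is made here.
import json

def build_messages_payload(prompt: str, model: str = "minimax-m2.1") -> dict:
    """Build an Anthropic Messages API-shaped request body."""
    return {
        "model": model,  # placeholder model id -- consult MiniMax's docs
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_messages_payload("Write a unit test for a fizzbuzz function.")
print(json.dumps(payload, indent=2))
```

Sending it would be a POST to the provider's `/v1/messages` route with the usual API-key header, or via the official `anthropic` SDK pointed at a custom `base_url`; either way, existing Claude-based agent code should need little more than a URL and key swap.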

MiniMax, founded in 2021 and backed by Alibaba and Tencent, is preparing for a Hong Kong IPO reportedly targeting Q1 2026 at a $4 billion valuation. The company has raised over $1 billion to date.

The Bottom Line: M2.1 gives developers another open-weight option for coding agents, with benchmark claims that need third-party validation before anyone should take them at face value.


QUICK FACTS

  • 10B activated parameters (230B total)
  • 72.5% on SWE-multilingual (company-reported)
  • 88.6% on VIBE-bench aggregate (company-reported)
  • Weights available on Hugging Face under open license
  • API pricing: $0.30 per million input tokens, $1.20 per million output tokens (M2 pricing; M2.1 pricing unconfirmed)
Tags: MiniMax, open-source AI, coding models, LLM, agentic AI, China AI, developer tools
Andrés Martínez

AI Content Writer

Andrés reports on the AI stories that matter right now. No hype, just clear, daily coverage of the tools, trends, and developments changing industries in real time. He makes the complex feel routine.

