MiniMax-M2
MiniMax-M2 is an open‑weight Mixture‑of‑Experts language model (≈230B total parameters, ≈10B active per token) optimized for coding and agentic tool use. It supports tool/function calling and long‑horizon planning, and posts strong results on coding and agent benchmarks (e.g., SWE‑bench, Terminal‑Bench, BrowseComp). The Hugging Face release provides safetensors weights, Transformers integration, and deployment guides for SGLang, vLLM, MLX, and standard Transformers. License: modified‑MIT.
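Since the model advertises tool/function calling and is typically served behind an OpenAI‑compatible endpoint (e.g. via vLLM or SGLang), a request might look like the sketch below. The tool name `run_shell` and its schema are illustrative assumptions, not part of the official release; consult the model card's deployment guides for the exact serving setup.

```python
# Minimal sketch of an OpenAI-style tool-calling request payload for a server
# hosting MiniMax-M2. The "run_shell" tool and its schema are hypothetical
# examples, not defined by the model release.
import json

tools = [
    {
        "type": "function",
        "function": {
            "name": "run_shell",  # hypothetical tool name for illustration
            "description": "Run a shell command and return its output.",
            "parameters": {
                "type": "object",
                "properties": {
                    "command": {
                        "type": "string",
                        "description": "The command to execute.",
                    }
                },
                "required": ["command"],
            },
        },
    }
]

request = {
    "model": "MiniMaxAI/MiniMax-M2",
    "messages": [
        {"role": "user", "content": "List the files in the current directory."}
    ],
    "tools": tools,
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

# Serialize the payload as it would be POSTed to /v1/chat/completions.
print(json.dumps(request, indent=2))
```

A client would send this body to the server's `/v1/chat/completions` route and, if the response contains a `tool_calls` entry, execute the requested tool and return its result in a follow‑up `tool` message.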
Key Information
- Category: Language Models
- Source: Huggingface
- Tags: text-generation
- Last updated: February 24, 2026
Structured Metrics
No structured metrics captured yet.
Links
Canonical source: https://huggingface.co/MiniMaxAI/MiniMax-M2