DeepSeek-V3.2
An open‑weight large language model from DeepSeek, optimized for efficient reasoning and agentic tool use. It introduces DeepSeek Sparse Attention (DSA) for long‑context efficiency, scaled reinforcement‑learning post‑training, and a new chat template that supports "thinking with tools" and tool calling. MIT‑licensed weights (BF16/FP8/F32) are available on Hugging Face, along with guidance for local inference and OpenAI‑compatible message encoding.
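Since the card highlights OpenAI‑compatible message encoding with tool calling, the shape of such a request can be sketched as below. This is a minimal illustration only: the model name, the `get_weather` tool, and its schema are hypothetical assumptions, not values taken from the model card.

```python
import json

# Hypothetical OpenAI-compatible chat-completions payload for a locally
# served DeepSeek-V3.2 model. Model name, tool name, and tool schema are
# illustrative assumptions, not values from the model card.
payload = {
    "model": "deepseek-ai/DeepSeek-V3.2",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What's the weather in Paris?"},
    ],
    # Tools are declared as JSON-schema function specs, per the
    # OpenAI chat-completions convention.
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

print(json.dumps(payload, indent=2))
```

A server exposing an OpenAI‑compatible endpoint would accept this payload as the body of a chat‑completions request; the model may then respond with a `tool_calls` entry instead of plain text.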
Key Information
- Category: Language Models
- Source: Hugging Face
- Tags: text-generation
- Last updated: February 24, 2026
Structured Metrics
No structured metrics captured yet.
Links
Canonical source: https://huggingface.co/deepseek-ai/DeepSeek-V3.2