DeepSeek-V2
DeepSeek-V2 is a Mixture-of-Experts (MoE) language model designed for economical training and efficient inference. It has 236B total parameters, of which 21B are activated per token, and it performs strongly across standard benchmarks, with particular strength in text generation and conversational AI.
Key Information
- Category: Language Models
- Source: Hugging Face
- Tags: text-generation
- Last updated: March 03, 2026
Structured Metrics
No structured metrics captured yet.
Links
Canonical source: https://huggingface.co/deepseek-ai/DeepSeek-V2
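As a minimal sketch of how the model at the canonical source above might be loaded for text generation with the Hugging Face `transformers` library: the model ID comes from the link, while the dtype, device placement, and generation settings are assumptions that depend on available hardware (the full 236B-parameter checkpoint requires substantial GPU memory).

```python
MODEL_ID = "deepseek-ai/DeepSeek-V2"  # from the canonical source above


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load DeepSeek-V2 and generate a continuation of `prompt`.

    Imports are deferred so the sketch can be read without
    transformers/torch installed; settings are illustrative assumptions.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # trust_remote_code is required because the repo ships custom model code
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        trust_remote_code=True,
        torch_dtype=torch.bfloat16,  # assumption: bf16 weights fit the hardware
        device_map="auto",           # assumption: shard across available GPUs
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Mixture-of-Experts models scale efficiently because"))
```

The chat-tuned variant (DeepSeek-V2-Chat) would be used the same way, with `tokenizer.apply_chat_template` to format conversational input.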