Smaug-72B-v0.1

Smaug-72B-v0.1 is an open-source large language model for text generation developed by Abacus.AI. Based on Qwen-72B and fine-tuned with the novel DPO-Positive (DPOP) technique, it achieves strong results on benchmarks such as MT-Bench and was the first open-source model to surpass an average score of 80 on the Open LLM Leaderboard.
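DPOP extends standard DPO with a penalty that discourages the policy from lowering its log-probability of the *preferred* completion below the reference model's. The following is a minimal illustrative sketch of that loss for a single preference pair; the function name, toy log-probabilities, and hyperparameter values are assumptions for illustration, not the values used to train Smaug.

```python
import math

def dpop_loss(logp_w, logp_l, ref_logp_w, ref_logp_l, beta=0.3, lam=5.0):
    """Sketch of the DPO-Positive (DPOP) loss for one preference pair.

    logp_w, logp_l         : policy log-probs of winning / losing completions
    ref_logp_w, ref_logp_l : reference-model log-probs of the same completions
    beta, lam              : illustrative hyperparameters (assumed values)
    """
    # Standard DPO implicit-reward margin between winner and loser.
    margin = (logp_w - ref_logp_w) - (logp_l - ref_logp_l)
    # DPOP penalty: positive only when the policy's log-prob of the
    # preferred completion has dropped below the reference model's.
    penalty = max(0.0, ref_logp_w - logp_w)
    # Negative log-sigmoid of the penalized margin.
    z = beta * (margin - lam * penalty)
    return -math.log(1.0 / (1.0 + math.exp(-z)))

# Toy example: the policy favors the winner and has not fallen below
# the reference on it, so the penalty term is zero.
loss = dpop_loss(logp_w=-10.0, logp_l=-14.0, ref_logp_w=-11.0, ref_logp_l=-12.0)
```

When the penalty is zero this reduces to the ordinary DPO objective; when the policy underperforms the reference on the preferred completion, the loss grows, pushing probability mass back toward it.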

Key Information

  • Category: Language Models
  • Source: Hugging Face
  • Tags: text-generation
  • Last updated: February 24, 2026

Structured Metrics

No structured metrics captured yet.

Links

Canonical source: https://huggingface.co/abacusai/Smaug-72B-v0.1