Jamba-v0.1

Jamba-v0.1 is a state-of-the-art, hybrid SSM-Transformer large language model developed by AI21 Labs. It is a pretrained, mixture-of-experts (MoE) generative text model with 12B active parameters (52B total across experts), supporting a 256K-token context length. Designed for high throughput, it serves as a strong base for fine-tuning into chat/instruct versions.
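The 12B-active / 52B-total split comes from mixture-of-experts routing: a router selects a small subset of experts per token, so only those experts' parameters participate in each forward pass. A minimal sketch of that accounting, using hypothetical layer sizes chosen only for illustration (the real breakdown across Jamba's layers is not given here):

```python
def moe_param_counts(shared: int, per_expert: int,
                     num_experts: int, top_k: int) -> tuple[int, int]:
    """Return (total, active) parameter counts for a model with one MoE block.

    shared     -- parameters every token uses (attention/SSM layers, norms, ...)
    per_expert -- parameters in each expert MLP
    num_experts, top_k -- the router activates top_k of num_experts per token
    """
    total = shared + num_experts * per_expert   # all experts count toward size
    active = shared + top_k * per_expert        # only routed experts run
    return total, active

# Hypothetical numbers, not Jamba's actual configuration:
total, active = moe_param_counts(shared=2_000_000_000,
                                 per_expert=500_000_000,
                                 num_experts=16, top_k=2)
print(f"total={total:,} active={active:,}")
# total=10,000,000,000 active=3,000,000,000
```

The same arithmetic, applied per MoE layer and summed with Jamba's shared attention and Mamba layers, is what yields a model whose on-disk size (52B) is far larger than its per-token compute cost (12B).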

Key Information

  • Category: Language Models
  • Source: Huggingface
  • Tags: text-generation
  • Last updated: February 24, 2026

Structured Metrics

No structured metrics captured yet.

Links

Canonical source: https://huggingface.co/ai21labs/Jamba-v0.1