Llama 4 Maverick & Scout
Meta's new generation of large language models, released on Hugging Face. Llama 4 comprises two Mixture-of-Experts models: Maverick (~400B total parameters, 17B active, 128 experts) and Scout (~109B total parameters, 17B active, 16 experts). Both accept native multimodal input (text and images), support very long contexts (up to 10M tokens for Scout), and are integrated with Hugging Face transformers and TGI for easy deployment.
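As a rough illustration of the transformers integration mentioned above, the sketch below builds a multimodal chat payload in the standard transformers chat-template format; the model id, class name, and image URL are assumptions taken from the release blog, so verify them against the canonical source before use. The actual weight download is left commented out since the models are large and gated.

```python
# Sketch: preparing a multimodal request for Llama 4 Scout via transformers.
# ASSUMPTIONS: the model id and the Llama4ForConditionalGeneration class
# follow the release blog; the image URL is a placeholder.

model_id = "meta-llama/Llama-4-Scout-17B-16E-Instruct"  # assumed id

# One image plus a text prompt, in the transformers chat-message format.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/photo.jpg"},
            {"type": "text", "text": "Describe this image in two sentences."},
        ],
    }
]

# Uncomment to actually run (requires access to the gated weights):
# from transformers import AutoProcessor, Llama4ForConditionalGeneration
# processor = AutoProcessor.from_pretrained(model_id)
# model = Llama4ForConditionalGeneration.from_pretrained(
#     model_id, device_map="auto", torch_dtype="auto"
# )
# inputs = processor.apply_chat_template(
#     messages, add_generation_prompt=True, tokenize=True,
#     return_dict=True, return_tensors="pt",
# ).to(model.device)
# outputs = model.generate(**inputs, max_new_tokens=128)
# print(processor.batch_decode(outputs[:, inputs["input_ids"].shape[-1]:]))
```

The same message layout is what TGI's OpenAI-compatible endpoint accepts, so a payload built this way can be reused across both deployment paths.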
Key Information
- Category: Language Models
- Source: Hugging Face
- Last updated: March 03, 2026
Structured Metrics
No structured metrics captured yet.
Links
Canonical source: https://huggingface.co/blog/llama4-release