EleutherAI/gpt-neox-20b
A 20-billion-parameter autoregressive transformer language model developed by EleutherAI using the GPT-NeoX library. It is intended primarily for research use and supports further fine-tuning and adaptation; its model card provides detailed technical specifications and evaluation results.
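Since the model is published on the Hugging Face Hub under the `text-generation` task, it can be loaded with the standard `transformers` auto classes. The sketch below is illustrative, not from the model card itself: the `generate_text` function name and its parameters are assumptions, and loading the 20B checkpoint requires substantial GPU memory (roughly 40 GB in fp16; `device_map="auto"` additionally assumes the `accelerate` package is installed).

```python
# Minimal sketch of text generation with EleutherAI/gpt-neox-20b via the
# Hugging Face transformers library. Function name and defaults are
# illustrative assumptions, not part of the official model card.
MODEL_ID = "EleutherAI/gpt-neox-20b"

def generate_text(prompt: str, max_new_tokens: int = 50) -> str:
    # Imports are kept inside the function so the sketch can be inspected
    # without transformers installed; actually running it downloads the
    # ~40 GB checkpoint and needs a correspondingly large GPU.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # use the checkpoint's native precision
        device_map="auto",    # shard across available devices (needs accelerate)
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

For fine-tuning or adaptation, the same `from_pretrained` call is the usual starting point before wrapping the model in a training loop.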
Key Information
- Category: Language Models
- Source: Hugging Face
- Tags: text-generation
- Last updated: March 03, 2026
Structured Metrics
No structured metrics captured yet.
Links
Canonical source: https://huggingface.co/EleutherAI/gpt-neox-20b