GPT-2
GPT-2 is a pretrained generative transformer model by OpenAI, designed for text generation. It was trained with a causal language modeling objective (predicting the next token given all preceding tokens) on a large corpus of English text and is available on Hugging Face. The model card provides detailed usage examples, the training procedure, limitations, and evaluation results.
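The causal language modeling objective mentioned above can be sketched as an average next-token cross-entropy. The following is a minimal toy illustration in plain Python, not GPT-2's actual implementation; the function name, vocabulary size, and logit values are all made up for demonstration.

```python
import math

def causal_lm_loss(logits, targets):
    """Average next-token cross-entropy over a sequence.

    logits: one list of vocabulary scores per sequence position
    targets: the correct next-token id for each position
    """
    total = 0.0
    for scores, target in zip(logits, targets):
        # log-softmax over the vocabulary scores (max-subtracted for stability)
        m = max(scores)
        log_z = math.log(sum(math.exp(s - m) for s in scores))
        log_prob = (scores[target] - m) - log_z
        total += -log_prob
    return total / len(targets)

# Toy example: 3 positions, vocabulary of size 4.
logits = [
    [2.0, 0.1, 0.1, 0.1],  # model favors token 0
    [0.1, 1.5, 0.1, 0.1],  # model favors token 1
    [0.1, 0.1, 0.1, 3.0],  # model favors token 3
]
targets = [0, 1, 3]
loss = causal_lm_loss(logits, targets)
```

Because each target matches the highest-scoring token here, the loss is low; uniform scores over a vocabulary of size V would give exactly log(V).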
Key Information
- Category: Language Models
- Source: Hugging Face
- Tags: text-generation
- Last updated: March 03, 2026
Structured Metrics
No structured metrics captured yet.
Links
Canonical source: https://huggingface.co/openai-community/gpt2