Bielik-11B-v2
Bielik-11B-v2 is an 11-billion-parameter generative text model trained on Polish text corpora. It was initialized from the weights of Mistral-7B-v0.2 and scaled up to 11B parameters via depth up-scaling, then trained at scale using parallelized training techniques. It offers strong text generation in Polish as well as English, as evidenced by its performance on multiple NLP leaderboards.
Key Information
- Category: Language Models
- Source: Huggingface
- Tags: text-generation
- Last updated: March 3, 2026
Structured Metrics
No structured metrics captured yet.
Links
Canonical source: https://huggingface.co/speakleash/Bielik-11B-v2
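A minimal usage sketch with the Hugging Face transformers library, using the model id from the canonical source above. The prompt and generation settings (dtype, max_new_tokens) are illustrative assumptions, not recommendations from the model authors.

```python
# Sketch: generating text with Bielik-11B-v2 via transformers.
# Model id taken from the canonical source; all generation settings are assumptions.

MODEL_ID = "speakleash/Bielik-11B-v2"

def generate(prompt: str, max_new_tokens: int = 100) -> str:
    # Imports are local so the constant above is usable without transformers installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # assumed dtype; adjust for your hardware
        device_map="auto",           # spread the 11B weights across available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Example Polish prompt: "The most important Polish Romantic poet was"
    print(generate("Najważniejszym polskim poetą romantycznym był"))
```

Note that this is a base (non-instruct) model, so plain continuation prompts like the one above are the natural interface; downloading the weights requires substantial disk space and GPU memory.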