Model: utter-project/EuroMoE-2.6B-A0.6B-2512
| license | language | library_name |
|---|---|---|
| apache-2.0 | | transformers |
# Model Card for EuroMoE-2.6B-A0.6B-2512
This is the model card for EuroMoE-2.6B-A0.6B-2512, the pre-trained model underlying EuroMoE-2.6B-A0.6B-2512-Instruct.
- Developed by: Instituto Superior Técnico - University of Lisbon, Instituto de Telecomunicações, University of Edinburgh, Aveni, Unbabel, University of Paris-Saclay, Artefact Research Center, University of Amsterdam, Naver Labs, Sorbonne Université.
- Funded by: European Union.
- Model type: A 2.6B-parameter (0.6B active) multilingual Mixture-of-Experts transformer LLM.
- Language(s) (NLP): Bulgarian, Croatian, Czech, Danish, Dutch, English, Estonian, Finnish, French, German, Greek, Hungarian, Irish, Italian, Latvian, Lithuanian, Maltese, Polish, Portuguese, Romanian, Slovak, Slovenian, Spanish, Swedish, Arabic, Catalan, Chinese, Galician, Hindi, Japanese, Korean, Norwegian, Russian, Turkish, and Ukrainian.
- License: Apache License 2.0.
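As a quick-start sketch, the pre-trained model can be loaded with the `transformers` library named in the card metadata. This is an illustrative example, not an official snippet from the model authors: the repo id is taken from the card header, and the `generate` helper below is a hypothetical name. It assumes `transformers` and `torch` are installed.

```python
MODEL_ID = "utter-project/EuroMoE-2.6B-A0.6B-2512"  # repo id from the card header


def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Greedy-decode a continuation of `prompt` with the base (non-instruct) model.

    Imports are kept inside the function so that merely importing this
    module does not require transformers/torch to be present.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Base models continue text rather than follow instructions,
    # so a plain prompt prefix works best here.
    print(generate("The European Union is"))
```

Note that this is the pre-trained (base) model: it continues text rather than answering instructions, so for chat-style use the corresponding Instruct variant is the better fit.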
## Bias, Risks, and Limitations
This model has not been aligned to human preferences, so it may generate problematic outputs (e.g., hallucinations, harmful content, or false statements).
## Description