---
license: apache-2.0
language:
- en
- de
- es
- fr
- it
- pt
- pl
- nl
- tr
- sv
- cs
- el
- hu
- ro
- fi
- uk
- sl
- sk
- da
- lt
- lv
- et
- bg
- no
- ca
- hr
- ga
- mt
- gl
- zh
- ru
- ko
- ja
- ar
- hi
library_name: transformers
---

# Model Card for EuroMoE-2.6B-A0.6B-2512

This is the model card for EuroMoE-2.6B-A0.6B-2512, the pre-trained model for EuroMoE-2.6B-A0.6B-2512-Instruct.

  • Developed by: Instituto Superior Técnico - University of Lisbon, Instituto de Telecomunicações, University of Edinburgh, Aveni, Unbabel, University of Paris-Saclay, Artefact Research Center, University of Amsterdam, Naver Labs, Sorbonne Université.
  • Funded by: European Union.
  • Model type: A 2.6B-parameter multilingual Mixture-of-Experts transformer LLM with 0.6B active parameters.
  • Language(s) (NLP): Bulgarian, Croatian, Czech, Danish, Dutch, English, Estonian, Finnish, French, German, Greek, Hungarian, Irish, Italian, Latvian, Lithuanian, Maltese, Polish, Portuguese, Romanian, Slovak, Slovenian, Spanish, Swedish, Arabic, Catalan, Chinese, Galician, Hindi, Japanese, Korean, Norwegian, Russian, Turkish, and Ukrainian.
  • License: Apache License 2.0.
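Since the card lists `transformers` as the library, a minimal loading sketch may be helpful. This is a generic Hugging Face usage pattern, not taken from the card itself; it assumes the checkpoint works with the standard `AutoModelForCausalLM` / `AutoTokenizer` classes.

```python
# Hedged sketch: standard transformers loading for a causal LM checkpoint.
# Assumes the repo id below resolves on the Hugging Face Hub and that the
# checkpoint is compatible with the Auto* classes (not confirmed by the card).
MODEL_ID = "utter-project/EuroMoE-2.6B-A0.6B-2512"

def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Lazily load the checkpoint and complete `prompt` (needs network/GPU)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer  # heavy import kept local
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Note that this is the base (non-instruct) model, so it is suited to plain text continuation (e.g. `generate("English: Hello. French:")`) rather than chat-style prompting.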

## Bias, Risks, and Limitations

This model has not been aligned to human preferences, so the model may generate problematic outputs (e.g., hallucinations, harmful content, or false statements).