Update README.md

This commit is contained in:
Rasmus Rasmussen
2026-01-15 00:30:53 +00:00
committed by system
parent acd985b844
commit e34a2718ff


@@ -9,7 +9,7 @@ tags:
- moe
- llama
---
<img src="theprint_18b_moe.png" width="420" />
# theprint-MoE-8x3-0126-GGUF
An 18B-parameter Mixture-of-Experts (MoE) model combining 8 specialized 3B experts, with 2 experts activated per token by default (configurable up to 4 at inference).
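The top-k expert routing described above can be sketched as follows. This is a minimal, hypothetical illustration of how a router might pick 2 of 8 experts per token and renormalize their gate weights, not the model's actual gating code:

```python
import numpy as np

def route(logits: np.ndarray, k: int = 2):
    """Select the top-k experts for one token and softmax their gate weights.

    logits: router scores, one per expert (here, 8 experts).
    Returns (expert indices, normalized gate weights summing to 1).
    """
    # Indices of the k highest-scoring experts.
    idx = np.argsort(logits)[::-1][:k]
    # Softmax over only the selected experts (subtract max for stability).
    w = np.exp(logits[idx] - logits[idx].max())
    return idx, w / w.sum()

# Hypothetical router scores for 8 experts on a single token.
logits = np.array([0.1, 2.0, -1.0, 0.5, 1.5, 0.0, -0.5, 0.3])
idx, weights = route(logits, k=2)   # default: 2 active experts
idx4, weights4 = route(logits, k=4) # "configurable up to 4 at inference"
```

Raising `k` activates more experts per token, trading inference speed for potentially better quality; the gate weights are always renormalized over the selected subset.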