Update README.md

Rasmus Rasmussen
2026-01-15 10:20:08 +00:00
committed by system
parent 3b0373da51
commit c101ba56d7

@@ -10,7 +10,9 @@ tags:
- llama
---
<img src="theprint_18b_moe.png" width="420" />
# theprint-MoE-8x3-0126-GGUF
An 18B-parameter Mixture-of-Experts model combining 8 specialized 3B experts, with 2 experts activated per token by default (configurable up to 4 at inference time).
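The routing described above (2 of 8 experts selected per token, with the count adjustable at inference) can be sketched as a standard top-k gating step. This is a minimal, hypothetical NumPy illustration of the general technique, not this model's actual implementation; all names (`moe_forward`, `gate_w`, `experts`) are invented for the example.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route one token vector x through the top-k of n experts.

    gate_w:  (d, n) gating weights producing one logit per expert
    experts: list of n callables, each mapping a (d,) vector to a (d,) vector
    k:       number of experts activated per token (2 by default, as in the
             description above; raising k, e.g. to 4, activates more experts)
    """
    logits = x @ gate_w                    # (n,) one score per expert
    topk = np.argsort(logits)[-k:]         # indices of the k highest-scoring experts
    weights = np.exp(logits[topk])
    weights /= weights.sum()               # softmax renormalized over the chosen k
    # Weighted sum of the selected experts' outputs; the other n-k experts
    # are never evaluated, which is the compute saving MoE provides.
    return sum(w * experts[i](x) for w, i in zip(weights, topk))
```

Raising `k` from 2 to 4 trades inference speed for a blend of more experts per token, which is the trade-off the configurable setting exposes.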
## Architecture