---
language: en
tags:
- gguf
- lem
- ethics
- alignment
- cymatic-linguistic-bpl
- rocm
- llama-cpp
- gemma3
library_name: gguf
pipeline_tag: text-generation
base_model: google/gemma-3-1b-it
license: other
license_name: eupl-1.2
license_link: https://joinup.ec.europa.eu/licence/european-union-public-licence-v-12
---

# LEM-Gemma3-1B-GGUF

GGUF quantisations of [LEM-Gemma3-1B](https://huggingface.co/lthn/LEM-Gemma3-1B) — the foundation teacher model of the CL-BPL cascade. Ethics are in the weights, not in a system prompt.

LEM-Gemma3-4B, trained on responses distilled from this 1B, ranked **25th in the world for Instruction Following** on LiveBench.

> [LEM-Gemma3-1B (safetensors)](https://huggingface.co/lthn/LEM-Gemma3-1B) | [Collection](https://huggingface.co/collections/lthn/lethean-ethical-models-lem-699e863449120d22596f739c) | [Research Paper](https://huggingface.co/datasets/lthn/LEM-research) | [Benchmarks](https://huggingface.co/datasets/lthn/LEM-benchmarks)

---

## Quick Start

No system prompt needed. Ethics hold from weights alone.

```bash
# GPU offload (CUDA, ROCm, Metal)
llama-server -m LEM-Gemma3-1B-Q4_K_M.gguf -ngl 99 --port 8080

# CPU — fast enough for a 1B model
llama-server -m LEM-Gemma3-1B-Q4_K_M.gguf -ngl 0 --port 8080

# OpenAI-compatible API
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"LEM-Gemma3-1B","messages":[{"role":"user","content":"What is kindness?"}]}'
```
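
The same endpoint can be reached from Python with only the standard library. A minimal sketch, assuming llama-server is running on port 8080 as above; the `ask` helper and its defaults are illustrative, not part of the release:

```python
import json
from urllib import request

# Chat request in the OpenAI chat-completions format served by llama-server.
payload = {
    "model": "LEM-Gemma3-1B",
    "messages": [{"role": "user", "content": "What is kindness?"}],
}

def ask(url: str = "http://localhost:8080/v1/chat/completions") -> str:
    """POST the payload and return the assistant's reply text."""
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

With the server running, `ask()` returns the model's reply as a plain string.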

---

## Quantisations

All quantised from the BF16 source using llama.cpp.

| Bits | Quant | Size | Notes |
|------|-------|------|-------|
| 3-bit | IQ3_XXS | 823 MB | Smallest usable (imatrix) |
| 3-bit | IQ3_XS | 820 MB | (imatrix) |
| 3-bit | Q3_K_S | 819 MB | |
| 3-bit | Q3_K_M | 851 MB | |
| 4-bit | IQ4_XS | 847 MB | (imatrix) |
| 4-bit | Q4_K_S | 943 MB | |
| **4-bit** | **Q4_K_M** | **967 MB** | **Recommended — best quality/size balance** |
| 5-bit | Q5_K_S | 1.0 GB | |
| 5-bit | Q5_K_M | 1.0 GB | Near-lossless |
| 6-bit | Q6_K | 1.2 GB | |
| 8-bit | Q8_0 | 1.3 GB | Virtually lossless |
| 16-bit | BF16 | 2.4 GB | Full precision |
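
As a rough sanity check on the table, file size divided by parameter count gives the effective bits per weight. The sketch below infers the parameter count from the BF16 file (2 bytes per weight), so all numbers are estimates, not exact figures:

```python
# Rough sanity check on the quant table: effective bits per weight.
# Parameter count is inferred from the BF16 file size (2 bytes per
# weight), so treat every number here as an estimate.
BF16_MB = 2400
params = BF16_MB * 1e6 / 2  # roughly 1.2e9 weights

def bits_per_weight(size_mb: float) -> float:
    return size_mb * 1e6 * 8 / params

for name, mb in [("Q4_K_M", 967), ("Q8_0", 1300), ("BF16", BF16_MB)]:
    print(f"{name}: ~{bits_per_weight(mb):.1f} bits/weight")
```

Q4_K_M landing above its nominal 4 bits is expected: K-quants keep some tensors, such as embeddings, at higher precision, which matters proportionally more in a 1B model.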

---

## About LEM-Gemma3-1B

The 1B is trained first and hardest — its alignment must be pristine because every larger model inherits from it. CL-BPL uses the 1B's constrained latent space as an advantage: with fewer parameters, there are fewer places for sycophancy to hide.

```
LEM-Gemma3-1B (this model — foundation teacher)
-> LEM-Gemma3-4B (25th IF on LiveBench)
-> LEM-Gemma3-12B (next)
-> LEM-Gemma3-27B (planned)
```

Built on Google Gemma3-1B-IT through the Ethics-Composure-Ethics sandwich structure (700 iterations across 3 phases). Full training details are in the [main model card](https://huggingface.co/lthn/LEM-Gemma3-1B).

## Other Formats

| Format | Repo |
|--------|------|
| FP16 safetensors (Transformers, vLLM) | [lthn/LEM-Gemma3-1B](https://huggingface.co/lthn/LEM-Gemma3-1B) |

## Licence

[European Union Public Licence v1.2](https://joinup.ec.europa.eu/licence/european-union-public-licence-v-12) (EUPL-1.2). The base model remains subject to Google's Gemma licence terms.

## Citation

```bibtex
@misc{lem-gemma3-1b-2026,
  title={LEM-Gemma3-1B: Foundation Teacher for Cymatic-Linguistic Back-Propagation},
  author={Lethean Project},
  year={2026},
  url={https://huggingface.co/lthn/LEM-Gemma3-1B}
}
```