---
license: apache-2.0
base_model:
- gustavecortal/Piaget-4B
language:
- en
pipeline_tag: text-generation
library_name: transformers
tags:
- text-generation-inference
---

# **Piaget-4B-GGUF**

> Piaget, a language model finetuned on 15k psychological and philosophical reasoning traces.

Piaget is based on Qwen3 and was finetuned on a subset of open reasoning traces from Dolphin R1 and General Reasoning.

Domain filtering was performed on Dolphin R1 and General Reasoning: prompts were embedded, clustered with k-means (k=20,000), and majority-voted for domain labels using Qwen3-1.7B, following the Intelligent Internet pipeline. Clusters tagged psychology or philosophy were retained for LoRA finetuning (rank=8, alpha=16, max length=2048, epoch=1, batch size=16). Illustrative sketches of the filtering and LoRA setup appear at the end of this card.

Piaget aims to reason about psychological and philosophical concepts such as self-image, emotion, and existence.

Piaget was inspired by my position paper on emotion analysis: *Improving Language Models for Emotion Analysis: Insights from Cognitive Science*.

## Model files

| File | Size | Format |
|------|------|--------|
| Piaget-4B.BF16.gguf | 8.05 GB | BF16 |
| Piaget-4B.F16.gguf | 8.05 GB | F16 |
| Piaget-4B.F32.gguf | 16.1 GB | F32 |
| Piaget-4B.Q2_K.gguf | 1.67 GB | Q2_K |
| Piaget-4B.Q3_K_L.gguf | 2.24 GB | Q3_K_L |
| Piaget-4B.Q3_K_M.gguf | 2.08 GB | Q3_K_M |
| Piaget-4B.Q3_K_S.gguf | 1.89 GB | Q3_K_S |
| Piaget-4B.Q4_K_M.gguf | 2.5 GB | Q4_K_M |
| Piaget-4B.Q4_K_S.gguf | 2.38 GB | Q4_K_S |
| Piaget-4B.Q5_K_M.gguf | 2.89 GB | Q5_K_M |
| Piaget-4B.Q5_K_S.gguf | 2.82 GB | Q5_K_S |
| Piaget-4B.Q6_K.gguf | 3.31 GB | Q6_K |
| Piaget-4B.Q8_0.gguf | 4.28 GB | Q8_0 |
| .gitattributes | 2.4 kB | - |
| README.md | 65 Bytes | - |
| config.json | 29 Bytes | - |

## Quants Usage

(sorted by size, not necessarily quality; IQ-quants are often preferable over similarly sized non-IQ quants)

Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
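
## Running a quant

Below is a minimal sketch of running one of the quants from the table above locally with llama-cpp-python, assuming the chosen file has already been downloaded from this repo and that llama-cpp-python is installed; the prompt and sampling settings are only illustrative.

```python
# Minimal sketch: load a downloaded GGUF quant with llama-cpp-python and chat with it.
# The quant choice (Q4_K_M), prompt, and sampling settings are illustrative assumptions.
from llama_cpp import Llama

llm = Llama(
    model_path="Piaget-4B.Q4_K_M.gguf",  # any quant from the table above
    n_ctx=2048,                          # matches the finetuning max length
)

messages = [
    {"role": "user", "content": "How does self-image shape the way we interpret criticism?"}
]

out = llm.create_chat_completion(messages=messages, max_tokens=512, temperature=0.7)
print(out["choices"][0]["message"]["content"])
```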
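
## Filtering sketch

For reference, the domain-filtering step described earlier could look roughly like the sketch below. It is illustrative rather than the original Intelligent Internet pipeline: the embedding model (`all-MiniLM-L6-v2`) is an assumption, and `label_fn` stands in for majority-vote domain labeling with Qwen3-1.7B.

```python
# Illustrative sketch of the domain-filtering step: embed prompts, cluster with k-means
# (k=20,000), and keep only clusters whose majority-voted domain label is retained.
from collections import Counter
from sklearn.cluster import MiniBatchKMeans
from sentence_transformers import SentenceTransformer


def filter_by_domain(prompts, label_fn, n_clusters=20_000,
                     keep=frozenset({"psychology", "philosophy"})):
    """Keep prompts whose k-means cluster is majority-labeled with a retained domain."""
    embedder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
    embeddings = embedder.encode(prompts)
    cluster_ids = MiniBatchKMeans(n_clusters=n_clusters, random_state=0).fit_predict(embeddings)

    labels = [label_fn(p) for p in prompts]  # e.g. a domain label from Qwen3-1.7B
    kept = set()
    for cid in set(cluster_ids):
        votes = [labels[i] for i, c in enumerate(cluster_ids) if c == cid]
        if Counter(votes).most_common(1)[0][0] in keep:
            kept.add(cid)
    return [p for p, c in zip(prompts, cluster_ids) if c in kept]
```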
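
## LoRA sketch

The LoRA setup could be expressed with Hugging Face PEFT roughly as below. Only the rank, alpha, max length, epoch count, and batch size come from this card; the target modules and any other training details are assumptions.

```python
# A sketch of the LoRA configuration described in this card, using Hugging Face PEFT.
# r=8 and lora_alpha=16 come from the card; target_modules are assumed, not confirmed.
from peft import LoraConfig

lora_config = LoraConfig(
    r=8,             # rank=8
    lora_alpha=16,   # alpha=16
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed attention projections
)

# Other hyperparameters from the card: max length 2048, 1 epoch, batch size 16.
```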