Initialize project; model provided by the ModelHub XC community

Model: mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF
Source: Original Platform
Author: ModelHub XC
Date: 2026-04-15 18:41:04 +08:00
Commit: 152f6048b3
27 changed files with 245 additions and 0 deletions

.gitattributes (vendored, new file, 60 lines)

@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
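As an illustrative sketch (not part of the repository), whether a path would be routed through LFS by patterns like the ones above can be approximated with Python's `fnmatch`; note that real Git attribute matching has extra rules (e.g. the `saved_model/**/*` entry), so this is only an approximation over a subset of the patterns:

```python
from fnmatch import fnmatch

# Subset of the glob patterns routed through Git LFS by the
# .gitattributes above (real Git pattern matching differs slightly
# from fnmatch, e.g. for the saved_model/**/* rule).
LFS_PATTERNS = [
    "*.safetensors", "*.bin", "*.pt", "*.pth", "*.onnx",
    "*.zip", "*.tar", "*.tar.*", "*tfevents*",
]

def is_lfs_tracked(path: str) -> bool:
    """Return True if the file name matches any listed LFS pattern."""
    name = path.rsplit("/", 1)[-1]
    return any(fnmatch(name, pat) for pat in LFS_PATTERNS)
```

The per-file `*.gguf` entries at the end of the file are matched by exact name rather than by a glob, so they are not covered by this sketch.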


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:86f2afe502da44544d813c7e7e4f723cdef714bedea483f2c26e59eff3e14033
size 327145984


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:57545e4901b19bd7812e19f8a6f02e7813afc699e96abb4a12135fde779d588d
size 304388608


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e7f10cccddafeeeb46cfc6c3124a2f53cde4f05214924c8cde4a8f142bb49632
size 434133504


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5db75f897161cf0ee6e091fb6a1578599d09103cc72aa1e2cbbedbc67d082161
size 403790336


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:00af71c5b7f1b47ba89a6adf32ca07fc9cea27f3b95690bf2d236121312c9f37
size 396204544


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a09af8ef7d480631fe4cc9f99bde012cb714c41e1b1008fe5a82b5b86c727e9f
size 365074944


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5d251d89f08146f80b04c80b82e67736eef929d43de56c48c21623b6eb4ce4d5
size 566794752


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:dd5746417dec4e149914a34b41bea677918fe8a64de665a41b33eb38c5534874
size 558160384


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2703be431001410f6cb9b2044b0a8f993c811e6aba67804201a3b016b3b9df08
size 537811456


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7df842f83dfa7a8a1d420eff16bd23737e29022b68e384ba65c9d4cd9a0028f6
size 490985984


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8e9309c3b79c13db861051148f3f6b7ce4e2fbc2af4478e21864070ff62b735e
size 695753216


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:bb599c707b13ea2e43ea561031f9977c3eed2df97e951e01358ad10781341894
size 663378432


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:67f0acf44a54616dcb7341b40307701779d7d257b81bf79ee57a771becf60d87
size 483400192


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9be7802bc15b7bf12bf5348755574cc1535aef4aeb6f64d4cb032b05ab18dbd2
size 460806656


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a87622ff684af10b466d5d0e51c5c61f28687c49c6855d1f7e473303e8773475
size 635476480


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:be934bb38e057fd8ff909a038dd5d0dff4f709b2ad3899bbe71704af98d83aa3
size 600349184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5d5ff0d1234137998c89086b8d331fb1b5e49d20b294ed9565ae4247e9b6b740
size 558160384


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e0ec521c26f68432d9127a5a276a78702f40d1882fdad726b1719a2829539122
size 697850368


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:dc25034056d3161ca251b5ba574c3261bf6ea84babf5d571b84296e89cf3fc2e
size 760502784


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:eb4f241329419e8c0e4e6a6b3003bb91c1968ca69d39f70db1f15865713d28ff
size 730896896


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f29d89255ca9a266a61f0b03d85ff30dad52a15c9749a42cb0e84dbd830b3e48
size 700471808


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3beb46dd6f1e437e09c2177fdc55fb3ad66954ac310dd6be5c6519e577a0e662
size 843356672


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1d8b186b6e5a2ad870a60edae83f93bce1e01edc7f48e72c882b09efe36b8812
size 825252352


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:88fa0b2f4ab07fbaccba4f3f9f33e2999d661ed35b78fbf71263cb872ac60e1e
size 962845184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c8e26249385d89d102dd6efa50f57f115e06ad773c5809cc5ddb17c46de715b0
size 1161536

README.md (new file, 110 lines)

@@ -0,0 +1,110 @@
---
base_model: DavidAU/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning
datasets:
- TeichAI/claude-4.5-opus-high-reasoning-250x
language:
- en
library_name: transformers
license: apache-2.0
mradermacher:
  readme_rev: 1
quantized_by: mradermacher
tags:
- unsloth
- finetune
- All use cases
- bfloat16
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
<!-- ### quants: Q2_K IQ3_M Q4_K_S IQ3_XXS Q3_K_M small-IQ4_NL Q4_K_M IQ2_M Q6_K IQ4_XS Q2_K_S IQ1_M Q3_K_S IQ2_XXS Q3_K_L IQ2_XS Q5_K_S IQ2_S IQ1_S Q5_K_M Q4_0 IQ3_XS Q4_1 IQ3_S -->
<!-- ### quants_skip: -->
<!-- ### skip_mmproj: -->
weighted/imatrix quants of https://huggingface.co/DavidAU/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning
<!-- provided-files -->
***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF).***
Static quants are available at https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.imatrix.gguf) | imatrix | 0.1 | imatrix file (for creating your own quants) |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ1_S.gguf) | i1-IQ1_S | 0.4 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ1_M.gguf) | i1-IQ1_M | 0.4 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.5 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.5 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ2_S.gguf) | i1-IQ2_S | 0.5 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ2_M.gguf) | i1-IQ2_M | 0.5 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.6 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q2_K.gguf) | i1-Q2_K | 0.6 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ3_S.gguf) | i1-IQ3_S | 0.7 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q3_K_S.gguf) | i1-Q3_K_S | 0.7 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ3_M.gguf) | i1-IQ3_M | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q3_K_M.gguf) | i1-Q3_K_M | 0.7 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q3_K_L.gguf) | i1-Q3_K_L | 0.7 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ4_XS.gguf) | i1-IQ4_XS | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-IQ4_NL.gguf) | i1-IQ4_NL | 0.8 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q4_0.gguf) | i1-Q4_0 | 0.8 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q4_K_S.gguf) | i1-Q4_K_S | 0.8 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q4_K_M.gguf) | i1-Q4_K_M | 0.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q4_1.gguf) | i1-Q4_1 | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q5_K_S.gguf) | i1-Q5_K_S | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q5_K_M.gguf) | i1-Q5_K_M | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning-i1-GGUF/resolve/main/LFM2.5-1.2B-Thinking-Claude-4.5-Opus-High-Reasoning.i1-Q6_K.gguf) | i1-Q6_K | 1.1 | practically like static Q6_K |
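Choosing among the quants above is mainly a size/quality trade-off. As an illustrative helper (sizes copied from the table and treated as approximate; this is not an official selection rule), one can pick the largest quant that fits a given memory budget:

```python
# Approximate file sizes in GB, taken from the table above (subset).
QUANT_SIZES_GB = {
    "i1-IQ1_S": 0.4, "i1-IQ2_M": 0.5, "i1-Q2_K": 0.6,
    "i1-IQ3_M": 0.7, "i1-IQ4_XS": 0.8, "i1-Q4_K_M": 0.8,
    "i1-Q5_K_M": 0.9, "i1-Q6_K": 1.1,
}

def largest_fitting_quant(budget_gb):
    """Return the largest listed quant whose file fits in budget_gb, or None."""
    fitting = {q: s for q, s in QUANT_SIZES_GB.items() if s <= budget_gb}
    return max(fitting, key=fitting.get) if fitting else None
```

Remember to leave headroom beyond the file size itself for the KV cache and runtime overhead.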
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->