Initialize project; model provided by the ModelHub XC community
Model: mradermacher/MemReader-4B-thinking-i1-GGUF Source: Original Platform
.gitattributes (vendored, new file, 60 lines)
@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
MemReader-4B-thinking.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
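Each of the rules above routes matching files through the Git LFS filter rather than storing their contents directly in git objects. As a minimal sketch (assuming `git` is installed; the directory and file names are illustrative, not part of this repository), you can confirm which filter applies to a given path with `git check-attr`:

```shell
# Minimal sketch: confirm that a .gitattributes LFS rule matches a file.
# "lfs-demo" and "model.gguf" are illustrative stand-ins.
mkdir -p lfs-demo
git init -q lfs-demo
printf '*.gguf filter=lfs diff=lfs merge=lfs -text\n' > lfs-demo/.gitattributes
# Ask git which content filter applies to a hypothetical model file:
git -C lfs-demo check-attr filter -- model.gguf
# prints: model.gguf: filter: lfs
```

Note that `git check-attr` only consults the attribute rules; it works even when the `git-lfs` binary itself is not installed.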
MemReader-4B-thinking.i1-IQ1_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cf6a06b4f5c67f448ca949cbd215d5831b0db9f0c7649f250bc54f96718cddb0
size 1254644864

MemReader-4B-thinking.i1-IQ1_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d7cce0d3607b828c34e6dff23c6ca14546352518f17949fa50297eeca1fd2165
size 1182882944

MemReader-4B-thinking.i1-IQ2_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2750d82b193dd2d19024770d8a29ca263eb81f5eff76073ea29d522c6fbbc11c
size 1680114304

MemReader-4B-thinking.i1-IQ2_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3c958c8532fef7775f5e93ee12717a4e5fcd8e336db30c7dc92042e7fe29f898
size 1584431744

MemReader-4B-thinking.i1-IQ2_XS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:65aa748620b2b51c3189b33723673b1e50bbba852bda7db0245c6a5ee85529ed
size 1481727104

MemReader-4B-thinking.i1-IQ2_XXS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:14a7bbef3743638d5be74bffd6c6a225aef8eae9bb4cbe3825dd57bc765c8fa0
size 1374248064

MemReader-4B-thinking.i1-IQ3_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:517df8bd7c0dffae54a94b06fafbbb1df8981b25a65b488402ddc6a903158e3f
size 2130026624

MemReader-4B-thinking.i1-IQ3_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:188db893d6a3a2da1aaa95c763a57fac5ee0b856831adc79bb6b6fee748a948a
size 2066661504

MemReader-4B-thinking.i1-IQ3_XS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a890b95ecf9393bea74598fe8cb7335107b2ab4651400339f3600c3275b07f97
size 1981505664

MemReader-4B-thinking.i1-IQ3_XXS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f5414568bce4e4ee8603513df16936078cae9b8bd5958ad6137d7a953e25b882
size 1837318784

MemReader-4B-thinking.i1-IQ4_NL.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4e71d78abdc26ab0da6961c28eedb95dc7f4b05c55047f637af12cd7fbfff144
size 2600132224

MemReader-4B-thinking.i1-IQ4_XS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f8d12e8883cf6dbd339c1b69ccfc6c50b43b377f896295c5d60af7ae02c3b038
size 2477385344

MemReader-4B-thinking.i1-Q2_K.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:55d1f77b77cc5dadacc420916b88cb631b7e75ae9fcf860e4bf9172db1112f03
size 1797126784

MemReader-4B-thinking.i1-Q2_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7c87667011ce0b5cb85bfc5f5b0db2a09f160a433bb75e6069a75cdf5a718187
size 1691081344

MemReader-4B-thinking.i1-Q3_K_L.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a83714176d60a712458ad77ebe1e2dab392c663ccfeb9e81a142e8acc8d9021c
size 2406916224

MemReader-4B-thinking.i1-Q3_K_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5a3f70f719ac1f22e7e4cdff8b0e713eec349391a0e7a6fdf04365d930999d97
size 2242748544

MemReader-4B-thinking.i1-Q3_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ec955c64e9c4c24a65cd5cc0fad90b02e0647ee304b3aec7dfb86da79f82272e
size 2054127744

MemReader-4B-thinking.i1-Q4_0.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d189fdf91bd045d46b2c778f5fb80e6fb1ea8e8c8b778cd9fca60da3f73b22bf
size 2594561664

MemReader-4B-thinking.i1-Q4_1.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e607589840cb8b46217724f6061c93188def72046e4cd8690285cee49f9d62d6
size 2839727744

MemReader-4B-thinking.i1-Q4_K_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:93c6d7377cffaa17fb0a07eef09a4caeae8ee6e663066f56bdd0ac426985c5b3
size 2716069504

MemReader-4B-thinking.i1-Q4_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ba0eb4369e4e32d3ee1c02eeea4cc73de1a3bfa3204715cf519a9e9679d64121
size 2602098304

MemReader-4B-thinking.i1-Q5_K_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:17cbd3b413c356953f9fa34fcde35df9fb8f57d033726ce63232af0805e32168
size 3156921984

MemReader-4B-thinking.i1-Q5_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5775502bedbbb85bf9dd979d7943d771504c91304601029210044504ddc42b90
size 3091119744

MemReader-4B-thinking.i1-Q6_K.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:06bf33fe3b9ea2080955294b023843a2144c3ccce82d4a8f9f7463afc7cc2f9e
size 3625327744

MemReader-4B-thinking.imatrix.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6792c36899de2bb56c276d734284950b69532c04f4ea6ffc3527eccd2dfebd85
size 3872640
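Each pointer file above records the SHA-256 digest (`oid`) and byte size of the real payload, so a downloaded file can be checked against its pointer. A minimal sketch (the file name and digest below are illustrative stand-ins, not one of the model files in this commit):

```shell
# Minimal sketch: verify a file against the sha256 oid from an LFS pointer.
# "example.bin" and the expected digest are stand-ins; the digest used here
# is the well-known sha256 of the five bytes "hello".
printf 'hello' > example.bin
expected=2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
actual=$(sha256sum example.bin | awk '{print $1}')
test "$actual" = "$expected" && echo "oid matches"
```

For a real download, substitute the model file and the `oid sha256:` value from its pointer.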
README.md (new file, 93 lines)
@@ -0,0 +1,93 @@
---
base_model: IAAR-Shanghai/MemReader-4B-thinking
language:
- en
- zh
library_name: transformers
license: apache-2.0
mradermacher:
  readme_rev: 1
quantized_by: mradermacher
tags:
- qwen3
- memory
- memory-extraction
- tool-calling
- reasoning
- agent
---

## About

<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
<!-- ### quants: Q2_K IQ3_M Q4_K_S IQ3_XXS Q3_K_M small-IQ4_NL Q4_K_M IQ2_M Q6_K IQ4_XS Q2_K_S IQ1_M Q3_K_S IQ2_XXS Q3_K_L IQ2_XS Q5_K_S IQ2_S IQ1_S Q5_K_M Q4_0 IQ3_XS Q4_1 IQ3_S -->
<!-- ### quants_skip: -->
<!-- ### skip_mmproj: -->
Weighted/imatrix quants of https://huggingface.co/IAAR-Shanghai/MemReader-4B-thinking

<!-- provided-files -->

***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#MemReader-4B-thinking-i1-GGUF).***

Static quants are available at https://huggingface.co/mradermacher/MemReader-4B-thinking-GGUF

## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files.
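For multi-part quants, the parts are simply byte-concatenated in order. A minimal sketch (the part files below are tiny placeholders created on the spot, standing in for real multi-part GGUF downloads; the names are illustrative):

```shell
# Minimal sketch: merge a split download into one usable file.
# The "parts" here are placeholder bytes, standing in for real GGUF
# part files such as model.gguf.part1of2 / model.gguf.part2of2.
printf 'AAAA' > model.gguf.part1of2
printf 'BBBB' > model.gguf.part2of2
# Concatenate the parts, in order, into a single file:
cat model.gguf.part1of2 model.gguf.part2of2 > model.gguf
# The merged file is what a GGUF loader (e.g. llama.cpp) would open.
```

None of the quants in this repository are split, so this step is only needed for larger models whose files exceed the upload size limit.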
## Provided Quants

(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.imatrix.gguf) | imatrix | 0.1 | imatrix file (for creating your own quants) |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-IQ1_S.gguf) | i1-IQ1_S | 1.3 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-IQ1_M.gguf) | i1-IQ1_M | 1.4 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 1.5 | |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-IQ2_XS.gguf) | i1-IQ2_XS | 1.6 | |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-IQ2_S.gguf) | i1-IQ2_S | 1.7 | |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-IQ2_M.gguf) | i1-IQ2_M | 1.8 | |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-Q2_K_S.gguf) | i1-Q2_K_S | 1.8 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-Q2_K.gguf) | i1-Q2_K | 1.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 1.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-IQ3_XS.gguf) | i1-IQ3_XS | 2.1 | |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-Q3_K_S.gguf) | i1-Q3_K_S | 2.2 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-IQ3_S.gguf) | i1-IQ3_S | 2.2 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-IQ3_M.gguf) | i1-IQ3_M | 2.2 | |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-Q3_K_M.gguf) | i1-Q3_K_M | 2.3 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-Q3_K_L.gguf) | i1-Q3_K_L | 2.5 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-IQ4_XS.gguf) | i1-IQ4_XS | 2.6 | |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-Q4_0.gguf) | i1-Q4_0 | 2.7 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-IQ4_NL.gguf) | i1-IQ4_NL | 2.7 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-Q4_K_S.gguf) | i1-Q4_K_S | 2.7 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-Q4_K_M.gguf) | i1-Q4_K_M | 2.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-Q4_1.gguf) | i1-Q4_1 | 2.9 | |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-Q5_K_S.gguf) | i1-Q5_K_S | 3.2 | |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-Q5_K_M.gguf) | i1-Q5_K_M | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/MemReader-4B-thinking-i1-GGUF/resolve/main/MemReader-4B-thinking.i1-Q6_K.gguf) | i1-Q6_K | 3.7 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better):

.png)

And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9

## FAQ / Model Request

See https://huggingface.co/mradermacher/model_requests for answers to questions you might have and/or if you want some other model quantized.

## Thanks

I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and for providing upgrades to my workstation, which enable this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.

<!-- end -->