Initialize repository; model provided by the ModelHub XC community.
Model: mradermacher/G-Zombie-3.2-1B-i1-GGUF (source: original platform)
.gitattributes (vendored, new file)
@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
G-Zombie-3.2-1B.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
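The `filter=lfs` attribute lines above route matching files through Git LFS instead of storing them in the Git object database. As a rough illustration (not part of the repo), the matching behavior for this file's flat patterns can be approximated with `fnmatch`; Git's real attribute matching follows richer gitignore-style semantics.

```python
# Simplified sketch of .gitattributes LFS routing. fnmatch is a stand-in
# for Git's own pattern matching, which is close enough for the flat
# patterns used in this file.
from fnmatch import fnmatch

# A few of the patterns declared above
LFS_PATTERNS = [
    "*.safetensors",
    "*.bin",
    "*tfevents*",
    "G-Zombie-3.2-1B.i1-Q4_K_M.gguf",
]

def tracked_by_lfs(name: str) -> bool:
    """True if the file name matches any LFS-tracked pattern."""
    return any(fnmatch(name, pat) for pat in LFS_PATTERNS)

print(tracked_by_lfs("G-Zombie-3.2-1B.i1-Q4_K_M.gguf"))  # True
print(tracked_by_lfs("README.md"))                        # False
```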
G-Zombie-3.2-1B.i1-IQ1_M.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:dd4781b1cada032a515e402c6910a45d37a68a4164c7a7a0004b5ded76ec63f7
size 413606752
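Each of these 3-line files is a Git LFS pointer: a tiny text stand-in that records the version, SHA-256 object id, and byte size of the real blob held in LFS storage. A minimal sketch of parsing one (the helper is illustrative, not part of the repo):

```python
# Sketch: parse a Git LFS pointer file into its version/oid/size fields.
# Each line is "key value", so a single split per line recovers the fields.
def parse_lfs_pointer(text: str) -> dict:
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    fields["size"] = int(fields["size"])  # size is the blob's byte count
    return fields

# The IQ1_M pointer shown above
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:dd4781b1cada032a515e402c6910a45d37a68a4164c7a7a0004b5ded76ec63f7
size 413606752
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 413606752 (~0.41 GB, matching the IQ1_M quant)
```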
G-Zombie-3.2-1B.i1-IQ1_S.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:075a8b71d024f47b0b1149176aa662142e1fa54ef33472a45d8d96397fb1fe44
size 393552736

G-Zombie-3.2-1B.i1-IQ2_M.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b22bfcbf37b903cec81a3fea3d574d926f2c1c8701944c8b2683971064588d9e
size 515449696

G-Zombie-3.2-1B.i1-IQ2_S.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:692ef55c0e25f61ea7d173a38eed41fd0978fee3f7ed76d67267aba1afa88053
size 488711008

G-Zombie-3.2-1B.i1-IQ2_XS.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:30fec7fdabade5cff066c6c28dce1f039cfe67151deca9818b6c20f619f94f80
size 475865952

G-Zombie-3.2-1B.i1-IQ2_XXS.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2113b93f1dadd89f4221eb9511139d0bfbc8c9fe5d5a57d1add3d7343c109936
size 447030112

G-Zombie-3.2-1B.i1-IQ3_M.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:983592931d202cd9eadc4f32785207bf862da5db6ccf287f005aa518d62ccc71
size 657290080

G-Zombie-3.2-1B.i1-IQ3_S.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2d7f7e70d34b9646471b456fab8a82095072774213dbbbd87d40d1f43ed34ba6
size 643920736

G-Zombie-3.2-1B.i1-IQ3_XS.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6b1af278994de6e1ae284fe2d85814433b4720e54c0c64dc5f64e3d4145cfc26
size 621114208

G-Zombie-3.2-1B.i1-IQ3_XXS.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:41f390a0097cf7cc8ce4c78095e2dfd0776c47928a394465d3e17fd0dc3e4556
size 562111328

G-Zombie-3.2-1B.i1-IQ4_NL.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8f0c3fd703d8ee711b39951b7d39a17ee2c8e058b5dfa6dca38dd3001028785b
size 773026656

G-Zombie-3.2-1B.i1-IQ4_XS.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:304334234aa0504d8ec7af6a7e18ac25406293456ed00a174e1bcb651e71881e
size 743142240

G-Zombie-3.2-1B.i1-Q2_K.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:62076797f7a9ac3f034856cdceac6bfb6ccf874c21d85322e2ec1724abdc7252
size 580875104

G-Zombie-3.2-1B.i1-Q2_K_S.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8753892f81a3c635310351bc6115f597e600e4e26c94af792170c5bfb5d5f8b1
size 554660704

G-Zombie-3.2-1B.i1-Q3_K_L.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:38426e6180408b0d1eaf90597b2379942ee00aaa0a80db1c4ff13765ea3c9b6a
size 732525408

G-Zombie-3.2-1B.i1-Q3_K_M.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d45717d247cffcb49181fab2e366e61cdbc5a0ce9fdb0a3254f9b84dd48e9c5e
size 690844512

G-Zombie-3.2-1B.i1-Q3_K_S.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3a80387d4685592534905bba2fbe1becf6ad761a926a99b7e382687384b22864
size 641692512

G-Zombie-3.2-1B.i1-Q4_0.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1dfb6f60cb05dbbc27286c2de5b4213ed0d6230b6652a82557fc533d0a02cf22
size 773026656

G-Zombie-3.2-1B.i1-Q4_1.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:64df14db1dc71e678248b33e0d3322004a52c87f0091a7aaa1d2dce9a31a3c85
size 831746912

G-Zombie-3.2-1B.i1-Q4_K_M.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f05681404ab10024927d7db9058a63d261f855afa84f3c657c8a8da6963835a5
size 807695200

G-Zombie-3.2-1B.i1-Q4_K_S.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:709ead474510a67e8cbfbcd1f3b461d6f70e2d961be9a4932481268fceb0a30f
size 775648096

G-Zombie-3.2-1B.i1-Q5_K_M.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:fe4f48ce79d25e87579125acf19585de85d77b6e040de6efe59911224a8fd57d
size 911504224

G-Zombie-3.2-1B.i1-Q5_K_S.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:17a6a0dc53b77c7ccd1225272f6ea62fcfad91c63716a66806cef7b386f20aa9
size 892564320

G-Zombie-3.2-1B.i1-Q6_K.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1d489a98511b3f90ad445af5d423e058f3e82c3d009e911262636c8eac0f971f
size 1021801312

G-Zombie-3.2-1B.imatrix.gguf (new file)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:fad85dfe4e4c2bb388ca1890598980c68e0fc094980b34f15e8addcd6ac9caab
size 1328000
README.md (new file)
@@ -0,0 +1,87 @@
---
base_model: UmbrellaInc/G-Zombie-3.2-1B
language:
- en
library_name: transformers
mradermacher:
  readme_rev: 1
quantized_by: mradermacher
tags:
- mergekit
- merge
---

## About

<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
<!-- ### quants: Q2_K IQ3_M Q4_K_S IQ3_XXS Q3_K_M small-IQ4_NL Q4_K_M IQ2_M Q6_K IQ4_XS Q2_K_S IQ1_M Q3_K_S IQ2_XXS Q3_K_L IQ2_XS Q5_K_S IQ2_S IQ1_S Q5_K_M Q4_0 IQ3_XS Q4_1 IQ3_S -->
<!-- ### quants_skip: -->
<!-- ### skip_mmproj: -->
Weighted/imatrix quants of https://huggingface.co/UmbrellaInc/G-Zombie-3.2-1B

<!-- provided-files -->

***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#G-Zombie-3.2-1B-i1-GGUF).***

Static quants are available at https://huggingface.co/mradermacher/G-Zombie-3.2-1B-GGUF
## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
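On concatenation: split quants are raw byte slices, so joining the parts in order with `cat` restores the original file. A minimal self-contained demo with stand-in files (the `.partXofY` names are illustrative; none of this repo's quants are actually split):

```shell
# Create two stand-in "parts" (in a real download these would be the
# split pieces of one large GGUF file).
printf 'first-half-' > model.gguf.part1of2
printf 'second-half' > model.gguf.part2of2

# Concatenate in order to reassemble the file.
cat model.gguf.part1of2 model.gguf.part2of2 > model.gguf

cat model.gguf   # prints: first-half-second-half
```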
## Provided Quants

(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.imatrix.gguf) | imatrix | 0.1 | imatrix file (for creating your own quants) |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-IQ1_S.gguf) | i1-IQ1_S | 0.5 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-IQ1_M.gguf) | i1-IQ1_M | 0.5 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.5 | |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-IQ2_S.gguf) | i1-IQ2_S | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-IQ2_M.gguf) | i1-IQ2_M | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.7 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.7 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-Q2_K.gguf) | i1-Q2_K | 0.7 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 0.7 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-IQ3_S.gguf) | i1-IQ3_S | 0.7 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-IQ3_M.gguf) | i1-IQ3_M | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 0.8 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 0.8 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-IQ4_NL.gguf) | i1-IQ4_NL | 0.9 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-Q4_0.gguf) | i1-Q4_0 | 0.9 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 0.9 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 0.9 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-Q4_1.gguf) | i1-Q4_1 | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/G-Zombie-3.2-1B-i1-GGUF/resolve/main/G-Zombie-3.2-1B.i1-Q6_K.gguf) | i1-Q6_K | 1.1 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request

See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.

## Thanks

I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.

<!-- end -->