Initialize project; model provided by the ModelHub XC community
Model: mradermacher/Alexia.v2-1B-i1-GGUF Source: Original Platform
.gitattributes (vendored, new file, 60 lines)
@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Alexia.v2-1B.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
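Each line above pairs a glob pattern with Git attributes (here, routing matching files through Git LFS and marking them as binary). As a rough illustration, not part of this repository, the line format can be parsed in a few lines of Python:

```python
# Hypothetical sketch: parse .gitattributes-style lines into
# (pattern, {attribute: value}) pairs. Not part of this repository.
def parse_gitattributes(text):
    entries = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        pattern, *attrs = line.split()
        parsed = {}
        for attr in attrs:
            if attr.startswith("-"):      # e.g. "-text" unsets the attribute
                parsed[attr[1:]] = False
            elif "=" in attr:             # e.g. "filter=lfs" sets a value
                key, value = attr.split("=", 1)
                parsed[key] = value
            else:                         # bare attribute is simply set
                parsed[attr] = True
        entries.append((pattern, parsed))
    return entries

print(parse_gitattributes("*.gguf filter=lfs diff=lfs merge=lfs -text"))
# → [('*.gguf', {'filter': 'lfs', 'diff': 'lfs', 'merge': 'lfs', 'text': False})]
```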
Alexia.v2-1B.i1-IQ1_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3730c600a8b444fb3a04312e8cd89db99eeaaf48a3a4a4b56a9c35a91ff1f6e3
size 643485440

Alexia.v2-1B.i1-IQ1_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5d2de2d96e35e5ccaa129791d7e38cb4e4e44ca1fa09080bf7fd2c8a728ae4e3
size 639193088

Alexia.v2-1B.i1-IQ2_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4320ae6b213166b8decf3c8462efd672f961a765e20cf40124f3d270fbbcc9db
size 669783296

Alexia.v2-1B.i1-IQ2_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ff88cf611e7d240335ff8c31e9dc94b043415db9aab53351490b7615f2ee15d3
size 664060160

Alexia.v2-1B.i1-IQ2_XS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2547f09de934c939747647d63368c95ec278722cccc4c82e5ed22f2dd14319de
size 657320960

Alexia.v2-1B.i1-IQ2_XXS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c8d8abcf6f639654af0a917532ba08ed6f1c74513e963db1b14b8d14956d1f38
size 650639360

Alexia.v2-1B.i1-IQ3_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:dd7da183c7597972e9d5aeec0ff48b1783aa3f21236af83e94286ede72b93f99
size 697060352

Alexia.v2-1B.i1-IQ3_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d215c5823355ff347d7763c10bba5d962fb90f3e0212e40ad77b3004371e436a
size 689814272

Alexia.v2-1B.i1-IQ3_XS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:70a20106eeb9a4a75c93d1ad3d69d8f2a4927056540eb7a66bf5487af834fbf2
size 689814272

Alexia.v2-1B.i1-IQ3_XXS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:32d1811097d803afdc3fb8c336e99f20a052c8c038565b2f33af707af062f1fd
size 680109824

Alexia.v2-1B.i1-IQ4_NL.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:fea39c5066c6d2125848e21591360fe268699cba44c79665fdd519f1ebe5e477
size 721862912

Alexia.v2-1B.i1-IQ4_XS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:eda603c8fdbc2001aa57a6064b2ca79ca2c5eca990a9eadb8f7b241095fd4306
size 714434816

Alexia.v2-1B.i1-Q2_K.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:fad50c2ff1d353c11acfa73a2fa7417d7535762f7ba292d265f9fbc0b599fd21
size 689814272

Alexia.v2-1B.i1-Q2_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3774dde94236d249e0977614793961a9ea69eff7e56d09924ce9caa3410c9139
size 671271680

Alexia.v2-1B.i1-Q3_K_L.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0a6dc3f5c85d4f1b6ea0f338b41ab280965676220b0d207b1d42b95d2349c7fa
size 751575296

Alexia.v2-1B.i1-Q3_K_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a702b13770fdc272968d9da68788057540547b780fa659e4b2e521118147ca89
size 722415872

Alexia.v2-1B.i1-Q3_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e268adc404b610e929645f8a3ee47b7a6542bea9ea14b55c1df7db9ae612067f
size 688855808

Alexia.v2-1B.i1-Q4_0.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4c08ff476126c1225ae71f1d5f2b3d8f593b37ef22be6fd0534a0dd317063603
size 721918208

Alexia.v2-1B.i1-Q4_1.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7f313d70579c4e1909d381e2ea5ff8dfd07639087793bb7015c51204d1956093
size 764035328

Alexia.v2-1B.i1-Q4_K_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ded372cd3fae2a30a98074afc89bae2848833736251abc3267f489308aafd545
size 806057984

Alexia.v2-1B.i1-Q4_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:89967daa0194a75c32caf3c612083747ca33793879d17a47cda06bad65fa94a8
size 780992768

Alexia.v2-1B.i1-Q5_K_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b993862185f3062fd89bddd2340050f07b000e4e402292832e7a8df0b3429a7c
size 851345408

Alexia.v2-1B.i1-Q5_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0f438f9f9a73695347287415364a0248d7d81710acc7af2520b2488a51581628
size 836399360

Alexia.v2-1B.i1-Q6_K.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c237c42b85f19e6044c6d3847569619457c148d03e9cb2b204b3bcc84db21265
size 1011738368

Alexia.v2-1B.imatrix.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6d527cfa6644a35f29884e52f9d825e15b4e2d0e62870076d3b78ed050c604a7
size 1452416
README.md (new file, 94 lines)
@@ -0,0 +1,94 @@
---
base_model: UmbrellaInc/Alexia.v2-1B
datasets:
- TeichAI/glm-4.7-2000x
language:
- es
- en
library_name: transformers
license: gemma
mradermacher:
  readme_rev: 1
quantized_by: mradermacher
tags:
- uncensored
- nsfw
- mutation
- merge
- not-for-all-audiences
---

## About

<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
<!-- ### quants: Q2_K IQ3_M Q4_K_S IQ3_XXS Q3_K_M small-IQ4_NL Q4_K_M IQ2_M Q6_K IQ4_XS Q2_K_S IQ1_M Q3_K_S IQ2_XXS Q3_K_L IQ2_XS Q5_K_S IQ2_S IQ1_S Q5_K_M Q4_0 IQ3_XS Q4_1 IQ3_S -->
<!-- ### quants_skip: -->
<!-- ### skip_mmproj: -->
weighted/imatrix quants of https://huggingface.co/UmbrellaInc/Alexia.v2-1B

<!-- provided-files -->

***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#Alexia.v2-1B-i1-GGUF).***

static quants are available at https://huggingface.co/mradermacher/Alexia.v2-1B-GGUF

## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.

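Multi-part GGUF releases are plain byte-wise splits, so joining them is a single concatenation. A sketch with hypothetical part names (the files in this repository are single-part and need no such step):

```shell
# Hypothetical part names, for illustration only; this repository's
# .gguf files are single files. Simulate two downloaded parts, then
# join them byte-wise into one usable GGUF file:
printf 'half1-' > model.gguf.part1of2
printf 'half2'  > model.gguf.part2of2
cat model.gguf.part1of2 model.gguf.part2of2 > model.gguf
cat model.gguf   # → half1-half2
```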
## Provided Quants

(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.imatrix.gguf) | imatrix | 0.1 | imatrix file (for creating your own quants) |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-IQ1_S.gguf) | i1-IQ1_S | 0.7 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-IQ1_M.gguf) | i1-IQ1_M | 0.7 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-IQ2_S.gguf) | i1-IQ2_S | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-IQ2_M.gguf) | i1-IQ2_M | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.8 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.8 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 0.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-IQ3_S.gguf) | i1-IQ3_S | 0.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-Q2_K.gguf) | i1-Q2_K | 0.8 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-IQ3_M.gguf) | i1-IQ3_M | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-IQ4_NL.gguf) | i1-IQ4_NL | 0.8 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-Q4_0.gguf) | i1-Q4_0 | 0.8 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 0.8 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 0.9 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-Q4_1.gguf) | i1-Q4_1 | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 0.9 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 0.9 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/Alexia.v2-1B-i1-GGUF/resolve/main/Alexia.v2-1B.i1-Q6_K.gguf) | i1-Q6_K | 1.1 | practically like static Q6_K |
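Since every quant's exact byte size is recorded in this commit's LFS pointers, choosing one can be scripted. A rough sketch, assuming you simply want the largest quant whose file fits a memory budget (actual runtime memory use exceeds file size because of the KV cache and activations, so this is an assumption, not a measurement):

```python
# Sketch: pick the largest quant whose file fits a byte budget.
# Sizes are the exact "size" fields from the LFS pointers in this
# commit (a representative subset, not the full list).
QUANT_SIZES = {
    "i1-IQ1_S":  639_193_088,
    "i1-Q4_K_M": 806_057_984,
    "i1-Q5_K_M": 851_345_408,
    "i1-Q6_K":   1_011_738_368,
}

def pick_quant(budget_bytes, sizes=QUANT_SIZES):
    fitting = {q: s for q, s in sizes.items() if s <= budget_bytes}
    if not fitting:
        return None  # nothing fits the budget
    return max(fitting, key=fitting.get)

print(pick_quant(900 * 1024**2))  # → i1-Q5_K_M
```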

Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9

## FAQ / Model Request

See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.

## Thanks

I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.

<!-- end -->