Initialize project; model provided by the ModelHub XC community

Model: mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF
Source: Original Platform
This commit is contained in:
ModelHub XC
2026-05-02 14:17:39 +08:00
commit daf5652b72
27 changed files with 213 additions and 0 deletions

.gitattributes vendored Normal file

@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
imatrix.dat filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
Kosmos-Aurora_faustus-8B.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
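The `filter`/`diff`/`merge` attributes above route matching paths through Git LFS. As a quick sanity check that a pattern resolves as intended (a sketch; plain `git` is enough, since `git-lfs` itself is not needed for attribute resolution):

```shell
# Create a throwaway repo containing one rule from the .gitattributes above,
# then ask git which filter applies to a matching path.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
printf '%s\n' '*.safetensors filter=lfs diff=lfs merge=lfs -text' > .gitattributes
git check-attr filter -- model.safetensors
# -> model.safetensors: filter: lfs
```

The path need not exist; `git check-attr` only consults the attribute rules.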


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b12350edd11a285f2fbeae67d7940e501851d863eb0c82286538227a029ccbc7
size 2161972640
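Each unnamed three-line block in this commit is a Git LFS pointer file (spec v1), not the model weights themselves. A minimal sketch of parsing one, using the pointer above, so the `oid` digest and `size` can later be checked against the downloaded content:

```python
# Parse a Git LFS pointer file into its fields.
def parse_lfs_pointer(text: str) -> dict:
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    # oid has the form "sha256:<hex digest>"
    algo, _, digest = fields["oid"].partition(":")
    return {"version": fields["version"], "algo": algo,
            "digest": digest, "size": int(fields["size"])}

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:b12350edd11a285f2fbeae67d7940e501851d863eb0c82286538227a029ccbc7
size 2161972640
"""
info = parse_lfs_pointer(pointer)
print(info["algo"], info["size"])  # sha256 2161972640
```

After downloading the real file, compare `hashlib.sha256` of its bytes against `digest` and its byte length against `size`.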


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d3c2448959f8fb31ce34e38389c4c17128812a93ed0d8325e9536bf2f7e2b5bb
size 2019628448


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:236be53a1d1b61f70f7eff93122e9ae2867946e9d5b92ebc7fa0953e00d59bde
size 2948281760


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:46be64dde4b0e678a379beb5c76bc38f329e5e6da31265a130403f9a076688fa
size 2758489504


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9c00fc1a517c6566172223a7c10d4dbf07b1d153db6ff8c40d16b01541ecbd58
size 2605782432


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:208316e016da9a7506861e6e92be79e488fd18c47b1bfea5968bb825e484669d
size 2399212960


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a6233016b26e213d5e7dfacb64a1cd51a6c325172633dce8159510c0e3fc007b
size 3784824224


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ccef71b591c2835e5e8f36768f424c366e2554b01ca81dff47202b828e1be86a
size 3682325920


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8c3d6ca55db52653a55d74cc636935faaf51e1b2cd0f3d28e9ee1e56c5117115
size 3518748064


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c0b47dd37620d25c32ab41bbd7081b8efd6ae6efe9b248b77d99ea256111743a
size 3274913184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:094bb3096be05296e3a3bf4314fcf99a6b71c1bb2317967a173ca7f3038aadf6
size 4677989792


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3d2208c245c0bed8a05d6e84e461f82ebc5d2378dfa1a559dae2c3a4fcbf94a0
size 4447663520


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:55fba04b4f9f9e53ae27790322e611dcfc539afab0ec3a468240da4ad2f32512
size 3179132320


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a01c69016b455a3a25e9cb7a675f00950169a78ede1bbd3cd2935a934433725c
size 2988815776


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2a99c2c99da5600ce68aa11c421c2f2f722ef73f5757a6db6a41a9d34a0c913a
size 4321957280


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6112031fccd992c5e39318faeff9d770712008975cb894f5c55026d568d25cc5
size 4018918816


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:27475f253c355b220760a1f1607c9aa48c63198fd09bb5ee1cd6e120f7a99a47
size 3664500128


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4095419ac0537b03b312881c119aab79cc67965a791d32c3030a8e85d4351d9e
size 4675892640


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5833b294a4a414f9cbea195352adfab94e406aba50361df4b7ea35a6cb731768
size 5130253728


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a29295561539fc635da0e4eaddf4150a19db619628abf301d3ca5bc0bf66307a
size 4920735136


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4aa7ef6c445224d80dad971529993ac26ef00f74cc83c3e9892574be52556d9a
size 4692669856


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:43571969516028d5167fe18284c49ef8f20425cdba0b8d5bc7f50db63be072ec
size 5732988320


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:fbcfcd7d529093275c9a129cbe117c7a131a55a7c21896cd6f18196feb7bb07c
size 5599294880


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c5957d282f4ac5f82f3b8706cf877ce4613630b4c8f29659127bcc69d798c177
size 6596007328

README.md Normal file

@@ -0,0 +1,78 @@
---
base_model: jaspionjader/Kosmos-Aurora_faustus-8B
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/jaspionjader/Kosmos-Aurora_faustus-8B
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
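For multi-part files specifically, the parts are simply joined byte-for-byte (`cat part1 part2 > whole` on the command line). A Python sketch of the same operation; the `.part1of2` naming in the comment is illustrative only, so check the actual filenames in the repo you downloaded from:

```python
# Join split GGUF downloads back into a single file, byte-for-byte,
# equivalent to `cat part1 part2 > whole`.
import shutil

def concatenate(parts: list[str], out_path: str) -> None:
    with open(out_path, "wb") as out:
        for part in parts:
            with open(part, "rb") as src:
                shutil.copyfileobj(src, out)

# Hypothetical part names, for illustration only:
# concatenate(["model.gguf.part1of2", "model.gguf.part2of2"], "model.gguf")
```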
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable over similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-IQ1_S.gguf) | i1-IQ1_S | 2.1 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-IQ1_M.gguf) | i1-IQ1_M | 2.3 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.5 | |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.7 | |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-IQ2_S.gguf) | i1-IQ2_S | 2.9 | |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-IQ2_M.gguf) | i1-IQ2_M | 3.0 | |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-Q2_K_S.gguf) | i1-Q2_K_S | 3.1 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-Q2_K.gguf) | i1-Q2_K | 3.3 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 3.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.6 | |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-IQ3_S.gguf) | i1-IQ3_S | 3.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-IQ3_M.gguf) | i1-IQ3_M | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 4.1 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 4.4 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.5 | |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-Q4_0.gguf) | i1-Q4_0 | 4.8 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-IQ4_NL.gguf) | i1-IQ4_NL | 4.8 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.8 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 5.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-Q4_1.gguf) | i1-Q4_1 | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.7 | |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF/resolve/main/Kosmos-Aurora_faustus-8B.i1-Q6_K.gguf) | i1-Q6_K | 6.7 | practically like static Q6_K |
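The links in the table above all follow Hugging Face's `resolve/main` download pattern. A sketch that rebuilds such a URL from a quant name, with the repo and base filename taken from this model card:

```python
# Build the direct-download URL for one of the quants listed above,
# following the https://huggingface.co/<repo>/resolve/main/<file> pattern
# used by the links in the Provided Quants table.
REPO = "mradermacher/Kosmos-Aurora_faustus-8B-i1-GGUF"
BASE = "Kosmos-Aurora_faustus-8B"

def quant_url(quant: str) -> str:
    return f"https://huggingface.co/{REPO}/resolve/main/{BASE}.{quant}.gguf"

print(quant_url("i1-Q4_K_M"))
```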
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->

imatrix.dat Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:72bb9ea32e4d528938dbed822cebb813825433ecf834f0d547a4ccb18a9e25d9
size 4988157