Initialize the project; model provided by the ModelHub XC community

Model: mradermacher/Apollo-1-2B-i1-GGUF
Source: Original Platform
ModelHub XC
2026-05-08 23:17:54 +08:00
commit f647dec025
27 changed files with 336 additions and 0 deletions

.gitattributes

@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Apollo-1-2B.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
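The patterns above route large binaries through Git LFS instead of storing them directly in Git. As a rough sketch (Python's `fnmatch` only approximates gitattributes glob semantics and does not fully emulate `**` patterns like `saved_model/**/*`), you can check which filenames a subset of these patterns would match:

```python
from fnmatch import fnmatch

# A few of the patterns from the .gitattributes above; the real file also
# lists each .gguf quant by exact name, which fnmatch matches trivially.
LFS_PATTERNS = ["*.safetensors", "*.bin", "*.zip", "Apollo-1-2B.i1-Q6_K.gguf"]

def tracked_by_lfs(filename: str) -> bool:
    """True if any of the listed patterns would send this file through LFS."""
    return any(fnmatch(filename, pat) for pat in LFS_PATTERNS)

tracked_by_lfs("model.safetensors")  # → True
tracked_by_lfs("README.md")          # → False
```

Files matching these patterns are replaced in the Git tree by small pointer files, which is exactly what the blobs below are.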


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ca32f57c0342b068b47e76aaba5cfb60eb61aeebc73fec6699140cfa2f7e8813
size 543795552
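Each of the model weights in this commit is stored as a three-line Git LFS pointer like the one above (spec version, a `sha256` object ID, and the true file size in bytes). As a minimal sketch, not part of this repository, the format can be parsed like so:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields.

    Pointers have the form:
        version https://git-lfs.github.com/spec/v1
        oid sha256:<64 hex chars>
        size <bytes>
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    # size is numeric; oid carries an algorithm prefix
    fields["size"] = int(fields["size"])
    algo, _, digest = fields["oid"].partition(":")
    fields["oid"] = {"algo": algo, "digest": digest}
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:ca32f57c0342b068b47e76aaba5cfb60eb61aeebc73fec6699140cfa2f7e8813
size 543795552"""
info = parse_lfs_pointer(pointer)
# info["size"] → 543795552, info["oid"]["algo"] → "sha256"
```

The `size` field is why the diff totals stay tiny (3 lines per file) even though the underlying blobs are hundreds of megabytes.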


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:772c168ec7e88dd6eb51746ce1ac905b3c7e86627aa3b763b33a26c66c71e839
size 515778912


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9f7eed9fe61d9d22016620d0e4159ea7e760c0a10296ad8ee66f80abb50781bb
size 695183712


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:00683b64ac5d76c646538a9f17a5eb93ae16cc96ffc1a976863e4da881870cf7
size 657828192


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cee0267c74b5cf55f66ae957b0245ccbdd8e5dc3deca26589a1f0f31ada7dfbc
size 631515488


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b97f662e26991a851f98e2f31c8443c93e1bce65900363e48af9bf409c04b4ca
size 590489952


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ae520ee40a5b3a4cee708bf9f20567507ab4d76ea4d7786b71d71591046043c2
size 895664480


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:fe528c2d9ef367e69b41de5183553e1cd9f78d92d79582ac67089175a38a25c9
size 867254624


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f8c4b236de8f8e198414738710905f8c1356aa7a75073c6f6304b9ce2bf8d2b8
size 834224480


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8739d08aff67fd7350dda8bf8081c38c772f940b48086dcd022e066e72fc1cd0
size 754362720


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8bcabacfdb93941a08ac3b40cd785f14721fd2723db1c52d1b40f7ad61464cd2
size 1054425440


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e31250974bb607f74e57a4edf7dae3919a79829e8940cfce81dc04296f796b7a
size 1010385248

Apollo-1-2B.i1-Q2_K.gguf

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:63bd909cf4e2c983d80d0ced6a26d10fb1b296747340a32ba63c352b7520ac0c
size 777797984


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cb6d58add5d0adf8ec48dc0fed8c4c8b35331359b4838741f7c17cadd384f917
size 732971360


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:eea50fa59dafddf0357d7c1dcc23880771c31091c7495be4f45c9e8f2c51293e
size 1003503968


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:49be9413752ffa22eec6cf94a0792105bc8a461fbbf828fa73c5e41489b55f1e
size 939540832


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:86122a7bea76b01312db4f1c7e600f2ae60cce142bdddf2cef99ab137f9aad00
size 867254624

Apollo-1-2B.i1-Q4_0.gguf

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5a91e34c07a8998f220de4993fee2bd9a09ca0ae24685f929083787680156de3
size 1056784736

Apollo-1-2B.i1-Q4_1.gguf

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b818079ed58ca3793c538c89f58ecebf229c29a29003fa000ede88639d2d3978
size 1142505824


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0bd7fcfa501be8649a834c68dcda5d6738c951f9e36d56c52fedb889fc14085d
size 1107411296


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d0e6af2f226303ed4b5f27208951ccb95783c974a55ea4851b16a9bd1c4ed8a2
size 1060192608


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a375ddf9e81419e7807148d9abc81f333477e4010cd6000ab734830a58da99b8
size 1257881952


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:87f57318de9081e71ed4093ef87efbcc274c7605169328e95fc997b9fe01b6d5
size 1230586208

Apollo-1-2B.i1-Q6_K.gguf

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:45da32280d9b8f30b1dc04efd83005ace76b7b7f4a1d05894a30bb3bef66ee3a
size 1417757024

Apollo-1-2B.imatrix.gguf

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4d5e7281b914bbe33855a951b49f556274da87ef7793d668ad85f29b1d5478d7
size 2094560

README.md

@@ -0,0 +1,201 @@
---
base_model: Loom-Labs/Apollo-1-2B
language:
- en
- fr
- pt
- de
- ro
- sv
- da
- bg
- ru
- cs
- el
- uk
- es
- nl
- sk
- hr
- pl
- lt
- nb
- nn
- fa
- sl
- gu
- lv
- it
- oc
- ne
- mr
- be
- sr
- lb
- vec
- as
- cy
- szl
- ast
- hne
- awa
- mai
- bho
- sd
- ga
- fo
- hi
- pa
- bn
- or
- tg
- yi
- lmo
- lij
- scn
- fur
- sc
- gl
- ca
- is
- sq
- li
- prs
- af
- mk
- si
- ur
- mag
- bs
- hy
- zh
- yue
- my
- ar
- he
- mt
- id
- ms
- tl
- ceb
- jv
- su
- min
- ban
- pag
- ilo
- war
- ta
- te
- kn
- ml
- tr
- az
- uz
- kk
- ba
- tt
- th
- lo
- fi
- et
- hu
- vi
- km
- ja
- ko
- ka
- eu
- ht
- pap
- kea
- tpi
- sw
library_name: transformers
license: other
license_link: https://huggingface.co/apexion-ai/Nous-V1-8B/blob/main/LICENSE.md
license_name: anvdl-1.0
mradermacher:
readme_rev: 1
quantized_by: mradermacher
tags:
- text-generation-inference
- transformers
- unsloth
- qwen3
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
<!-- ### quants: Q2_K IQ3_M Q4_K_S IQ3_XXS Q3_K_M small-IQ4_NL Q4_K_M IQ2_M Q6_K IQ4_XS Q2_K_S IQ1_M Q3_K_S IQ2_XXS Q3_K_L IQ2_XS Q5_K_S IQ2_S IQ1_S Q5_K_M Q4_0 IQ3_XS Q4_1 IQ3_S -->
<!-- ### quants_skip: -->
<!-- ### skip_mmproj: -->
weighted/imatrix quants of https://huggingface.co/Loom-Labs/Apollo-1-2B
<!-- provided-files -->
***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#Apollo-1-2B-i1-GGUF).***
static quants are available at https://huggingface.co/mradermacher/Apollo-1-2B-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
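Each file in this repository can also be fetched directly via Hugging Face's `resolve` URL scheme, the same URLs used in the table below. A small helper to build the download URL for a given quant type (a sketch; the actual download step with `curl`, `wget`, or `huggingface-cli` is omitted):

```python
REPO = "mradermacher/Apollo-1-2B-i1-GGUF"

def quant_url(quant: str, base: str = "Apollo-1-2B") -> str:
    """Build the direct-download URL for one quant file of this repo."""
    return f"https://huggingface.co/{REPO}/resolve/main/{base}.{quant}.gguf"

url = quant_url("i1-Q4_K_M")
# → https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q4_K_M.gguf
```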
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.imatrix.gguf) | imatrix | 0.1 | imatrix file (for creating your own quants) |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-IQ1_S.gguf) | i1-IQ1_S | 0.6 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-IQ1_M.gguf) | i1-IQ1_M | 0.6 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-IQ2_S.gguf) | i1-IQ2_S | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-IQ2_M.gguf) | i1-IQ2_M | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.8 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q2_K.gguf) | i1-Q2_K | 0.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-IQ3_S.gguf) | i1-IQ3_S | 1.0 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 1.0 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-IQ3_M.gguf) | i1-IQ3_M | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 1.0 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 1.1 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-IQ4_NL.gguf) | i1-IQ4_NL | 1.2 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q4_0.gguf) | i1-Q4_0 | 1.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 1.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 1.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q4_1.gguf) | i1-Q4_1 | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 1.3 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.4 | |
| [GGUF](https://huggingface.co/mradermacher/Apollo-1-2B-i1-GGUF/resolve/main/Apollo-1-2B.i1-Q6_K.gguf) | i1-Q6_K | 1.5 | practically like static Q6_K |
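Since the table is sorted by size, a simple way to choose a quant is to pick the largest one that fits your memory budget. A sketch using a few of the (type, Size/GB) pairs from the table above:

```python
# (type, size in GB) pairs taken from the table above, smallest first
QUANTS = [
    ("i1-IQ1_S", 0.6), ("i1-IQ2_M", 0.8), ("i1-IQ3_M", 1.0),
    ("i1-IQ4_XS", 1.1), ("i1-Q4_K_M", 1.2), ("i1-Q5_K_M", 1.4),
    ("i1-Q6_K", 1.5),
]

def best_fit(budget_gb: float):
    """Largest listed quant that fits the given size budget, else None."""
    fitting = [q for q, size in QUANTS if size <= budget_gb]
    return fitting[-1] if fitting else None

best_fit(1.3)  # → "i1-Q4_K_M"
```

Note this ranks by size only; per the table notes, an IQ quant can beat a slightly larger non-IQ quant in quality.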
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->