Initialize project; model provided by the ModelHub XC community

Model: mradermacher/SpaceOm-i1-GGUF
Source: Original Platform
This commit is contained in:
ModelHub XC
2026-04-24 22:42:08 +08:00
commit a6f8cbc759
27 changed files with 234 additions and 0 deletions

.gitattributes vendored Normal file

@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
imatrix.dat filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
SpaceOm.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
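The attribute rules above route matching files through the Git LFS `filter`/`diff`/`merge` drivers instead of storing their contents in git objects. As a rough illustrative sketch (not part of the repository), the matching logic can be mimicked with Python's `fnmatch`; the pattern list below is a small subset, and note that this repository pins each `.gguf` file by exact name rather than using a `*.gguf` wildcard:

```python
from fnmatch import fnmatch

# Illustrative subset of the LFS patterns above. The repository itself lists
# each .gguf file by exact name; "*.gguf" here stands in for those entries.
LFS_PATTERNS = [
    "*.safetensors", "*.bin", "*.gz", "*.gguf", "imatrix.dat",
]

def uses_lfs(path: str) -> bool:
    """Return True if the file's basename matches any LFS-tracked pattern."""
    name = path.rsplit("/", 1)[-1]
    return any(fnmatch(name, pat) for pat in LFS_PATTERNS)
```

Files matching a pattern are checked in as small pointer files (see the pointer diffs below), while the real bytes live in LFS storage.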

README.md Normal file

@@ -0,0 +1,99 @@
---
base_model: remyxai/SpaceOm
datasets:
- remyxai/SpaceThinker
language:
- en
library_name: transformers
license: apache-2.0
license_link: https://huggingface.co/Qwen/Qwen2.5-VL-3B-Instruct/blob/main/LICENSE
license_name: qwen-research
mradermacher:
readme_rev: 1
quantized_by: mradermacher
tags:
- remyx
- SpatialReasoning
- spatial-reasoning
- test-time-compute
- thinking
- reasoning
- multimodal
- vlm
- vision-language
- distance-estimation
- quantitative-spatial-reasoning
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/remyxai/SpaceOm
<!-- provided-files -->

***For a convenient overview and download list, visit our [model page](https://hf.tst.eu/model#SpaceOm-i1-GGUF).***

static quants are available at https://huggingface.co/mradermacher/SpaceOm-GGUF

**This is a vision model - mmproj files (if any) will be in the [static repository](https://huggingface.co/mradermacher/SpaceOm-GGUF).**
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-IQ1_S.gguf) | i1-IQ1_S | 0.9 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-IQ1_M.gguf) | i1-IQ1_M | 1.0 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-IQ2_XS.gguf) | i1-IQ2_XS | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-IQ2_S.gguf) | i1-IQ2_S | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-IQ2_M.gguf) | i1-IQ2_M | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-Q2_K_S.gguf) | i1-Q2_K_S | 1.3 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-Q2_K.gguf) | i1-Q2_K | 1.4 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 1.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-IQ3_XS.gguf) | i1-IQ3_XS | 1.5 | |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-Q3_K_S.gguf) | i1-Q3_K_S | 1.6 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-IQ3_S.gguf) | i1-IQ3_S | 1.6 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-IQ3_M.gguf) | i1-IQ3_M | 1.6 | |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-Q3_K_M.gguf) | i1-Q3_K_M | 1.7 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-Q3_K_L.gguf) | i1-Q3_K_L | 1.8 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-IQ4_XS.gguf) | i1-IQ4_XS | 1.8 | |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-IQ4_NL.gguf) | i1-IQ4_NL | 1.9 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-Q4_0.gguf) | i1-Q4_0 | 1.9 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-Q4_K_S.gguf) | i1-Q4_K_S | 1.9 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-Q4_K_M.gguf) | i1-Q4_K_M | 2.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-Q4_1.gguf) | i1-Q4_1 | 2.1 | |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-Q5_K_S.gguf) | i1-Q5_K_S | 2.3 | |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-Q5_K_M.gguf) | i1-Q5_K_M | 2.3 | |
| [GGUF](https://huggingface.co/mradermacher/SpaceOm-i1-GGUF/resolve/main/SpaceOm.i1-Q6_K.gguf) | i1-Q6_K | 2.6 | practically like static Q6_K |
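Choosing a quant from the table above can be sketched as a simple size-budget filter. The sizes below are copied from the table (a subset); note the README's own caveat that size does not strictly order quality, so this is only a starting point:

```python
# Size in GB per quant type, copied from the table above (subset).
QUANT_SIZES_GB = {
    "i1-IQ1_S": 0.9, "i1-IQ2_XXS": 1.0, "i1-Q2_K": 1.4,
    "i1-IQ3_M": 1.6, "i1-IQ4_XS": 1.8, "i1-Q4_K_S": 1.9,
    "i1-Q4_K_M": 2.0, "i1-Q5_K_M": 2.3, "i1-Q6_K": 2.6,
}

def largest_quant_under(budget_gb: float):
    """Return the largest listed quant that fits the size budget, or None."""
    fits = {q: s for q, s in QUANT_SIZES_GB.items() if s <= budget_gb}
    return max(fits, key=fits.get) if fits else None
```

For example, with a 2 GB budget this picks `i1-Q4_K_M`, which the table also marks as recommended.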
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->

SpaceOm.i1-IQ1_M.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5afc1588abdaecd576171fd7ab5046c380590b897f20e0472d24edad39abf314
size 850025856
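Each `.gguf` entry in this commit is a Git LFS pointer file like the one above: a `version` line, a `sha256` object id, and a byte `size`, while the actual weights live in LFS storage. A small sketch of parsing one (using the pointer content shown above):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer (space-separated key-value lines) into a dict."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    fields["size"] = int(fields["size"])  # byte size of the real object
    return fields

# The IQ1_M pointer from this commit:
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:5afc1588abdaecd576171fd7ab5046c380590b897f20e0472d24edad39abf314
size 850025856"""
```

The `size` field lets you check expected download sizes (here ~850 MB) before fetching the object itself.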

SpaceOm.i1-IQ1_S.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:035270fee55bd669975bf7f200c4083c317dfcc6ea42968b982bcd84f63dbb20
size 791092608

SpaceOm.i1-IQ2_M.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8483ac5345b0f528545962d129d8231ca082eb15bbd15e4fb5f91af53dbb47cf
size 1140514176

SpaceOm.i1-IQ2_S.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e83485ff88f1e5c8824d183a79c0961afa02894ca732e1fac1ad9674c1289b37
size 1061936512

SpaceOm.i1-IQ2_XS.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0b2ab3a7e3b66b05251643030b8ada78807a6c72d85481e659ffc7cef3084cab
size 1031544192

SpaceOm.i1-IQ2_XXS.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3a0b0877fd150fca7a43e6339b15d808096428570f4d6f6c4261bda416a50804
size 948247936

SpaceOm.i1-IQ3_M.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cf9dc165c6b8c3b5064c8fdda8cb31611bff22b00a9147ea1d8513ca99dcc3ae
size 1488893312

SpaceOm.i1-IQ3_S.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6163d9f72c6afef9e163a5c7816d896bf953419d19c766894e9072f25cdfc425
size 1456862592

SpaceOm.i1-IQ3_XS.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6d402719e4114d0d658bfab2f589144735e320457a95462de63ddda5298744df
size 1391834496

SpaceOm.i1-IQ3_XXS.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:30927114ed0d9dc5ad8212d64d34f6869d6867609b5bbb7f61062d23e784c632
size 1282825600

SpaceOm.i1-IQ4_NL.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:028e3f477a79fa2db2d1d9a7302b9f44b44ff6254e2d6e6c672ff71c63476fd0
size 1825207680

SpaceOm.i1-IQ4_XS.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1e4ff8eb04be02081a73acd8ad390235e791d9b1e2ea5df18bf6dc04ced99a11
size 1739093376

SpaceOm.i1-Q2_K.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8dc3d2141509de251e6a6d465b9e3bd18db19cf43bdb1b2aff4a4dbefc08fb11
size 1274754432

SpaceOm.i1-Q2_K_S.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ce6c5428691ab944581a5863c1e582edaf9f0808e9d6cb6462f91d79f7684fcd
size 1198126464

SpaceOm.i1-Q3_K_L.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:eb8efab26959f091f5c172b91b8edf1293e8aa908f9eb2f69c65b9e0093d9a3b
size 1707390336

SpaceOm.i1-Q3_K_M.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:596d95a2d61c57f70eed121b98b543c5a1fd832eeac598424f2afd6b7846a75d
size 1590474112

SpaceOm.i1-Q3_K_S.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f27df7460cb883ba893d0f78b46f55835ceaf8a9812c96d4048b0da4cc4b024b
size 1454355840

SpaceOm.i1-Q4_0.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:270c7b1bbeb15d2f09c0af676a51558ed6c25dfbf3f08bba85a268ecd48ce656
size 1828484480

SpaceOm.i1-Q4_1.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6f43e4ba7d3093a40994c5e9a904f37ab6eb633583515b99558decb5cba2256e
size 1996256640

SpaceOm.i1-Q4_K_M.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:405519d066475c376ca72e1c2786abc282aa2dbb24383c4849a050af43a391f8
size 1929901440

SpaceOm.i1-Q4_K_S.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7c4fecb7f736579e6e20ffeef93e820c8b306a8ccf77d87aa739084af1098946
size 1834382720

SpaceOm.i1-Q5_K_M.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f36f0a99d45c0ef241000600ed39093cb4c4bd36f13f4d373ebb1b83d57d83be
size 2224813440

SpaceOm.i1-Q5_K_S.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d1203c8ce1437656a57d9cd6fa58347a742fd694818308b64e94668ed67cbdcb
size 2169664896

SpaceOm.i1-Q6_K.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aea46fe3f80da85444b11f0440659cfb33104b95b33cd0150fc23f73dfd0ff20
size 2538157440

imatrix.dat Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:503ce16fdb4ab7a420fe25f333caf257edc76391ec929a3a823e34a901ee4e61
size 3362977