Initialize the project; model provided by the ModelHub XC community

Model: mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF
Source: Original Platform
Commit 5b6e2a7507 by ModelHub XC, 2026-04-10 19:50:55 +08:00
27 changed files with 231 additions and 0 deletions

.gitattributes (vendored, new file, 60 lines)

@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
Panacea-MegaScience-Qwen3-1.7B.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
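The rules above are glob patterns: any matching path is stored as a Git LFS pointer rather than as a regular blob. As a rough illustration (not Git's actual matcher — real gitattributes matching has extra rules such as last-match-wins and `**` handling), Python's `fnmatch` can approximate a lookup against a subset of these patterns:

```python
from fnmatch import fnmatch

# A subset of the patterns from the .gitattributes above. Note that the
# .gguf files are listed by exact name; there is no blanket "*.gguf" rule.
LFS_PATTERNS = [
    "*.bin",
    "*.safetensors",
    "*tfevents*",
    "Panacea-MegaScience-Qwen3-1.7B.i1-Q4_K_M.gguf",
]

def lfs_tracked(path: str) -> bool:
    """Approximate check: does the path's basename match any LFS pattern?"""
    name = path.rsplit("/", 1)[-1]
    return any(fnmatch(name, pat) for pat in LFS_PATTERNS)

print(lfs_tracked("Panacea-MegaScience-Qwen3-1.7B.i1-Q4_K_M.gguf"))  # True
print(lfs_tracked("README.md"))                                       # False
```

Paths matching none of the rules (for example `README.md`) stay as ordinary Git blobs.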

(25 new Git LFS pointer files, one per .gguf listed above; each records a sha256 oid and a byte size, ranging from about 2 MB for the imatrix file to about 1.4 GB for the i1-Q6_K quant.)
README.md (new file, 96 lines)

@@ -0,0 +1,96 @@
---
base_model: prithivMLmods/Panacea-MegaScience-Qwen3-1.7B
datasets:
- MegaScience/MegaScience
language:
- en
- zh
library_name: transformers
license: apache-2.0
mradermacher:
readme_rev: 1
quantized_by: mradermacher
tags:
- text-generation-inference
- moe
- trl
- biology
- chemistry
- medical
- mega-science
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
<!-- ### quants: Q2_K IQ3_M Q4_K_S IQ3_XXS Q3_K_M small-IQ4_NL Q4_K_M IQ2_M Q6_K IQ4_XS Q2_K_S IQ1_M Q3_K_S IQ2_XXS Q3_K_L IQ2_XS Q5_K_S IQ2_S IQ1_S Q5_K_M Q4_0 IQ3_XS Q4_1 IQ3_S -->
<!-- ### quants_skip: -->
<!-- ### skip_mmproj: -->
weighted/imatrix quants of https://huggingface.co/prithivMLmods/Panacea-MegaScience-Qwen3-1.7B
<!-- provided-files -->
***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#Panacea-MegaScience-Qwen3-1.7B-i1-GGUF).***
static quants are available at https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
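For the multi-part case mentioned above: split GGUF uploads (a naming convention along the lines of `model.gguf.part1of2`) are rejoined by plain byte concatenation, with no special tool needed. A sketch using synthetic stand-in files (a real download would fetch the parts from the repo first, e.g. with `huggingface-cli download`):

```shell
# Create two synthetic "parts" standing in for a split download
# (real parts would be fetched from the repo, not generated like this).
printf 'GGUF-part-1-' > model.gguf.part1of2
printf 'GGUF-part-2'  > model.gguf.part2of2

# Joining is plain concatenation, in part order:
cat model.gguf.part1of2 model.gguf.part2of2 > model.gguf

wc -c < model.gguf    # joined size equals the sum of the part sizes
```

The quants in this particular repo are single files, so no concatenation is needed here; the step only applies to larger models split to fit upload limits.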
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.imatrix.gguf) | imatrix | 0.1 | imatrix file (for creating your own quants) |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-IQ1_S.gguf) | i1-IQ1_S | 0.6 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-IQ1_M.gguf) | i1-IQ1_M | 0.6 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-IQ2_S.gguf) | i1-IQ2_S | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-IQ2_M.gguf) | i1-IQ2_M | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.8 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-Q2_K.gguf) | i1-Q2_K | 0.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-IQ3_S.gguf) | i1-IQ3_S | 1.0 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 1.0 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-IQ3_M.gguf) | i1-IQ3_M | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 1.0 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 1.1 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-IQ4_NL.gguf) | i1-IQ4_NL | 1.2 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-Q4_0.gguf) | i1-Q4_0 | 1.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 1.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 1.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-Q4_1.gguf) | i1-Q4_1 | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 1.3 | |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.4 | |
| [GGUF](https://huggingface.co/mradermacher/Panacea-MegaScience-Qwen3-1.7B-i1-GGUF/resolve/main/Panacea-MegaScience-Qwen3-1.7B.i1-Q6_K.gguf) | i1-Q6_K | 1.5 | practically like static Q6_K |
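The Size/GB column translates into rough bits per weight for a nominally 1.7B-parameter model. A back-of-envelope check (1.7e9 is an assumed round figure; the true count differs slightly, and small models report inflated bits/weight because embedding and output tensors are quantized less aggressively than the rest):

```python
PARAMS = 1.7e9  # assumed nominal parameter count

def bits_per_weight(size_gb: float, params: float = PARAMS) -> float:
    """File size (decimal GB) -> approximate average bits per weight."""
    return size_gb * 1e9 * 8 / params

# Sizes taken from the table above
for name, gb in [("i1-IQ1_S", 0.6), ("i1-Q4_K_M", 1.2), ("i1-Q6_K", 1.5)]:
    print(f"{name}: ~{bits_per_weight(gb):.1f} bits/weight")
```

By this estimate Q4_K_M lands near 5.6 bits/weight, which is why it sits in the "fast, recommended" sweet spot between the sub-3-bit IQ quants and the near-8-bit Q6_K.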
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->