Initialize the project; model provided by the ModelHub XC community

Model: mradermacher/AstraGPTCoder-7B-i1-GGUF
Source: Original Platform
Commit a8572345f2 by ModelHub XC, 2026-05-04 19:18:46 +08:00
27 changed files with 236 additions and 0 deletions

.gitattributes (vendored, new file)

@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
AstraGPTCoder-7B.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
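Each line above pairs a glob pattern with Git LFS filter attributes; lines like these are typically generated by `git lfs track`. As a rough sketch of which files such patterns would route through LFS (a simplification of full gitattributes matching semantics, with a hypothetical helper name and only a subset of the patterns):

```python
from fnmatch import fnmatch

# Subset of the attribute patterns above; a pattern containing no "/"
# is matched against the path's basename.
lfs_patterns = ["*.bin", "*.safetensors", "AstraGPTCoder-7B.i1-Q4_K_M.gguf"]

def tracked_by_lfs(path: str, patterns: list[str]) -> bool:
    """Rough check: would any LFS pattern match this path's basename?"""
    name = path.rsplit("/", 1)[-1]
    return any(fnmatch(name, p) for p in patterns)

print(tracked_by_lfs("AstraGPTCoder-7B.i1-Q4_K_M.gguf", lfs_patterns))  # True
print(tracked_by_lfs("README.md", lfs_patterns))                        # False
```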

The remaining 25 new files are Git LFS pointer files (the GGUF quants and the imatrix file; individual file names are not shown in this view). Each pointer is three lines: `version https://git-lfs.github.com/spec/v1`, an `oid sha256:…`, and a `size` in bytes:

| oid (sha256) | size (bytes) |
|:---|---:|
| 68acddc44964ba81fa287c8dac0237225f21c489cceaae998e070274d3acbdd4 | 2042197472 |
| 68f078e0a1a1ef1b612efe6d3cad42cc0ae9da6805dec3b23dcedcabf773d962 | 1903668704 |
| 3cab75b4f1800aea1db78a5c9f17e018ff875318f4a4a06543fec3b4ebcfaea4 | 2780343776 |
| 1ed2f76d389703ff59c9ffc205d41c7233ae662e8fec04753e54c07eccf68e56 | 2595638752 |
| 665f1d5daf830bd4895d50b6f4752fadd240504bc9716c984c4cbd8a8bbd6323 | 2469023200 |
| 3f84a7a2cbab0703d1071fa49e5e8e46c49a33e2046d7d6b1c93123b6351d0d2 | 2273078752 |
| bd3a9a9e1e74b56a4efd4370cd33b580f7b1f6aa5d7bfc801c9d7930fc0b409f | 3574013408 |
| dac4f33d8f57a3bb786ca323b01d9b95013df1ce64ba7ffa328da10dc5c28601 | 3499193824 |
| ae681d4a5b72954c1f04da8f8bfb7bda4a1e7815a1b85433170b7df1f3119046 | 3346257376 |
| e6eebd666cba1e4ae30d0dfe4f612ec82469cca5b57b8db247c15f4faae7348b | 3114515936 |
| 5c0579238fa7f2fefde914f5959a2012c777dc964f7b1759ee71cf4d8b9afae3 | 4437814752 |
| 8b3512755cdd74dca4573155addd04a11f09510b5ea786099dce4f321dc2a7cf | 4218473952 |
| 79a7ea9202dc3c1a1d9a4a94a3959640455d2655f0a259d811e24ea81df9d6f4 | 3015941600 |
| ba2895839f220a82c7d19de24191f94bcac2aefc573e5007d95e54e90635a0e3 | 2834075104 |
| 14a8eeb4f9d54ab9fca9c36800c66c2a4222e749bcb97092dcb6af873adb2214 | 4088460768 |
| 0529f8280e081831eb969b6b1eabdb6cd6851a95c51e82094dab5b5792aec436 | 3808392672 |
| 634f3a50cd5fd17530862049f34838ac75767b17e9d99a94d72d5213583c1fe0 | 3492369888 |
| 34ee33582bde398cb0f77e9a8b42cff31a83829125f0fffcdb6054ed91d1966c | 4444122592 |
| 7713741e16dbad1d776e3fff5c8892588d47c5f35f411ea4540cf6d14a50aec0 | 4873285088 |
| 5689df155a44cb5a58aaa2edf20d0b694ad3dfde95d3b006215fc851095cb465 | 4683075040 |
| 01edceb7a6c479b4f54fa93ea6deb712e049f17c59e8ae15281532b1f13322f0 | 4457770464 |
| 42319e3427292d8672c49e183f1f4e71f2446edda9d3fe87b89e783a24ee47dc | 5444832736 |
| 2bf945fcfc9e1a8150e6824877c06ddb3e4d2befe302765624e8a97bd5c476e5 | 5315177952 |
| 3b2b58e3fb597468b66750a4a64ada589257fecdb3964a41519205fe36d35bdc | 6254200288 |
| e12aeaeae3a98a9f39305be33d5cbc8415c72b356bae02ba52352fe623a50558 | 4560352 |
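Each of the pointer entries above follows the Git LFS pointer format: three plain-text lines (spec version, `sha256` object id, byte size) standing in for the real binary, which LFS fetches on checkout. A minimal parsing sketch (the helper name is illustrative; the sample pointer is the last one above):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into version, hash algorithm, digest, size."""
    # Each line is "key value"; partition on the first space.
    fields = dict(line.partition(" ")[::2] for line in text.strip().splitlines())
    algo, _, digest = fields["oid"].partition(":")
    return {"version": fields["version"], "algo": algo,
            "digest": digest, "size": int(fields["size"])}

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:e12aeaeae3a98a9f39305be33d5cbc8415c72b356bae02ba52352fe623a50558
size 4560352"""

info = parse_lfs_pointer(pointer)
print(info["algo"], info["size"])  # sha256 4560352
```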
README.md (new file)

@@ -0,0 +1,101 @@
---
base_model: adityawakharkar/AstraGPTCoder-7B
language:
- en
library_name: transformers
license: apache-2.0
model_creator: Tantra AI Labs
mradermacher:
readme_rev: 1
quantized_by: mradermacher
tags:
- from-scratch
- custom-architecture
- custom-tokenizer
- reasoning
- chain-of-thought
- think-tags
- coding
- fine-tuned
- lora
- peft
- unsloth
- astragpt
- tantra-ai-labs
- rtx-4090
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
<!-- ### quants: Q2_K IQ3_M Q4_K_S IQ3_XXS Q3_K_M small-IQ4_NL Q4_K_M IQ2_M Q6_K IQ4_XS Q2_K_S IQ1_M Q3_K_S IQ2_XXS Q3_K_L IQ2_XS Q5_K_S IQ2_S IQ1_S Q5_K_M Q4_0 IQ3_XS Q4_1 IQ3_S -->
<!-- ### quants_skip: -->
<!-- ### skip_mmproj: -->
weighted/imatrix quants of https://huggingface.co/adityawakharkar/AstraGPTCoder-7B
<!-- provided-files -->
***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#AstraGPTCoder-7B-i1-GGUF).***
Static quants are available at https://huggingface.co/mradermacher/AstraGPTCoder-7B-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
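The multi-part case mentioned above is simple byte concatenation, in part order (equivalent to `cat part1 part2 > whole` on Unix). A minimal sketch with dummy contents; the `.partNofM` naming is an assumption, not taken from this repository:

```python
from pathlib import Path

def concat_parts(part_paths, out_path):
    """Byte-concatenate split files, in order, into one output file."""
    with open(out_path, "wb") as out:
        for part in part_paths:
            out.write(Path(part).read_bytes())

# Dummy stand-ins for real part files such as model.gguf.part1of2:
Path("model.gguf.part1of2").write_bytes(b"hello ")
Path("model.gguf.part2of2").write_bytes(b"world")
concat_parts(["model.gguf.part1of2", "model.gguf.part2of2"], "model.gguf")
print(Path("model.gguf").read_bytes())  # b'hello world'
```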
## Provided Quants
(sorted by size, not necessarily quality; IQ quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.imatrix.gguf) | imatrix | 0.1 | imatrix file (for creating your own quants) |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-IQ1_S.gguf) | i1-IQ1_S | 2.0 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-IQ1_M.gguf) | i1-IQ1_M | 2.1 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.4 | |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.6 | |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-IQ2_S.gguf) | i1-IQ2_S | 2.7 | |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-IQ2_M.gguf) | i1-IQ2_M | 2.9 | |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-Q2_K_S.gguf) | i1-Q2_K_S | 2.9 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-Q2_K.gguf) | i1-Q2_K | 3.1 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 3.2 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.4 | |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.6 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-IQ3_S.gguf) | i1-IQ3_S | 3.6 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-IQ3_M.gguf) | i1-IQ3_M | 3.7 | |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 3.9 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 4.2 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.3 | |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-IQ4_NL.gguf) | i1-IQ4_NL | 4.5 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-Q4_0.gguf) | i1-Q4_0 | 4.5 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.6 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 4.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-Q4_1.gguf) | i1-Q4_1 | 5.0 | |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.4 | |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/AstraGPTCoder-7B-i1-GGUF/resolve/main/AstraGPTCoder-7B.i1-Q6_K.gguf) | i1-Q6_K | 6.4 | practically like static Q6_K |
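The Size/GB column also gives a rough sense of effective bits per weight: file size in bits divided by parameter count. A back-of-the-envelope sketch, assuming decimal gigabytes and exactly 7e9 parameters (both approximations):

```python
def bits_per_weight(size_gb: float, n_params: float = 7e9) -> float:
    """Approximate effective bits per weight from quantized file size."""
    return size_gb * 1e9 * 8 / n_params

# e.g. the 4.8 GB i1-Q4_K_M file:
print(round(bits_per_weight(4.8), 2))  # 5.49
```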
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->