Initialize project; model provided by the ModelHub XC community

Model: mradermacher/1.5-Pints-2K-v0.1-i1-GGUF
Source: Original Platform
Commit 9805b02171 by ModelHub XC, 2026-05-08 15:18:57 +08:00
26 changed files with 243 additions and 0 deletions

.gitattributes (vendored, new file, 59 lines)

@@ -0,0 +1,59 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
1.5-Pints-2K-v0.1.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
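Each of these attribute lines routes matching paths through Git LFS instead of plain Git storage; note that every `.gguf` file is listed explicitly because the default pattern set has no `*.gguf` wildcard. As a rough illustration (not part of the repo, and simplified versus real gitattributes matching, which has extra rules for `/` and `**`), the simple patterns behave like shell globbing:

```python
from fnmatch import fnmatch

# A few of the patterns from the .gitattributes above.
lfs_patterns = [
    "*.safetensors",
    "*.bin",
    "*.zip",
    "1.5-Pints-2K-v0.1.i1-Q6_K.gguf",
]

def tracked_by_lfs(path: str) -> bool:
    """Return True if any LFS pattern matches the path.
    Simplified sketch: real gitattributes matching also handles
    directory separators and '**' specially."""
    return any(fnmatch(path, pat) for pat in lfs_patterns)

print(tracked_by_lfs("model.safetensors"))               # True
print(tracked_by_lfs("1.5-Pints-2K-v0.1.i1-Q6_K.gguf"))  # True
print(tracked_by_lfs("README.md"))                       # False
```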


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b52fc63f934b42982bedd5d9c6095156f80a4c5a283c4f9e3d14078c9a943725
size 395379648


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8970b4bd1da712da2f342765fba682ac12612d4ef25e1367c814bb5174760733
size 365593536


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c545514c88f7068448e4418bd6e51c95262028e832f819d0a381104f64a092e6
size 553535424


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aa00ccbb523d25138467e4d3a12f7be3ed2f38ed34bf1186670ad111726a7b8f
size 513820608


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:da2b82c0a8b50e8915b2195ef5069a43d8e9c132f350abae4fac3775b9726bd7
size 487883712


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:35b28d5d868b35d46d50a5ffc670b776e93715ac95069b96e7355c4f8c601f7b
size 445023168


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c18e553476f0659f62a3461d8571f62615a229f4f369e788df842ff7563fd792
size 721312704


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:111ec0787f3ff8f4dae0234e45400b3625ab7acc58443aa17159266cb9842ffc
size 701258688


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1348f9f9c1c181c013c3e4acb72a83a8fe9eaddde4fc86e9de6d15d03dbbe63e
size 667638720


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a6be48bff47d8bc123a70190f90fa4c2fc3f79ee91c6e4268b09a791c809f756
size 623527872


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2c446fd3cc411df80395a8759c1e9980482e4a2ef2edecd697cf00abab52d8a2
size 900394944


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:66fb5101a5580c8d01265fd0553ef0c2b27d1650b5a965fc7e3e4c1db1250c97
size 853909440


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:44b309170d02dce6f634feff8505b96ca5b5e7223581bd21ac1aab4d4d92fcb4
size 601298880


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ed055e421cc0e83f1c246d8751fce1de6f1bbec05c211bc18a97117bc850b03c
size 561977280


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8f0b027e7e141c21e1434c734974f7061a04995e253fd7ea6aee449c5b10fc20
size 832592832


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f63bceff6a77b6f9349a70a52d6c7b857c7321a879ae72c1ece83d63c1d64c4a
size 770333632


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:450bdc3e7e5c41d5448cd743ec38b3211bfcccf9da9666b35c3510b756dbef04
size 699587520


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:818018473f00e341af5bc60c0e4da0da5fba852f16c413dd9272bcc9bb2ea83d
size 901967808


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c050c4359dd1645c45af03c865ea66896812a357863bd1e88afd8d9187df5793
size 992579520


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:348043704660e239b7728f342288b0cbe800ca8d4b94405637ce79d13cf82614
size 952348608


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c0d43d01ab0d26a83ec07b4d94fbfcb23b4916567cee0064cefc6c94fe58671d
size 905375680


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:beccb4ca33453c8ce4e6661032e5017411f38e7f53da5072a0fa8e8857cbdf7a
size 1113911232


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b3b39885f297e3297452ae0b241455ec76043adbb9569b666d486d75436fe681
size 1086336960


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4b010f0c2194d343f487ccea57f05a020f88be2551a4186d676f725adb065abe
size 1285571520

README.md (new file, 112 lines)

@@ -0,0 +1,112 @@
---
base_model: pints-ai/1.5-Pints-2K-v0.1
datasets:
- pints-ai/Expository-Prose-V1
- HuggingFaceH4/ultrachat_200k
- Open-Orca/SlimOrca-Dedup
- meta-math/MetaMathQA
- HuggingFaceH4/deita-10k-v0-sft
- WizardLM/WizardLM_evol_instruct_V2_196k
- togethercomputer/llama-instruct
- LDJnr/Capybara
- HuggingFaceH4/ultrafeedback_binarized
extra_gated_fields:
Company: text
Country: country
  I agree to use this model in accordance with the aforementioned Terms of Use: checkbox
I want to use this model for:
options:
- Research
- Education
- label: Other
value: other
type: select
Specific date: date_picker
extra_gated_prompt: Though best efforts have been made to ensure, as much as possible,
that all texts in the training corpora are royalty free, this does not constitute
a legal guarantee that such is the case. **By using any of the models, corpora or
part thereof, the user agrees to bear full responsibility to do the necessary due
diligence to ensure that he / she is in compliance with their local copyright laws.
Additionally, the user agrees to bear any damages arising as a direct cause (or
  otherwise) of using any artifacts released by the Pints Research Team, as well as
full responsibility for the consequences of his / her usage (or implementation)
of any such released artifacts. The user also indemnifies Pints Research Team (and
any of its members or agents) of any damage, related or unrelated, to the release
or subsequent usage of any findings, artifacts or code by the team. For the avoidance
of doubt, any artifacts released by the Pints Research team are done so in accordance
with the 'fair use' clause of Copyright Law, in hopes that this will aid the research
community in bringing LLMs to the next frontier.
language:
- en
library_name: transformers
license: mit
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/pints-ai/1.5-Pints-2K-v0.1
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-IQ1_S.gguf) | i1-IQ1_S | 0.5 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-IQ1_M.gguf) | i1-IQ1_M | 0.5 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.5 | |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-IQ2_S.gguf) | i1-IQ2_S | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-IQ2_M.gguf) | i1-IQ2_M | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.7 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-Q2_K.gguf) | i1-Q2_K | 0.7 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.7 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-Q3_K_S.gguf) | i1-Q3_K_S | 0.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-IQ3_S.gguf) | i1-IQ3_S | 0.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-IQ3_M.gguf) | i1-IQ3_M | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-Q3_K_M.gguf) | i1-Q3_K_M | 0.9 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-Q3_K_L.gguf) | i1-Q3_K_L | 0.9 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-IQ4_XS.gguf) | i1-IQ4_XS | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-IQ4_NL.gguf) | i1-IQ4_NL | 1.0 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-Q4_0.gguf) | i1-Q4_0 | 1.0 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-Q4_K_S.gguf) | i1-Q4_K_S | 1.0 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-Q4_K_M.gguf) | i1-Q4_K_M | 1.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-Q4_1.gguf) | i1-Q4_1 | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-Q5_K_S.gguf) | i1-Q5_K_S | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/1.5-Pints-2K-v0.1-i1-GGUF/resolve/main/1.5-Pints-2K-v0.1.i1-Q6_K.gguf) | i1-Q6_K | 1.4 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->