Compare commits

10 commits: `f2add78150` ... `b308f05ac6`

| Author | SHA1 | Date |
|---|---|---|
| | b308f05ac6 | |
| | d2deaff3d2 | |
| | 2a70eff394 | |
| | 64d7a7bc84 | |
| | 1677386abf | |
| | f18d48e47a | |
| | 577cef7249 | |
| | d4213ed446 | |
| | f6aae4eee4 | |
| | 1729b71641 | |
.gitattributes (vendored), 9 lines changed

```diff
@@ -53,3 +53,12 @@ mistral-small-24b-instruct-2501-abliterated-i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
 mistral-small-24b-instruct-2501-abliterated-i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
 mistral-small-24b-instruct-2501-abliterated-i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
 mistral-small-24b-instruct-2501-abliterated-i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+mistral-small-24b-instruct-2501-abliterated-i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+mistral-small-24b-instruct-2501-abliterated-i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
+mistral-small-24b-instruct-2501-abliterated-i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
+mistral-small-24b-instruct-2501-abliterated-i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+mistral-small-24b-instruct-2501-abliterated-i1-Q5_0.gguf filter=lfs diff=lfs merge=lfs -text
+mistral-small-24b-instruct-2501-abliterated-i1-Q5_1.gguf filter=lfs diff=lfs merge=lfs -text
+mistral-small-24b-instruct-2501-abliterated-i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+mistral-small-24b-instruct-2501-abliterated-i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+mistral-small-24b-instruct-2501-abliterated-i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
```
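Attribute lines like the ones above are normally generated by `git lfs track` rather than written by hand. A minimal sketch, assuming a repository with git and git-lfs available (the pattern is illustrative):

```shell
# Inside a repository with git-lfs initialized, track the GGUF pattern;
# this appends a matching filter line to .gitattributes.
git lfs install
git lfs track "*.gguf"

# The resulting .gitattributes entry looks like:
# *.gguf filter=lfs diff=lfs merge=lfs -text
git add .gitattributes
```

With those filters in place, committing a `.gguf` file stores only a small pointer in Git history while the blob itself goes to LFS storage.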
README.md, 24 lines changed

```
@@ -12,30 +12,6 @@ tags:

Quantized to `i1-GGUF` using [SpongeQuant](https://github.com/SpongeEngine/SpongeQuant), the Oobabooga of LLM quantization.

<div style="display: flex; gap: 20px; align-items: center; margin-top:0;">
  <a href="https://github.com/SpongeEngine/SpongeQuant">
    <img src="https://huggingface.co/spaces/SpongeEngine/README/resolve/main/github-button.png" width="173">
  </a>
  <a href="https://discord.gg/azNmr2Gdgy">
    <img src="https://huggingface.co/spaces/SpongeEngine/README/resolve/main/discord-button.png" width="173">
  </a>
</div>

***

<figure>
  <img src="https://huggingface.co/spaces/SpongeEngine/README/resolve/main/101.png" alt="Street scene, Asia (Pakistan)">
  <figcaption>Street scene, Asia (Pakistan)</figcaption>
</figure>

<figure>
  <audio controls>
    <source src="https://huggingface.co/spaces/SpongeEngine/README/resolve/main/011.mp3" type="audio/mp3">
    Your browser does not support the audio element.
  </audio>
  <figcaption>Chuck Berry – Johnny B. Goode (USA, 1958)</figcaption>
</figure>

***

### What is a GGUF?
GGUF is a file format used for running large language models (LLMs) on different types of computers. It supports both regular processors (CPUs) and graphics cards (GPUs), making it easier to run models across a wide range of hardware. Many LLMs require powerful and expensive GPUs, but GGUF improves compatibility and efficiency by optimizing how models are loaded and executed. If a GPU doesn't have enough memory, GGUF can offload parts of the model to the CPU, allowing it to run even when GPU resources are limited. GGUF is designed to work well with quantized models, which use less memory and run faster, making them ideal for lower-end hardware. However, it can also store full-precision models when needed. Thanks to these optimizations, GGUF allows LLMs to run efficiently on everything from high-end GPUs to laptops and even CPU-only systems.
```
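As a concrete look at the format the README describes, a GGUF file begins with a small fixed header: the magic bytes `GGUF`, a format version, a tensor count, and a metadata key/value count. A minimal sketch parsing a synthetic header built in memory (field layout per the GGUF specification; not read from one of the files in this diff):

```python
import struct

def parse_gguf_header(data: bytes) -> dict:
    """Parse the fixed 24-byte GGUF header: magic, version, counts."""
    # Little-endian: 4-byte magic, uint32 version, uint64 tensor count,
    # uint64 metadata key/value count.
    magic, version, n_tensors, n_kv = struct.unpack("<4sIQQ", data[:24])
    if magic != b"GGUF":
        raise ValueError("not a GGUF file")
    return {"version": version, "tensor_count": n_tensors, "metadata_kv_count": n_kv}

# Synthetic header for illustration: version 3, 291 tensors, 24 metadata keys.
header = struct.pack("<4sIQQ", b"GGUF", 3, 291, 24)
print(parse_gguf_header(header))
# {'version': 3, 'tensor_count': 291, 'metadata_kv_count': 24}
```

After this header come the metadata key/value pairs (quantization type, tokenizer, architecture) and then the tensor descriptors, which is how loaders decide what to place on GPU versus CPU.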
mistral-small-24b-instruct-2501-abliterated-i1-Q4_0.gguf (new file, 3 lines)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fc1234845ee9aaa019dca1fc5dbeff8a32c777e3a6d5b2619b93c74465865fc9
+size 13494230016
```
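Each of these three-line files is a Git LFS pointer, not the model weights themselves: `version` names the pointer spec, `oid` is the SHA-256 of the actual blob, and `size` is its byte length. A minimal sketch of reading one (`parse_lfs_pointer` is a hypothetical helper; the pointer text is the Q4_0 file above):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split each 'key value' line of a Git LFS pointer into a dict."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    fields["size"] = int(fields["size"])  # byte length of the real blob
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:fc1234845ee9aaa019dca1fc5dbeff8a32c777e3a6d5b2619b93c74465865fc9
size 13494230016
"""
info = parse_lfs_pointer(pointer)
print(info["oid"], round(info["size"] / 2**30, 1))  # the Q4_0 blob is ~12.6 GiB
```

This is why cloning the repo without git-lfs yields tiny text files instead of multi-gigabyte models.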
mistral-small-24b-instruct-2501-abliterated-i1-Q4_1.gguf (new file, 3 lines)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3df4799e2b6ea71445a915e495ecf34a82f21978b04ee32e5c3270f9c3482265
+size 14873107456
```

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ecb6875aa55a3e45d4cc1c7e46da0c729067e2b90b5f4c4ebfa963739c3c113b
+size 14333910016
```

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0a4e2f47dd9f017bb12c45f01feccee3038907db56379519e13b847d2c54d984
+size 13549280256
```

mistral-small-24b-instruct-2501-abliterated-i1-Q5_0.gguf (new file, 3 lines)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2ec5ca87d9d84d08d1e8c76d0997145ff3d8b0000b5cb02a2b33479af7e6c745
+size 16356842496
```

mistral-small-24b-instruct-2501-abliterated-i1-Q5_1.gguf (new file, 3 lines)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b198c006749b46dc5db88cae18517fb0df90dc3d03ed836ee405fcd7c1c7eea7
+size 17735719936
```

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:47d5425f2cb81d815fbf68cfce3dddb5f5a284f761cbca86313b4ac83e6c0c09
+size 16763984896
```

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2e440f978c33fec0307f90f47cc8c3c589749805022da0a90091186f7ee0599b
+size 16304413696
```

mistral-small-24b-instruct-2501-abliterated-i1-Q6_K.gguf (new file, 3 lines)

```diff
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6cd26a947cd0187a1023fa9a3dba721b7ac5bc70d333d9ba959bdaf088d36fea
+size 19345939456
```