Initialize project; model provided by the ModelHub XC community

Model: mradermacher/FluffyTail-i1-GGUF
Source: Original Platform
ModelHub XC
2026-05-09 02:00:02 +08:00
commit cba4abe581
27 changed files with 225 additions and 0 deletions

.gitattributes vendored Normal file

@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
FluffyTail.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
FluffyTail.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
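The pattern list above tells Git LFS which paths to store as pointer files rather than regular blobs. As a rough illustration (a hypothetical helper, not part of git-lfs; `fnmatch` only approximates gitattributes glob semantics — it lacks `**` support and path-relative matching), you can check whether a bare filename would match:

```python
from fnmatch import fnmatch

# Subset of the LFS patterns from the .gitattributes above.
LFS_PATTERNS = [
    "*.bin", "*.safetensors", "*.pt", "*.gz", "*tfevents*",
    "FluffyTail.imatrix.gguf",
]

def is_lfs_tracked(filename):
    # Approximation: fnmatch has no `**` and matches only the bare
    # filename, unlike real gitattributes pattern matching.
    return any(fnmatch(filename, p) for p in LFS_PATTERNS)

print(is_lfs_tracked("model.safetensors"))  # True
print(is_lfs_tracked("README.md"))          # False
```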

FluffyTail.i1-IQ1_M.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:670a24a2a949a36dd3cf28852501a635c8854d122d587518dd3c86cd97d0e5be
size 464461920
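Each quant file in this commit is stored as a Git LFS pointer like the one above: three `key value` lines giving the spec version, the SHA-256 object id, and the byte size. A minimal parsing sketch (not the official git-lfs implementation):

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file (one `key value` pair per line)."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The IQ1_M pointer shown above.
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:670a24a2a949a36dd3cf28852501a635c8854d122d587518dd3c86cd97d0e5be\n"
    "size 464461920\n"
)
info = parse_lfs_pointer(pointer)
print(info["size"])  # 464461920
```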

FluffyTail.i1-IQ1_S.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1a309d81223c9c3478841f84b71039e0a374e517a76557e2a8be422a5a5cad19
size 436528224

FluffyTail.i1-IQ2_M.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:98b0b4ee2f74003aca6e3ecde71f3aa9d0fb56dc297d0a74f98dacb604205e42
size 601055328

FluffyTail.i1-IQ2_S.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0bd6b139c704befcbffbba3da7f6568511985488da9c06500c768155f2d27e6a
size 563810400


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:32943b049a58f9714b7ec58c0c308c1859e056e60dc091b0d3fc08fb1bab183c
size 550327392


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ce3e648baa03e4c66ee223abb77bc1fbdaed6c087ffb17d4001d9afca5b5f0e1
size 511018080

FluffyTail.i1-IQ3_M.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1e8b08516bec8d298fd56dcf5fac1d861a14a42cf2e05341ad62905bea4abed8
size 776664672

FluffyTail.i1-IQ3_S.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0a6c85d99ed3a93a3af57ac82ecf0e169c91dadae93f2301edf0d92c625c1643
size 762407520


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8d878ef096be89100794088f84c107e6280ff312dddf7ac111ea7e4e90a6f9b5
size 731699808


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a4e835dbe93972486c79fba2d6dbce4b6485d394af7c0c153252f8b55f52b4c9
size 668792928


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:21220aea9d798bce72dece72f54f1f37123bb20224e9cfde4a187766e069459e
size 936331872


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ac02ba201c032f574b26b37c3ae7dd1a7dfd1c68b3289f70f7fa99faa3004c35
size 895732320

FluffyTail.i1-Q2_K.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1be5ae061881e2f37a5c46cf1b98a0b4166bd5a0870a9bed02d94b5797eededb
size 676305504


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b810816ee04a678e82b2a9cfad9c5e70bd5e9d9a542d609e50f38f43b35bdec5
size 640135776


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:28516968346e8f769ba793779f03ab65e78f4b172d449edd96b904492acc4195
size 880163424


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7313323d0029c9fc02e1afd09ea6caaac44361c8885d6d717ffbf8eac3b38bfc
size 824179296


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:406774e527e7926f25043f7866bd9763aa30559dc716f384e73a41e4ecf7b8cb
size 760945248

FluffyTail.i1-Q4_0.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:97cc35c01eef40504e7b7e45dd1726680ef15527f80c7a140933f9b90ed4dfbe
size 937536096

FluffyTail.i1-Q4_1.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5cc57e9dac7694238a90d2a6864cc4a40f4a3b5d2e4d7a113d32dc96f4900b4e
size 1016842848


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2ee3d63b47cd4e83722c72faedca0128481f4658dd7feee68a6f043f9e2e7264
size 986049120


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c0ba5bcf9a34650d00b425332e3537fbde7a1a92e0ce42e4571645607fb6c088
size 940313184


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:90dcdbdaa99bb04f7443042e724e3bd9ae6c0a5bbab8217455f73aa37b24f0c4
size 1125050976


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8207af941a335799db27a5776c5df30ed417bab626d915ca8fb48ebff6f4cb25
size 1098730080

FluffyTail.i1-Q6_K.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e990897d41f28de2dea5a092bf0df8601ca03d57e4b2a5355527413348ddd2ab
size 1272740448

FluffyTail.imatrix.gguf Normal file

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:64cbe6bff0e56c51d8d9379252b28e1ff7049cd1b1a384e41a3751e22bcbcc2b
size 2065888

README.md Normal file

@@ -0,0 +1,90 @@
---
base_model: MarkProMaster229/FluffyTail
language:
- ru
library_name: transformers
license: apache-2.0
mradermacher:
readme_rev: 1
quantized_by: mradermacher
tags:
- conversational
- Furry
- merge
- LoRA
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
<!-- ### quants: Q2_K IQ3_M Q4_K_S IQ3_XXS Q3_K_M small-IQ4_NL Q4_K_M IQ2_M Q6_K IQ4_XS Q2_K_S IQ1_M Q3_K_S IQ2_XXS Q3_K_L IQ2_XS Q5_K_S IQ2_S IQ1_S Q5_K_M Q4_0 IQ3_XS Q4_1 IQ3_S -->
<!-- ### quants_skip: -->
<!-- ### skip_mmproj: -->
weighted/imatrix quants of https://huggingface.co/MarkProMaster229/FluffyTail
<!-- provided-files -->
***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#FluffyTail-i1-GGUF).***
static quants are available at https://huggingface.co/mradermacher/FluffyTail-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
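All files in this repo are single-part, but in repositories where a large quant is split into parts, rejoining them is a plain byte-level concatenation in part order. A minimal sketch with dummy data (the `.part1of2` naming here is illustrative):

```python
import tempfile
from pathlib import Path

def concat_parts(part_paths, out_path):
    """Concatenate split files back into one, in sorted part order."""
    with open(out_path, "wb") as out:
        for part in sorted(part_paths):
            out.write(Path(part).read_bytes())

# Self-contained demo: dummy bytes stand in for real part files.
with tempfile.TemporaryDirectory() as d:
    p1 = Path(d, "model.gguf.part1of2")
    p2 = Path(d, "model.gguf.part2of2")
    p1.write_bytes(b"HEAD")
    p2.write_bytes(b"TAIL")
    merged = Path(d, "model.gguf")
    concat_parts([p2, p1], merged)  # input order does not matter
    print(merged.read_bytes())  # b'HEADTAIL'
```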
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.imatrix.gguf) | imatrix | 0.1 | imatrix file (for creating your own quants) |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-IQ1_S.gguf) | i1-IQ1_S | 0.5 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-IQ1_M.gguf) | i1-IQ1_M | 0.6 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.6 | |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-IQ2_S.gguf) | i1-IQ2_S | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-IQ2_M.gguf) | i1-IQ2_M | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.7 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.8 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-Q2_K.gguf) | i1-Q2_K | 0.8 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-Q3_K_S.gguf) | i1-Q3_K_S | 0.9 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-IQ3_S.gguf) | i1-IQ3_S | 0.9 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-IQ3_M.gguf) | i1-IQ3_M | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-Q3_K_M.gguf) | i1-Q3_K_M | 0.9 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-Q3_K_L.gguf) | i1-Q3_K_L | 1.0 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-IQ4_XS.gguf) | i1-IQ4_XS | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-IQ4_NL.gguf) | i1-IQ4_NL | 1.0 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-Q4_0.gguf) | i1-Q4_0 | 1.0 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-Q4_K_S.gguf) | i1-Q4_K_S | 1.0 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-Q4_K_M.gguf) | i1-Q4_K_M | 1.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-Q4_1.gguf) | i1-Q4_1 | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-Q5_K_S.gguf) | i1-Q5_K_S | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.2 | |
| [GGUF](https://huggingface.co/mradermacher/FluffyTail-i1-GGUF/resolve/main/FluffyTail.i1-Q6_K.gguf) | i1-Q6_K | 1.4 | practically like static Q6_K |
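One rough way to choose from the table above: take the largest quant whose file size, scaled by a memory-headroom factor, still fits your RAM/VRAM budget. A sketch (the 1.2 headroom factor is an assumption for context/runtime overhead, not a measured value):

```python
# (type, size in GB) pairs taken from the table above (subset).
QUANTS = [
    ("i1-IQ1_S", 0.5), ("i1-Q2_K", 0.8), ("i1-Q4_K_S", 1.0),
    ("i1-Q4_K_M", 1.1), ("i1-Q6_K", 1.4),
]

def best_fit(budget_gb, headroom=1.2):
    # Largest quant whose size * headroom fits; the 1.2 factor is a
    # guess at KV-cache and runtime overhead, not a benchmark.
    fitting = [(size, name) for name, size in QUANTS
               if size * headroom <= budget_gb]
    return max(fitting)[1] if fitting else None

print(best_fit(1.3))  # i1-Q4_K_S
```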
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):
![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->