Initialize project; model provided by the ModelHub XC community
Model: mradermacher/danube2-upscale-1.7-i1-GGUF Source: Original Platform
.gitattributes (vendored, new file, 60 lines)
@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
imatrix.dat filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-IQ4_NL.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
danube2-upscale-1.7.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
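Each `.gitattributes` entry above maps a glob pattern to the attributes that route matching files through Git LFS (`filter`, `diff`, `merge` set to `lfs`; `text` unset). As a minimal sketch, these entries can be parsed like so (the `parse_gitattributes_line` helper is hypothetical, not part of this repo):

```python
def parse_gitattributes_line(line: str):
    """Split one .gitattributes line into (pattern, {attribute: value}).

    Illustrative sketch only: handles key=value, bare, and "-" (unset) attributes.
    """
    pattern, *attrs = line.split()
    parsed = {}
    for attr in attrs:
        if "=" in attr:
            key, value = attr.split("=", 1)
        elif attr.startswith("-"):
            key, value = attr[1:], False  # e.g. "-text" unsets the attribute
        else:
            key, value = attr, True       # bare attribute is simply set
        parsed[key] = value
    return pattern, parsed

pattern, attrs = parse_gitattributes_line(
    "*.gguf filter=lfs diff=lfs merge=lfs -text"
)
print(pattern, attrs)
# → *.gguf {'filter': 'lfs', 'diff': 'lfs', 'merge': 'lfs', 'text': False}
```

The `-text` unset matters: it stops Git from attempting line-ending normalization on the binary model files.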
README.md (new file, 87 lines)
@@ -0,0 +1,87 @@
---
base_model: Lambent/danube2-upscale-1.7
datasets:
- HuggingFaceTB/cosmopedia-100k
- Vezora/Tested-22k-Python-Alpaca
- sordonia/redpajama-sample_from_valid_all
- nampdn-ai/tiny-bridgedict
- teknium/GPTeacher-General-Instruct
- Severian/Internal-Knowledge-Map
- Severian/Internal-Knowledge-Map-StoryWriter-RolePlaying
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About

<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/Lambent/danube2-upscale-1.7

<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/danube2-upscale-1.7-GGUF
## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
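Every quant in this repo resolves to a direct-download URL of the form used by the links in the table below. A small sketch that reconstructs such a URL from a quant type name (the `quant_url` helper is illustrative, not an official API):

```python
REPO = "mradermacher/danube2-upscale-1.7-i1-GGUF"
BASE = "danube2-upscale-1.7"

def quant_url(quant: str) -> str:
    """Build the direct-download URL for one quant file in this repo."""
    return (f"https://huggingface.co/{REPO}/resolve/main/"
            f"{BASE}.i1-{quant}.gguf")

print(quant_url("Q4_K_M"))
# → https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q4_K_M.gguf
```

The same file can also be fetched with `huggingface-cli download`, passing the repo id and filename.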
## Provided Quants

(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-IQ1_S.gguf) | i1-IQ1_S | 0.6 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-IQ1_M.gguf) | i1-IQ1_M | 0.7 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-IQ2_S.gguf) | i1-IQ2_S | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-IQ2_M.gguf) | i1-IQ2_M | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.9 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q2_K.gguf) | i1-Q2_K | 1.0 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 1.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-IQ3_XS.gguf) | i1-IQ3_XS | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q3_K_S.gguf) | i1-Q3_K_S | 1.1 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-IQ3_S.gguf) | i1-IQ3_S | 1.1 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-IQ3_M.gguf) | i1-IQ3_M | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q3_K_M.gguf) | i1-Q3_K_M | 1.2 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q3_K_L.gguf) | i1-Q3_K_L | 1.3 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-IQ4_XS.gguf) | i1-IQ4_XS | 1.3 | |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q4_0.gguf) | i1-Q4_0 | 1.4 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-IQ4_NL.gguf) | i1-IQ4_NL | 1.4 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q4_K_S.gguf) | i1-Q4_K_S | 1.4 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q4_K_M.gguf) | i1-Q4_K_M | 1.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q4_1.gguf) | i1-Q4_1 | 1.5 | |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q5_K_S.gguf) | i1-Q5_K_S | 1.7 | |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.7 | |
| [GGUF](https://huggingface.co/mradermacher/danube2-upscale-1.7-i1-GGUF/resolve/main/danube2-upscale-1.7.i1-Q6_K.gguf) | i1-Q6_K | 1.9 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9

## FAQ / Model Request

See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.

## Thanks

I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.

<!-- end -->
danube2-upscale-1.7.i1-IQ1_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ab4d635e5e239c05447cd4940743186fe5fd6c20b278e1191c939ca7a9f296c0
size 570829248
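The three-line block above is a Git LFS pointer, not the model weights themselves: it records the pointer spec version, the SHA-256 of the real file, and the file's size in bytes. A minimal parser sketch (the `parse_lfs_pointer` helper is hypothetical):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields.

    Sketch only: each pointer line is "key value", and size is an integer
    byte count.
    """
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    fields["size"] = int(fields["size"])
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:ab4d635e5e239c05447cd4940743186fe5fd6c20b278e1191c939ca7a9f296c0
size 570829248
"""
info = parse_lfs_pointer(pointer)
print(info["size"] / 2**30)  # actual model file size in GiB
```

The same layout repeats for every `.gguf` file and `imatrix.dat` below; only the oid and size differ.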
danube2-upscale-1.7.i1-IQ1_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8ae9dac944874c9499e083180957cc5e0b03e1022762b3386f8207399c203152
size 528988608

danube2-upscale-1.7.i1-IQ2_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f2607a98c0076852825d0b10e1121e1877a0f9fca3d25c8bdba4425ef4c34cce
size 799642048

danube2-upscale-1.7.i1-IQ2_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9ef1e5e9990d42a4a689b392b54bbf6ff9d9f9370ad77e5ca0c9a663f7d6f1e7
size 743854528

danube2-upscale-1.7.i1-IQ2_XS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a72dfc9b60f8116d45fcecdd525271308d8a7a127deb5aad7f5c8631750320e4
size 702495168

danube2-upscale-1.7.i1-IQ2_XXS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2a6ead6c920d722b58400050742d4434fa717bcc3513ee22c62d4d030a82af9c
size 640563648

danube2-upscale-1.7.i1-IQ3_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:54c9a08d82ef6cf56f0eaa70e9dc1d1563ae1ddc811ea005b126e0139b6a52a3
size 1038940608

danube2-upscale-1.7.i1-IQ3_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4bef6e68864ff4df51c79c1f19de84f691ac8c5d779c56af0842509770e89168
size 1005778368

danube2-upscale-1.7.i1-IQ3_XS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:93dbe0f8c14929f6ce48b7269b671b6e0c57573a03a6aea78b32b489474f9a57
size 956104128

danube2-upscale-1.7.i1-IQ3_XXS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:595e041489e865ba536d32f66958fa37dfdef1bd77a84c526bcf76ff763be178
size 893368768

danube2-upscale-1.7.i1-IQ4_NL.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a640c1c0ea95e7797827e42e9cf714241d1d7f0e5626d0c832555770245ee9c3
size 1293061568

danube2-upscale-1.7.i1-IQ4_XS.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c321488e5842a509672d800648f584f2ce2fdebd374d286e0873368049c4339c
size 1226911168

danube2-upscale-1.7.i1-Q2_K.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:42060190f4c28a7d2512bd519de294e7f8f5807b89312bdf6216c040fa2cd9ff
size 864671168

danube2-upscale-1.7.i1-Q2_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e1b2e17761fd93eb4e49ff7b2e02c73471e03408a6bfe48b59b15987b534a16a
size 803231168

danube2-upscale-1.7.i1-Q3_K_L.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9a14eba969c7a961e71e893c8b20a80735bdf0f1525d6b183566022e849d8d0c
size 1199467968

danube2-upscale-1.7.i1-Q3_K_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:24de95fd13e3dd160e52cabce6304f0afb105b7a33b7be54366dda42159fec03
size 1105014208

danube2-upscale-1.7.i1-Q3_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9f6374333019dd083ed7b73abb0604088fe9677cbf1ef95beb76b877bb295579
size 999250368

danube2-upscale-1.7.i1-Q4_0.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c0b8ae755fdaddf6509ea73177a0b0937b462e3210b5bf40502041e778af7f14
size 1290235328

danube2-upscale-1.7.i1-Q4_1.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:79160bb91d773213f888441d9f0cabab9cbd39c88eba3fdae22587e58d385046
size 1422290368

danube2-upscale-1.7.i1-Q4_K_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aafec3e6453280f6525b2da471fcd5c99e3fb9935caf384eafc228e70f87314a
size 1356698048

danube2-upscale-1.7.i1-Q4_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3a800dd0a3b397a764012c4a98cbd02c98c7aca7dd08fb74e7a810ed94fa0a7c
size 1294372288

danube2-upscale-1.7.i1-Q5_K_M.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a84529106a987fe681d315a5a3d153921d3f0b2bea24747d43a86656bac0e422
size 1593610688

danube2-upscale-1.7.i1-Q5_K_S.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:804090d040623db7bbba219d63000bdfe42b880de4200618fd7fb78737744474
size 1557663168

danube2-upscale-1.7.i1-Q6_K.gguf (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:520d8ff2c61ece0c54a57443663c3a92fa516eaf7430a7b67898723641d25aec
size 1845330368

imatrix.dat (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b7b45c3743988b75e5e23d54657a1d17c5683ed75170b6bb55112905de181a32
size 2679595