Initialize project; model provided by the ModelHub XC community
Model: mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF Source: Original Platform
.gitattributes (vendored, new file, 60 lines)
@@ -0,0 +1,60 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
imatrix.dat filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q4_0_4_4.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q4_0_8_8.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q4_0_4_8.gguf filter=lfs diff=lfs merge=lfs -text
OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9781bb2a23117d37630fa080e976bc6c67f2ee274d73827b43e4adb58419a7c4
size 1754455840
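Each three-line hunk above is a Git LFS pointer file: because of the `filter=lfs` rules in `.gitattributes`, the repository stores only this small pointer while the actual weights live in LFS storage. A minimal sketch of reading such a pointer (the function name is my own; only the `version`/`oid`/`size` keys shown above are assumed):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS v1 pointer file into its components.

    A pointer is a series of `key value` lines; `version` must identify
    the LFS v1 spec, and `oid` carries a `<hash-algo>:<digest>` pair.
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    if not fields.get("version", "").startswith("https://git-lfs.github.com/spec/v1"):
        raise ValueError("not a Git LFS v1 pointer")
    algo, _, digest = fields["oid"].partition(":")
    return {"algorithm": algo, "digest": digest, "size": int(fields["size"])}

# The first pointer hunk from this commit:
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:9781bb2a23117d37630fa080e976bc6c67f2ee274d73827b43e4adb58419a7c4
size 1754455840
"""
info = parse_lfs_pointer(pointer)
print(info["algorithm"], info["size"])  # sha256 1754455840
```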
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:dd3b96d8588f34796491b796e955716c9fe35486b31495d5caeae0fb62a9ee42
size 1612111648
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:19780ad72c2092f2746cd0a96ddbc5c85c2d47b6cca0c36ada284e4d49da3ed8
size 2500723296
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6eca8a3c8e1c4be99a1d40efc83483acffcb7638e70873162e4130de42fe3ab4
size 2310931040
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4e13609adfc7db6f5bec4899684af214bf4fe222371f0b56f69fe06c16834376
size 2198265632
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ef875438579e80a7ce4a832a44dd9ca2169004a2aa775f155c9e164ff0288bd3
size 1991696160
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:72d6371b9b6df03c073db5015b824908fd4d25cc4838f7d7e62eba1e783c544c
size 3284903584
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:63c71228e2a1355848d7e1e36d062c6f7e6116d7909978c9ed909705ad2ea631
size 3182405280
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:af5e64eaeeb59e23c1c02a076aca3f784b785db562d2160b50364af3ba75dd44
size 3018827424
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3360251367c996ee6d1e49616f692c82b070488ffc4dc216ac7f3419946eca76
size 2827354720
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d64fe962dbb510ff177e32af68526b6d8c1a73e882efb3bd91f468f01e781973
size 3907701216
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:bc5b5a00a3df40607f6c091145d56e1e8e71478252a9e3a931ad49a4741986be
size 2719253344
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f173463d348e2223d83bcd8d82b06fe5918563e239bd18c498cf78f906b10fc7
size 3822036640
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:020a3e630d1307e55377bb787f30ebc43662aae65d8e45d86818e333c17bb1eb
size 3518998176
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:457910108fcfa5facfe26f6a9ffcea4529943e03074599698501d7e7b80901ba
size 3164579488
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1f97c910e43c223541a0a7ec15bff2a883fe9d3cac6dff67c3de258ad77370b6
size 4123609824
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:fcc7cc9963c53ac272e5480295daa7d824b6e795001c0ed7b696855e64ae874b
size 4108929760
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:945deae3d1367a3d25aa378a89cd3f5a822b1e4745c96aacf8524029f4a5806d
size 4108929760
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c095e0102e0b2f5e07748b10dc6033126e8e25e516a23ea51cedbbe44706c22c
size 4108929760
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e96668da975d160442439b45c53accefe54e8599d9daa137c0d62c49f81870fd
size 4368452320
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d3908c5fc03f22b16b7c025070d7cda9c0dca65d638850fee8d77a2c75655df3
size 4140387040
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d8d3188cb9ac425b584ea8291165737227a8f16c2b82c8cb428d72fcbbbb540c
size 5131423456
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cd862f3d65a9e994026fe744a2e312f57b412dfc2d3fac1291c9ad86a932e657
size 4997730016
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:aeea2df37fb886af9dd748a961f4c721bbf4bc6b338f1ce88c03d4c1c75ba171
size 5942080288
README.md (new file, 84 lines)
@@ -0,0 +1,84 @@
---
base_model: sayhan/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA
datasets:
- sayhan/strix-philosophy-qa
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- trl
- text-generation-inference
- unsloth
- mistral
- gguf
---

## About

<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/sayhan/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA

<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-GGUF

## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
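Multi-part GGUF releases are plain byte-level splits, so joining them is a straight concatenation of the parts in order. The quants in this repository are single files, so the helper below is a hypothetical sketch for illustration (the function and part names are my own):

```python
from pathlib import Path

def join_gguf_parts(parts: list[Path], output: Path, chunk: int = 1 << 20) -> int:
    """Byte-concatenate split GGUF parts, in listed order, into one file.

    Returns the total number of bytes written; the result should match
    the size of the original unsplit GGUF file.
    """
    written = 0
    with output.open("wb") as out:
        for part in parts:
            with part.open("rb") as src:
                # Stream in chunks so multi-GB parts never sit in memory.
                while block := src.read(chunk):
                    out.write(block)
                    written += len(block)
    return written

# Hypothetical part names, for illustration only:
# join_gguf_parts([Path("model.gguf.part1of2"), Path("model.gguf.part2of2")],
#                 Path("model.gguf"))
```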

## Provided Quants

(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ1_S.gguf) | i1-IQ1_S | 1.7 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ1_M.gguf) | i1-IQ1_M | 1.9 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.1 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.3 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ2_S.gguf) | i1-IQ2_S | 2.4 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ2_M.gguf) | i1-IQ2_M | 2.6 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q2_K.gguf) | i1-Q2_K | 2.8 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 2.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.3 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ3_S.gguf) | i1-IQ3_S | 3.3 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ3_M.gguf) | i1-IQ3_M | 3.4 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q3_K_M.gguf) | i1-Q3_K_M | 3.6 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q3_K_L.gguf) | i1-Q3_K_L | 3.9 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 4.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 4.2 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 4.2 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q4_0.gguf) | i1-Q4_0 | 4.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q4_K_M.gguf) | i1-Q4_K_M | 4.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA-i1-GGUF/resolve/main/OpenHermes-2.5-Strix-Philosophy-Mistral-7B-LoRA.i1-Q6_K.gguf) | i1-Q6_K | 6.0 | practically like static Q6_K |

Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9

## FAQ / Model Request

See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.

## Thanks

I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.

<!-- end -->
imatrix.dat (new file, 3 lines)
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:dc033fe8da5886136730a33e849615052afb38ad84fd4c735947323eb45886e5
size 4988157