commit 0d72bf1bb2c8d3b4e5bf2348a5624bdb7b7911d6
Author: ModelHub XC
Date:   Sat May 9 12:22:16 2026 +0800

    Initialize project; model provided by the ModelHub XC community
    Model: mradermacher/Gelato-30B-A3B-i1-GGUF
    Source: Original Platform

diff --git a/.gitattributes b/.gitattributes
new file mode 100644
index 0000000..e32f707
--- /dev/null
+++ b/.gitattributes
@@ -0,0 +1,59 @@
+*.7z filter=lfs diff=lfs merge=lfs -text
+*.arrow filter=lfs diff=lfs merge=lfs -text
+*.bin filter=lfs diff=lfs merge=lfs -text
+*.bz2 filter=lfs diff=lfs merge=lfs -text
+*.ckpt filter=lfs diff=lfs merge=lfs -text
+*.ftz filter=lfs diff=lfs merge=lfs -text
+*.gz filter=lfs diff=lfs merge=lfs -text
+*.h5 filter=lfs diff=lfs merge=lfs -text
+*.joblib filter=lfs diff=lfs merge=lfs -text
+*.lfs.* filter=lfs diff=lfs merge=lfs -text
+*.mlmodel filter=lfs diff=lfs merge=lfs -text
+*.model filter=lfs diff=lfs merge=lfs -text
+*.msgpack filter=lfs diff=lfs merge=lfs -text
+*.npy filter=lfs diff=lfs merge=lfs -text
+*.npz filter=lfs diff=lfs merge=lfs -text
+*.onnx filter=lfs diff=lfs merge=lfs -text
+*.ot filter=lfs diff=lfs merge=lfs -text
+*.parquet filter=lfs diff=lfs merge=lfs -text
+*.pb filter=lfs diff=lfs merge=lfs -text
+*.pickle filter=lfs diff=lfs merge=lfs -text
+*.pkl filter=lfs diff=lfs merge=lfs -text
+*.pt filter=lfs diff=lfs merge=lfs -text
+*.pth filter=lfs diff=lfs merge=lfs -text
+*.rar filter=lfs diff=lfs merge=lfs -text
+*.safetensors filter=lfs diff=lfs merge=lfs -text
+saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+*.tar.* filter=lfs diff=lfs merge=lfs -text
+*.tar filter=lfs diff=lfs merge=lfs -text
+*.tflite filter=lfs diff=lfs merge=lfs -text
+*.tgz filter=lfs diff=lfs merge=lfs -text
+*.wasm filter=lfs diff=lfs merge=lfs -text
+*.xz filter=lfs diff=lfs merge=lfs -text
+*.zip filter=lfs diff=lfs merge=lfs -text
+*.zst filter=lfs diff=lfs merge=lfs -text
+*tfevents* filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.imatrix.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-IQ3_M.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-IQ3_XXS.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-IQ2_M.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-IQ4_XS.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-Q2_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-IQ1_M.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-IQ2_XXS.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-IQ2_XS.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-IQ2_S.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-IQ1_S.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-IQ3_XS.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-Q4_1.gguf filter=lfs diff=lfs merge=lfs -text
+Gelato-30B-A3B.i1-IQ3_S.gguf filter=lfs diff=lfs merge=lfs -text
diff --git a/Gelato-30B-A3B.i1-IQ1_M.gguf b/Gelato-30B-A3B.i1-IQ1_M.gguf
new file mode 100644
index 0000000..1edf912
--- /dev/null
+++ b/Gelato-30B-A3B.i1-IQ1_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ab5ebb873595683dfb7004cfe0df29409292f5705253f7a3381eaa7c902bb304
+size 7078293792
diff --git a/Gelato-30B-A3B.i1-IQ1_S.gguf b/Gelato-30B-A3B.i1-IQ1_S.gguf
new file mode 100644
index 0000000..3e76f4c
--- /dev/null
+++ b/Gelato-30B-A3B.i1-IQ1_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d98766a8bdf4ce04962e1af112b11f55919a98e895fbd4871e06376f45677edb
+size 6416511264
diff --git a/Gelato-30B-A3B.i1-IQ2_M.gguf b/Gelato-30B-A3B.i1-IQ2_M.gguf
new file mode 100644
index 0000000..8267ac3
--- /dev/null
+++ b/Gelato-30B-A3B.i1-IQ2_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:37fc5d799f88e2c515cf6a992ef4787d34250eb5161e6d88337956e7b41e8de7
+size 10169510176
diff --git a/Gelato-30B-A3B.i1-IQ2_S.gguf b/Gelato-30B-A3B.i1-IQ2_S.gguf
new file mode 100644
index 0000000..9c40131
--- /dev/null
+++ b/Gelato-30B-A3B.i1-IQ2_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:87318cf9951fc73301ccf1c7da08f2b2a978342fc75cd317cdf20d69e5df6a67
+size 9287133472
diff --git a/Gelato-30B-A3B.i1-IQ2_XS.gguf b/Gelato-30B-A3B.i1-IQ2_XS.gguf
new file mode 100644
index 0000000..e505a94
--- /dev/null
+++ b/Gelato-30B-A3B.i1-IQ2_XS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ef522eacc9cb239fc162928bf554e9dd182c39f476ee2620cf5ad9c026c05b54
+size 9076224288
diff --git a/Gelato-30B-A3B.i1-IQ2_XXS.gguf b/Gelato-30B-A3B.i1-IQ2_XXS.gguf
new file mode 100644
index 0000000..dd96644
--- /dev/null
+++ b/Gelato-30B-A3B.i1-IQ2_XXS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5df870adb1402c97d2cb72db5bb82e8d97ff6c75b705ce2fce29e8192e4acaa2
+size 8181264672
diff --git a/Gelato-30B-A3B.i1-IQ3_M.gguf b/Gelato-30B-A3B.i1-IQ3_M.gguf
new file mode 100644
index 0000000..389de6f
--- /dev/null
+++ b/Gelato-30B-A3B.i1-IQ3_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:11e2166fe82f7d306a5ae0ae87fea630a285390d245011a88dba27e67dd4a797
+size 13513064736
diff --git a/Gelato-30B-A3B.i1-IQ3_S.gguf b/Gelato-30B-A3B.i1-IQ3_S.gguf
new file mode 100644
index 0000000..980e800
--- /dev/null
+++ b/Gelato-30B-A3B.i1-IQ3_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7c349a64f401c53620230404a3f6eea3a34e115ccbbf9eba8d7e7ace4235844b
+size 13299155232
diff --git a/Gelato-30B-A3B.i1-IQ3_XS.gguf b/Gelato-30B-A3B.i1-IQ3_XS.gguf
new file mode 100644
index 0000000..fe94607
--- /dev/null
+++ b/Gelato-30B-A3B.i1-IQ3_XS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0fab9d8455b940e22364e4242f399fbf0a2381ab511571f61721ca03a1284818
+size 12598444320
diff --git a/Gelato-30B-A3B.i1-IQ3_XXS.gguf b/Gelato-30B-A3B.i1-IQ3_XXS.gguf
new file mode 100644
index 0000000..437dbb4
--- /dev/null
+++ b/Gelato-30B-A3B.i1-IQ3_XXS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5f0767f22fa6bdd1925cbeede5c33ef359f3ebaf828c48145cdde530cca2010e
+size 11849328928
diff --git a/Gelato-30B-A3B.i1-IQ4_XS.gguf b/Gelato-30B-A3B.i1-IQ4_XS.gguf
new file mode 100644
index 0000000..847944a
--- /dev/null
+++ b/Gelato-30B-A3B.i1-IQ4_XS.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d6dd1eb6e19b386adf2d5752283984b0496f60f52bc76c58baec59187b702e1b
+size 16368351520
diff --git a/Gelato-30B-A3B.i1-Q2_K.gguf b/Gelato-30B-A3B.i1-Q2_K.gguf
new file mode 100644
index 0000000..cf4ecd5
--- /dev/null
+++ b/Gelato-30B-A3B.i1-Q2_K.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c2b65cf3d6b2341484cb0c6d96d2b6aab4d1fdf62843b59c73b80e53c9774a71
+size 11258612000
diff --git a/Gelato-30B-A3B.i1-Q2_K_S.gguf b/Gelato-30B-A3B.i1-Q2_K_S.gguf
new file mode 100644
index 0000000..367dde5
--- /dev/null
+++ b/Gelato-30B-A3B.i1-Q2_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:553f2aaae6517a4d8e1e30a24b593055a80eed8a228e36cd4eaa1930ea103fcf
+size 10519365920
diff --git a/Gelato-30B-A3B.i1-Q3_K_L.gguf b/Gelato-30B-A3B.i1-Q3_K_L.gguf
new file mode 100644
index 0000000..ef8856c
--- /dev/null
+++ b/Gelato-30B-A3B.i1-Q3_K_L.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:85717d2272bbde165b7121e5fbd94561ffe273f03695a01bea7b9c913d11cba5
+size 15900672288
diff --git a/Gelato-30B-A3B.i1-Q3_K_M.gguf b/Gelato-30B-A3B.i1-Q3_K_M.gguf
new file mode 100644
index 0000000..62effd0
--- /dev/null
+++ b/Gelato-30B-A3B.i1-Q3_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:153dd9792e54a88aa252ed86c6722e5d3fdd4c82538b2c0e52eb938d335ef9e6
+size 14711849248
diff --git a/Gelato-30B-A3B.i1-Q3_K_S.gguf b/Gelato-30B-A3B.i1-Q3_K_S.gguf
new file mode 100644
index 0000000..b3f26d6
--- /dev/null
+++ b/Gelato-30B-A3B.i1-Q3_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0c7f9258e547e91ea2adf2166954973a84ba8aa6f6461f5ba0ddda47d887a319
+size 13292470560
diff --git a/Gelato-30B-A3B.i1-Q4_0.gguf b/Gelato-30B-A3B.i1-Q4_0.gguf
new file mode 100644
index 0000000..640a270
--- /dev/null
+++ b/Gelato-30B-A3B.i1-Q4_0.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:64bffaa224622f8df1052832f9fe85028ee0088933971e5cf89f3ea23025a559
+size 17379989792
diff --git a/Gelato-30B-A3B.i1-Q4_1.gguf b/Gelato-30B-A3B.i1-Q4_1.gguf
new file mode 100644
index 0000000..addcb98
--- /dev/null
+++ b/Gelato-30B-A3B.i1-Q4_1.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:90e172a67c699ad8e53027712bf7b97cfcd66bee10bf7918706d2c1fd60397c3
+size 19192502560
diff --git a/Gelato-30B-A3B.i1-Q4_K_M.gguf b/Gelato-30B-A3B.i1-Q4_K_M.gguf
new file mode 100644
index 0000000..73b0e61
--- /dev/null
+++ b/Gelato-30B-A3B.i1-Q4_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b353b25d0e193340dbf68261d930f5456adb2933a85d74be5296757d85337f45
+size 18556688672
diff --git a/Gelato-30B-A3B.i1-Q4_K_S.gguf b/Gelato-30B-A3B.i1-Q4_K_S.gguf
new file mode 100644
index 0000000..3419753
--- /dev/null
+++ b/Gelato-30B-A3B.i1-Q4_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0fd531609c71404333ae3d1335e04e65d16150e6b3e8e2e82d4dbd07dfc67d33
+size 17456011552
diff --git a/Gelato-30B-A3B.i1-Q5_K_M.gguf b/Gelato-30B-A3B.i1-Q5_K_M.gguf
new file mode 100644
index 0000000..f2fdc76
--- /dev/null
+++ b/Gelato-30B-A3B.i1-Q5_K_M.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:aa893e4a5cbe02de2944086c5f7329474f11448d095031c09ed6241df21aa0d8
+size 21725583648
diff --git a/Gelato-30B-A3B.i1-Q5_K_S.gguf b/Gelato-30B-A3B.i1-Q5_K_S.gguf
new file mode 100644
index 0000000..6c6f965
--- /dev/null
+++ b/Gelato-30B-A3B.i1-Q5_K_S.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:4ff0245af95431eb41c7240beff432053efd797ee587ebb0f9687b4624d1833c
+size 21080512800
diff --git a/Gelato-30B-A3B.i1-Q6_K.gguf b/Gelato-30B-A3B.i1-Q6_K.gguf
new file mode 100644
index 0000000..a9deb82
--- /dev/null
+++ b/Gelato-30B-A3B.i1-Q6_K.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0729abbe0d6769168c9fa7c0284e4595f551def016666de6b9942a81757294a9
+size 25092534560
diff --git a/Gelato-30B-A3B.imatrix.gguf b/Gelato-30B-A3B.imatrix.gguf
new file mode 100644
index 0000000..49287a5
--- /dev/null
+++ b/Gelato-30B-A3B.imatrix.gguf
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:52588e065587cd30cd4734e9b8236fadffb4a32d633a402f927b8baef7f89024
+size 122029312
diff --git a/README.md b/README.md
new file mode 100644
index 0000000..ebe40f2
--- /dev/null
+++ b/README.md
@@ -0,0 +1,88 @@
+---
+base_model: mlfoundations/Gelato-30B-A3B
+datasets:
+- mlfoundations/Click-100k
+language:
+- en
+library_name: transformers
+license: apache-2.0
+mradermacher:
+  readme_rev: 1
+quantized_by: mradermacher
+---
+## About
+
+weighted/imatrix quants of https://huggingface.co/mlfoundations/Gelato-30B-A3B
+
+***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#Gelato-30B-A3B-i1-GGUF).***
+
+Static quants are available at https://huggingface.co/mradermacher/Gelato-30B-A3B-GGUF
+
+**This is a vision model - mmproj files (if any) will be in the [static repository](https://huggingface.co/mradermacher/Gelato-30B-A3B-GGUF).**
+
+## Usage
+
+If you are unsure how to use GGUF files, refer to one of [TheBloke's
+READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
+more details, including how to concatenate multi-part files.
+
+## Provided Quants
+
+(sorted by size, not necessarily quality; IQ-quants are often preferable over similar-sized non-IQ quants)
+
+| Link | Type | Size/GB | Notes |
+|:-----|:-----|--------:|:------|
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.imatrix.gguf) | imatrix | 0.2 | imatrix file (for creating your own quants) |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-IQ1_S.gguf) | i1-IQ1_S | 6.5 | for the desperate |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-IQ1_M.gguf) | i1-IQ1_M | 7.2 | mostly desperate |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 8.3 | |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 9.2 | |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-IQ2_S.gguf) | i1-IQ2_S | 9.4 | |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-IQ2_M.gguf) | i1-IQ2_M | 10.3 | |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-Q2_K_S.gguf) | i1-Q2_K_S | 10.6 | very low quality |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-Q2_K.gguf) | i1-Q2_K | 11.4 | IQ3_XXS probably better |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 11.9 | lower quality |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 12.7 | |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 13.4 | IQ3_XS probably better |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-IQ3_S.gguf) | i1-IQ3_S | 13.4 | beats Q3_K* |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-IQ3_M.gguf) | i1-IQ3_M | 13.6 | |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 14.8 | IQ3_S probably better |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 16.0 | IQ3_M probably better |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 16.5 | |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-Q4_0.gguf) | i1-Q4_0 | 17.5 | fast, low quality |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 17.6 | optimal size/speed/quality |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 18.7 | fast, recommended |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-Q4_1.gguf) | i1-Q4_1 | 19.3 | |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 21.2 | |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 21.8 | |
+| [GGUF](https://huggingface.co/mradermacher/Gelato-30B-A3B-i1-GGUF/resolve/main/Gelato-30B-A3B.i1-Q6_K.gguf) | i1-Q6_K | 25.2 | practically like static Q6_K |
+
+Here is a handy graph by ikawrakow comparing some lower-quality quant
+types (lower is better):
+
+![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
+
+And here are Artefact2's thoughts on the matter:
+https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
+
+## FAQ / Model Request
+
+See https://huggingface.co/mradermacher/model_requests for some answers to
+questions you might have and/or if you want some other model quantized.
+
+## Thanks
+
+I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
+me use its servers and providing upgrades to my workstation to enable
+this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss)
+for giving me access to his private supercomputer, enabling me to provide
+many more imatrix quants, at much higher quality, than I would otherwise
+be able to.
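Each `.gguf` file added in this commit is stored as a Git LFS pointer: a three-line text file with `version`, `oid`, and `size` fields, while the actual model data lives in LFS storage. As a minimal sketch (the `parse_lfs_pointer` helper is illustrative, not part of this repository), such a pointer can be read like this:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields.

    Each line has the form "<key> <value>"; the value may itself
    contain no spaces, so splitting on the first space is enough.
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields


# Pointer contents of Gelato-30B-A3B.i1-Q4_K_M.gguf, as committed above.
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:b353b25d0e193340dbf68261d930f5456adb2933a85d74be5296757d85337f45\n"
    "size 18556688672\n"
)

info = parse_lfs_pointer(pointer)
print(info["oid"])   # expected blob hash, prefixed with the hash algorithm
print(info["size"])  # size of the real file in bytes
```

After `git lfs pull` replaces the pointer with the real file, the `size` field can be checked against the file on disk and the `oid` against its SHA-256 digest to verify the download.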