diff --git a/README.md b/README.md
index 67e413e..11036d1 100644
--- a/README.md
+++ b/README.md
@@ -42,6 +42,7 @@ more details, including on how to concatenate multi-part files.
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.imatrix.gguf) | imatrix | 0.1 | imatrix file (for creating your own quants) |
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-IQ1_M.gguf) | i1-IQ1_M | 9.4 | mostly desperate |
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 10.8 | |
+| [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-IQ2_XS.gguf) | i1-IQ2_XS | 11.9 | |
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-IQ2_M.gguf) | i1-IQ2_M | 13.5 | |
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-Q2_K_S.gguf) | i1-Q2_K_S | 13.8 | very low quality |
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-Q2_K.gguf) | i1-Q2_K | 14.9 | IQ3_XXS probably better |
@@ -49,9 +50,11 @@ more details, including on how to concatenate multi-part files.
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-Q3_K_S.gguf) | i1-Q3_K_S | 17.5 | IQ3_XS probably better |
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-IQ3_M.gguf) | i1-IQ3_M | 17.9 | |
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-Q3_K_M.gguf) | i1-Q3_K_M | 19.3 | IQ3_S probably better |
+| [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-Q3_K_L.gguf) | i1-Q3_K_L | 20.9 | IQ3_M probably better |
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-IQ4_XS.gguf) | i1-IQ4_XS | 21.4 | |
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-Q4_K_S.gguf) | i1-Q4_K_S | 22.8 | optimal size/speed/quality |
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-Q4_K_M.gguf) | i1-Q4_K_M | 24.1 | fast, recommended |
+| [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-Q5_K_S.gguf) | i1-Q5_K_S | 27.5 | |
 | [GGUF](https://huggingface.co/mradermacher/IQuest-Coder-V1-40B-Base-i1-GGUF/resolve/main/IQuest-Coder-V1-40B-Base.i1-Q6_K.gguf) | i1-Q6_K | 32.7 | practically like static Q6_K |
 
 Here is a handy graph by ikawrakow comparing some lower-quality quant