auto-patch README.md

This commit is contained in:
team mradermacher
2025-07-13 04:51:44 +00:00
committed by system
parent 8a9a5a8d5b
commit eb7aaf7863


@@ -1,5 +1,7 @@
 ---
 base_model: SWE-bench/SWE-agent-LM-7B
+datasets:
+- SWE-bench/SWE-smith
 language:
 - en
 library_name: transformers
@@ -9,11 +11,8 @@ mradermacher:
 readme_rev: 1
 quantized_by: mradermacher
 tags:
-- code
-- codeqwen
-- chat
-- qwen
-- qwen-coder
+- agent
+- software engineering
 ---
 
 ## About
@@ -28,7 +27,7 @@ static quants of https://huggingface.co/SWE-bench/SWE-agent-LM-7B
 
 ***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#SWE-agent-LM-7B-GGUF).***
 
-weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
+weighted/imatrix quants are available at https://huggingface.co/mradermacher/SWE-agent-LM-7B-i1-GGUF
 
 ## Usage
 
 If you are unsure how to use GGUF files, refer to one of [TheBloke's
@@ -45,6 +44,7 @@ more details, including on how to concatenate multi-part files.
 | [GGUF](https://huggingface.co/mradermacher/SWE-agent-LM-7B-GGUF/resolve/main/SWE-agent-LM-7B.Q3_K_S.gguf) | Q3_K_S | 3.6 | |
 | [GGUF](https://huggingface.co/mradermacher/SWE-agent-LM-7B-GGUF/resolve/main/SWE-agent-LM-7B.Q3_K_M.gguf) | Q3_K_M | 3.9 | lower quality |
 | [GGUF](https://huggingface.co/mradermacher/SWE-agent-LM-7B-GGUF/resolve/main/SWE-agent-LM-7B.Q3_K_L.gguf) | Q3_K_L | 4.2 | |
+| [GGUF](https://huggingface.co/mradermacher/SWE-agent-LM-7B-GGUF/resolve/main/SWE-agent-LM-7B.IQ4_XS.gguf) | IQ4_XS | 4.4 | |
 | [GGUF](https://huggingface.co/mradermacher/SWE-agent-LM-7B-GGUF/resolve/main/SWE-agent-LM-7B.Q4_K_S.gguf) | Q4_K_S | 4.6 | fast, recommended |
 | [GGUF](https://huggingface.co/mradermacher/SWE-agent-LM-7B-GGUF/resolve/main/SWE-agent-LM-7B.Q4_K_M.gguf) | Q4_K_M | 4.8 | fast, recommended |
 | [GGUF](https://huggingface.co/mradermacher/SWE-agent-LM-7B-GGUF/resolve/main/SWE-agent-LM-7B.Q5_K_S.gguf) | Q5_K_S | 5.4 | |
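Every row in the quant table points at the same `resolve/main` URL pattern, varying only the quant tag. A minimal sketch of that pattern (plain Python; the function name and constants are my own, not part of the repo):

```python
# Build the direct-download URL for one quant of this GGUF repo,
# following the resolve/main pattern visible in the table above.
REPO = "mradermacher/SWE-agent-LM-7B-GGUF"   # quant repo on Hugging Face
MODEL = "SWE-agent-LM-7B"                    # base file-name stem

def quant_url(quant: str, repo: str = REPO, model: str = MODEL) -> str:
    """Return the download URL for a quant tag such as 'Q4_K_S' or 'IQ4_XS'."""
    return f"https://huggingface.co/{repo}/resolve/main/{model}.{quant}.gguf"

print(quant_url("Q4_K_S"))
```

Any quant tag from the table (for example the newly added `IQ4_XS`) can be substituted to reproduce the corresponding link.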