auto-patch README.md

Author:       team mradermacher
Date:         2025-07-31 09:11:07 +00:00
Committed by: system
Parent:       10e9dfec56
Commit:       f060cbff37


@@ -3,6 +3,8 @@ base_model: Mantis2024/Dirty-Shirley-v1-9B-Uncensored-TIES
 language:
 - en
 library_name: transformers
+mradermacher:
+  readme_rev: 1
 quantized_by: mradermacher
 tags:
 - mergekit
@@ -18,6 +20,9 @@ tags:
 static quants of https://huggingface.co/Mantis2024/Dirty-Shirley-v1-9B-Uncensored-TIES
 
 <!-- provided-files -->
+
+***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#Dirty-Shirley-v1-9B-Uncensored-TIES-GGUF).***
+
 weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
 
 ## Usage
@@ -61,6 +66,6 @@ questions you might have and/or if you want some other model quantized.
 
 I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
 me use its servers and providing upgrades to my workstation to enable
-this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
+this work in my free time.
 
 <!-- end -->
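
The patch above adds a `mradermacher.readme_rev` key to the model card's YAML front matter. As a minimal sketch of how a consumer might read that revision marker back out, assuming the front matter shape shown in the diff (the `readme_rev` helper and the sample README string are hypothetical, and a regex is used here only to avoid a PyYAML dependency):

```python
import re

# Hypothetical README excerpt reflecting the front matter after this patch.
readme = """---
base_model: Mantis2024/Dirty-Shirley-v1-9B-Uncensored-TIES
language:
- en
library_name: transformers
mradermacher:
  readme_rev: 1
quantized_by: mradermacher
---
static quants of the base model
"""

def readme_rev(text):
    # Pull the YAML front matter block delimited by --- markers.
    block = re.search(r"^---\n(.*?)\n---", text, re.S)
    if not block:
        return None
    # Find an indented readme_rev entry and return it as an int.
    rev = re.search(r"^\s+readme_rev:\s*(\d+)", block.group(1), re.M)
    return int(rev.group(1)) if rev else None

print(readme_rev(readme))  # 1
```

A real consumer would parse the front matter with a YAML library instead of a regex; the sketch only illustrates where the new key lives.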