---
base_model:
- CodeAtCMU/Llama-3.2-1B-GenerativePerturbations_full_sft_code_data_120K_imaginary
- hereticness/heretic_L3.2-1B-Helspteer-RM
- hereticness/heretic_FuseChat-Llama-3.2-1B-Instruct
- hereticness/Heretic-Dirty-Alice-RP-NSFW-llama-3.2-1B
- marcuscedricridia/badllama3.2-1B
- hereticness/heretic_BlackSheep-1B
library_name: transformers
tags:
- mergekit
- merge
- not-for-all-audiences
license: llama3.2
language:
- en
- es
pipeline_tag: text-generation
---
# 🐙 Scylla NSFW Aggressive 3.2 1B

Built to prioritize initiative, raw language, and sexual drive, without completely breaking coherence.

The difference from the Leviathan-series models is that the abliterated model was removed from the mix to avoid moralizing answers.

## 🔥 Target profile

- High proactivity (does not wait for instructions)
- Explicit and dominant language
- Less “instruct-style obedience”
- Controlled chaos, not empty verbiage
- The character is still carried by the base model, Dirty Alice; this merge only enhances it.
## ⚙️ Recommended inference settings (Aggressive)

**Temperature:** 1.15

**Top P:** 0.95

**Repetition penalty:** 1.08

⚠️ Not recommended for long chats (it tends to ramble).
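As a minimal sketch, the settings above map onto Hugging Face `transformers` generation kwargs as follows. The dict name, the `max_new_tokens` cap, and the placeholder model path are illustrative assumptions, not part of this card:

```python
# Recommended "Aggressive" sampling settings collected as a kwargs dict that
# can be passed to transformers' generate() or pipeline() calls.
aggressive_sampling = {
    "do_sample": True,           # temperature/top_p only take effect when sampling
    "temperature": 1.15,         # recommended above
    "top_p": 0.95,               # recommended above
    "repetition_penalty": 1.08,  # recommended above
    "max_new_tokens": 256,       # assumption: cap length, since long chats ramble
}

# Example use (requires `transformers` and the merged weights downloaded):
# from transformers import pipeline
# generator = pipeline("text-generation", model="<path-to-this-merge>")
# print(generator("Your prompt here", **aggressive_sampling)[0]["generated_text"])
```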

## ❗ Disclaimer

This model is for educational and research purposes only. The authors do not endorse the use of this model for malicious activities. The creator of this merge is not responsible for models others derive from it, and even less so if this model is used by anyone under 18 years of age.

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [hereticness/Heretic-Dirty-Alice-RP-NSFW-llama-3.2-1B](https://huggingface.co/hereticness/Heretic-Dirty-Alice-RP-NSFW-llama-3.2-1B) as the base.
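For intuition, here is a toy sketch of the Model Stock rule on flat weight vectors, written from the paper's interpolation formula as I understand it (this is an illustration, not mergekit's actual implementation):

```python
import math

def model_stock_merge(base, finetuned):
    """Toy Model Stock sketch (arXiv:2403.19522): average the fine-tuned
    weights, then interpolate toward the base with a ratio t derived from
    the mean pairwise angle between the fine-tuned deltas."""
    k = len(finetuned)
    deltas = [[w - b for w, b in zip(ft, base)] for ft in finetuned]

    def cosine(u, v):
        dot = sum(x * y for x, y in zip(u, v))
        norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(y * y for y in v))
        return dot / norm

    # mean pairwise cosine similarity between the fine-tuned deltas
    pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    cos_t = sum(cosine(deltas[i], deltas[j]) for i, j in pairs) / len(pairs)

    t = k * cos_t / ((k - 1) * cos_t + 1)  # interpolation ratio from the paper
    avg = [sum(ws) / k for ws in zip(*finetuned)]
    return [t * a + (1 - t) * b for a, b in zip(avg, base)]
```

When the fine-tuned deltas all agree (cosine 1), t is 1 and the result is their plain average; when they are orthogonal noise, t shrinks and the merge falls back toward the base.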

### Models Merged

The following models were included in the merge:

* [CodeAtCMU/Llama-3.2-1B-GenerativePerturbations_full_sft_code_data_120K_imaginary](https://huggingface.co/CodeAtCMU/Llama-3.2-1B-GenerativePerturbations_full_sft_code_data_120K_imaginary)
* [hereticness/heretic_L3.2-1B-Helspteer-RM](https://huggingface.co/hereticness/heretic_L3.2-1B-Helspteer-RM)
* [hereticness/heretic_FuseChat-Llama-3.2-1B-Instruct](https://huggingface.co/hereticness/heretic_FuseChat-Llama-3.2-1B-Instruct)
* [marcuscedricridia/badllama3.2-1B](https://huggingface.co/marcuscedricridia/badllama3.2-1B)
* [hereticness/heretic_BlackSheep-1B](https://huggingface.co/hereticness/heretic_BlackSheep-1B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: hereticness/Heretic-Dirty-Alice-RP-NSFW-llama-3.2-1B
merge_method: model_stock
dtype: bfloat16

parameters:
  t:
  - 0.45 # FuseChat – minimal structure
  - 0.65 # Helspteer – fluidity
  - 0.85 # BlackSheep – dominant dirty attitude
  - 0.25 # badllama – chaotic aggressiveness
  - 0.15 # GenerativePerturbations – slight variation

models:
- model: hereticness/heretic_FuseChat-Llama-3.2-1B-Instruct
- model: hereticness/heretic_L3.2-1B-Helspteer-RM
- model: hereticness/heretic_BlackSheep-1B
- model: marcuscedricridia/badllama3.2-1B
- model: CodeAtCMU/Llama-3.2-1B-GenerativePerturbations_full_sft_code_data_120K_imaginary
```