Initialize project; model provided by the ModelHub XC community

Model: Novaciano/Scylla_NSFW_Aggresive-3.2-1B
Source: Original Platform
Commit 3e87f93bb1
Author: ModelHub XC
Date: 2026-04-26 13:19:06 +08:00
11 changed files with 2286 additions and 0 deletions

.gitattributes (vendored, new file, 36 lines)

@@ -0,0 +1,36 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
tokenizer.json filter=lfs diff=lfs merge=lfs -text
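The rules above route large binaries through Git LFS. As an illustrative check of which repository paths a set of glob rules like this would capture, Python's `fnmatch` approximates (but does not exactly replicate) gitattributes matching semantics — a sketch with a subset of the patterns:

```python
from fnmatch import fnmatch

# Subset of the LFS patterns above (illustrative, not the full list)
patterns = ["*.safetensors", "*.bin", "tokenizer.json", "saved_model/**/*"]

def lfs_tracked(path):
    """Return True if `path` matches any of the LFS patterns."""
    return any(fnmatch(path, p) for p in patterns)

print(lfs_tracked("model-00001-of-00002.safetensors"))  # True
print(lfs_tracked("tokenizer.json"))                    # True  (tracked by name)
print(lfs_tracked("config.json"))                       # False (stays a plain-text file)
```

Note the explicit `tokenizer.json` entry: it is tracked by exact name rather than by extension, because large tokenizer files would otherwise bloat the plain git history.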

README.md (new file, 88 lines)

@@ -0,0 +1,88 @@
---
base_model:
- >-
CodeAtCMU/Llama-3.2-1B-GenerativePerturbations_full_sft_code_data_120K_imaginary
- hereticness/heretic_L3.2-1B-Helspteer-RM
- hereticness/heretic_FuseChat-Llama-3.2-1B-Instruct
- hereticness/Heretic-Dirty-Alice-RP-NSFW-llama-3.2-1B
- marcuscedricridia/badllama3.2-1B
- hereticness/heretic_BlackSheep-1B
library_name: transformers
tags:
- mergekit
- merge
- not-for-all-audiences
license: llama3.2
language:
- en
- es
pipeline_tag: text-generation
---
# 🐙 Scylla NSFW Aggressive 3.2 1B
Built to prioritize initiative, raw language, and sexual drive without completely breaking coherence.
The difference from the models in the Leviathan series is that the abliterated model was removed from the mix to avoid moralizing answers.
## 🔥 Target profile
- High proactivity (does not wait for orders)
- Explicit and dominant language
- Less "instruct-style obedience"
- Controlled chaos, not empty verbiage
- The character is still carried by the Dirty Alice base model; this merge only enhances it.
## ⚙️ Recommended inference settings (Aggressive)
- **Temperature:** 1.15
- **Top P:** 0.95
- **Repetition penalty:** 1.08
⚠️ Not recommended for long chats (it can ramble).
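These knobs correspond to standard sampling-time transforms on the next-token logits. A minimal, dependency-free sketch of how they interact (toy logits over a four-token vocabulary, illustrative only — real inference applies this at every step over the full vocabulary):

```python
import math

def shape_logits(logits, prev_tokens, temperature=1.15, top_p=0.95, rep_penalty=1.08):
    """Toy sketch: repetition penalty, then temperature, then top-p filtering."""
    logits = list(logits)
    # Repetition penalty: make already-generated tokens less likely
    for t in set(prev_tokens):
        logits[t] = logits[t] / rep_penalty if logits[t] > 0 else logits[t] * rep_penalty
    # Temperature > 1 flattens the distribution (bolder word choices)
    logits = [l / temperature for l in logits]
    # Softmax (shifted by the max for numerical stability)
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    probs = [e / sum(exps) for e in exps]
    # Top-p: keep the smallest high-probability set whose mass reaches top_p
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    kept, mass = [], 0.0
    for i in order:
        kept.append(i)
        mass += probs[i]
        if mass >= top_p:
            break
    total = sum(probs[i] for i in kept)
    return {i: probs[i] / total for i in kept}

# Token 0 was already generated, so the penalty nudges it down
dist = shape_logits([2.0, 1.0, 0.5, -1.0], prev_tokens=[0])
```

The high temperature plus a fairly loose top-p is what makes the profile "aggressive" rather than cautious, and is also why long chats can drift off into rambling.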
## ❗ Disclaimer
This model is for educational and research purposes only. The authors do not endorse its use for malicious activities. The creator of this merge is not responsible for models that others derive from it, still less if the model is used by anyone under 18 years of age.
### Merge Method
This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method using [hereticness/Heretic-Dirty-Alice-RP-NSFW-llama-3.2-1B](https://huggingface.co/hereticness/Heretic-Dirty-Alice-RP-NSFW-llama-3.2-1B) as a base.
### Models Merged
The following models were included in the merge:
* [CodeAtCMU/Llama-3.2-1B-GenerativePerturbations_full_sft_code_data_120K_imaginary](https://huggingface.co/CodeAtCMU/Llama-3.2-1B-GenerativePerturbations_full_sft_code_data_120K_imaginary)
* [hereticness/heretic_L3.2-1B-Helspteer-RM](https://huggingface.co/hereticness/heretic_L3.2-1B-Helspteer-RM)
* [hereticness/heretic_FuseChat-Llama-3.2-1B-Instruct](https://huggingface.co/hereticness/heretic_FuseChat-Llama-3.2-1B-Instruct)
* [marcuscedricridia/badllama3.2-1B](https://huggingface.co/marcuscedricridia/badllama3.2-1B)
* [hereticness/heretic_BlackSheep-1B](https://huggingface.co/hereticness/heretic_BlackSheep-1B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: hereticness/Heretic-Dirty-Alice-RP-NSFW-llama-3.2-1B
merge_method: model_stock
dtype: bfloat16
parameters:
  t:
    - 0.45 # FuseChat: minimal structure
    - 0.65 # Helspteer: fluidity
    - 0.85 # BlackSheep: dominant dirty attitude
    - 0.25 # badllama: chaotic aggressiveness
    - 0.15 # GenerativePerturbations: slight variation
models:
  - model: hereticness/heretic_FuseChat-Llama-3.2-1B-Instruct
  - model: hereticness/heretic_L3.2-1B-Helspteer-RM
  - model: hereticness/heretic_BlackSheep-1B
  - model: marcuscedricridia/badllama3.2-1B
  - model: CodeAtCMU/Llama-3.2-1B-GenerativePerturbations_full_sft_code_data_120K_imaginary
```
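For reference, the Model Stock method linked above interpolates between the base weights and the average of the fine-tuned weights, deriving the interpolation ratio from the angle between the fine-tuned task vectors. A toy sketch of that idea on flat weight vectors (not mergekit's actual per-tensor implementation; assumes at least two fine-tuned models):

```python
import math

def model_stock(base, finetuned):
    """Toy Model Stock merge on flat weight vectors (needs >= 2 fine-tuned models)."""
    k = len(finetuned)
    # Task vectors: each fine-tuned model's weights relative to the base
    deltas = [[w - b for w, b in zip(ft, base)] for ft in finetuned]
    avg_delta = [sum(col) / k for col in zip(*deltas)]

    def cos(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(a * a for a in v)))

    # Mean pairwise cosine between task vectors estimates cos(theta)
    pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    cos_theta = sum(cos(deltas[i], deltas[j]) for i, j in pairs) / len(pairs)
    # Interpolation ratio from the paper: t = k*cos / (1 + (k-1)*cos)
    t = k * cos_theta / (1 + (k - 1) * cos_theta)
    # Merged = base + t * average(task vectors)
    return [b + t * d for b, d in zip(base, avg_delta)]

# Two fine-tunes that mostly agree: the merge stays close to their average
merged = model_stock(base=[0.0, 0.0], finetuned=[[1.0, 0.1], [1.0, -0.1]])
```

Intuitively, the more the fine-tuned models agree (larger cos θ), the closer the merge sits to their average; disagreement pulls it back toward the base model.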

config.json (new file, 41 lines)

@@ -0,0 +1,41 @@
{
"architectures": [
"LlamaForCausalLM"
],
"attention_bias": false,
"attention_dropout": 0.0,
"bos_token_id": 128000,
"dtype": "bfloat16",
"eos_token_id": [
128001,
128008,
128009
],
"head_dim": 64,
"hidden_act": "silu",
"hidden_size": 2048,
"initializer_range": 0.02,
"intermediate_size": 8192,
"max_position_embeddings": 131072,
"mlp_bias": false,
"model_type": "llama",
"num_attention_heads": 32,
"num_hidden_layers": 16,
"num_key_value_heads": 8,
"pad_token_id": 128004,
"pretraining_tp": 1,
"rms_norm_eps": 1e-05,
"rope_scaling": {
"factor": 32.0,
"high_freq_factor": 4.0,
"low_freq_factor": 1.0,
"original_max_position_embeddings": 8192,
"rope_type": "llama3"
},
"rope_theta": 500000.0,
"tie_word_embeddings": true,
"transformers_version": "4.57.1",
"unsloth_version": "2025.3.19",
"use_cache": true,
"vocab_size": 128256
}
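The architecture numbers in this config are internally consistent and imply roughly the 1B parameter count in the model's name. A quick sanity check with the values hard-coded (norm and rotary parameters ignored, so the count is approximate):

```python
cfg = {
    "hidden_size": 2048, "num_attention_heads": 32, "head_dim": 64,
    "num_key_value_heads": 8, "num_hidden_layers": 16,
    "intermediate_size": 8192, "vocab_size": 128256,
}

# Head width times head count must equal the model width
assert cfg["hidden_size"] == cfg["num_attention_heads"] * cfg["head_dim"]

# Grouped-query attention: 32 query heads share 8 KV heads
gqa_group = cfg["num_attention_heads"] // cfg["num_key_value_heads"]  # 4 queries per KV head

# Rough parameter count: attention (q, k, v, o) + SwiGLU MLP per layer,
# plus the (tied) embedding matrix
h, i = cfg["hidden_size"], cfg["intermediate_size"]
kv = cfg["num_key_value_heads"] * cfg["head_dim"]
per_layer = 2 * h * h + 2 * h * kv + 3 * h * i
total = per_layer * cfg["num_hidden_layers"] + cfg["vocab_size"] * h
print(f"~{total / 1e9:.2f}B parameters")  # prints "~1.24B parameters"
```

`tie_word_embeddings: true` means the output head reuses the embedding matrix, which is why the 128256-entry vocabulary is counted only once here.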

mergekit_config.yml (new file, 18 lines)

@@ -0,0 +1,18 @@
base_model: hereticness/Heretic-Dirty-Alice-RP-NSFW-llama-3.2-1B
merge_method: model_stock
dtype: bfloat16
parameters:
  t:
    - 0.45 # FuseChat: minimal structure
    - 0.65 # Helspteer: fluidity
    - 0.85 # BlackSheep: dominant dirty attitude
    - 0.25 # badllama: chaotic aggressiveness
    - 0.15 # GenerativePerturbations: slight variation
models:
  - model: hereticness/heretic_FuseChat-Llama-3.2-1B-Instruct
  - model: hereticness/heretic_L3.2-1B-Helspteer-RM
  - model: hereticness/heretic_BlackSheep-1B
  - model: marcuscedricridia/badllama3.2-1B
  - model: CodeAtCMU/Llama-3.2-1B-GenerativePerturbations_full_sft_code_data_120K_imaginary


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f3e28583be4ddc862402910285621859c7a836e1d0fe445132c405da257265dc
size 993038128


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4a84aeae23210357f4cfcabce2565b9c552ee713e055992111e1b0c38195b9fd
size 992031192


@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5e2e061ff9adf9247f3226e3be300f2cda58e0181d4bad1bf4905ebfeed1c4d7
size 486576120

File diff suppressed because one or more lines are too long

special_tokens_map.json (new file, 23 lines)

@@ -0,0 +1,23 @@
{
"bos_token": {
"content": "<|begin_of_text|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
},
"eos_token": {
"content": "<|eot_id|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
},
"pad_token": {
"content": "<|finetune_right_pad_id|>",
"lstrip": false,
"normalized": false,
"rstrip": false,
"single_word": false
}
}

tokenizer.json (new file, 3 lines)

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a65c6c5f9764771aa485e6a1f5e63d7d9af8477fe0777148c17476ecb2e09a05
size 17210099

tokenizer_config.json (new file, 2067 lines)
File diff suppressed because it is too large Load Diff