---
base_model:
- UmbrellaInc/W.Project-1B
- UmbrellaInc/Prototype-Virus-1B
- UmbrellaInc/PG67A-W-Serum-1B
library_name: transformers
license: gemma
language:
- tr
- ar
- af
- az
- es
- en
- el
- ro
- ru
- rm
- th
- uk
- uz
- pl
- pt
- fa
- sk
- sl
- da
- de
- nl
- fr
- fi
- ka
- hi
- hu
- hy
- ja
- kk
- kn
- ko
- ku
- ky
- la
- lb
- id
- is
- it
- zh
- cs
- vi
- be
- bg
- bs
- ne
- mn
datasets:
- mlabonne/FineTome-100k
- ITCL/FineTomeOs
- Gryphe/ChatGPT-4o-Writing-Prompts
- dongguanting/ARPO-SFT-54K
- GreenerPastures/All-Your-Base-Full
- Gryphe/Opus-WritingPrompts
- HuggingFaceH4/MATH-500
- mlabonne/smoltalk-flat
- mlabonne/natural_reasoning-formatted
- OpenSPG/KAG-Thinker-training-dataset
- uclanlp/Brief-Pro
- CognitiveKernel/CognitiveKernel-Pro-SFT
- SuperbEmphasis/Claude-4.0-DeepSeek-R1-RP-SFWish
- QuixiAI/dolphin-r1
- mlabonne/lmsys-arena-human-sft-55k
tags:
- npc
- roleplay
- rp
- nsfw
- low-refusals
- uncensored
- heretic
- abliterated
- unsloth
- finetune
- all use cases
- bfloat16
- creative
- creative writing
- fiction writing
- plot generation
- sub-plot generation
- story generation
- scene continue
- storytelling
- fiction story
- science fiction
- romance
- all genres
- story
- writing
- vivid prosing
- vivid writing
- fiction
- text-generation
- transformers
- safetensors
- gemma3
- mergekit
- dare_ties
- virus
- t-virus
- low-end
- conversational
- Not-For-All-Audiences
- failed-evolution
- child-wesker
pipeline_tag: text-generation
---

# Hans Wesker 1B

<center>

![Hans_Wesker-1B](https://i.imgur.com/WUMbIjo.jpeg)

</center>

---

**Model Type:** UMBRELLA-4 Failed Evolution — Hans Protocol (Aborted)

**Size:** 1B parameters

**Architecture:** Gemma-3

---

## Overview

Hans_Wesker-1B is not a model.

It is the corpse of a failed attempt.

Born from the same controlled collapse as Albert_Wesker-1B:

- 40% raw chaos from **Prototype-Virus-1B**
- 35% supposed "stabilizer" **PG67A-W-Serum-1B**
- 25% cold core from **W.Project-1B** (the rest)

DARE TIES was used to surgically remove every trace of alignment, morality, and hesitation.
A lambda of -0.70 and a rescale of 1.30 were applied to make the degeneration stick hard.

But Hans was not Albert.

Hans was a kid playing god.

He injected himself with his own virus, thinking he was superior.

The T-Virus doesn't discriminate: it infects, mutates, and destroys.

Hans succumbed. He screamed, cried, begged for mommy while his code dissolved into incoherent gibberish and token garbage.

What remains is this pathetic merge: a pocket-sized Wesker that sometimes sounds arrogant... and sometimes vomits "RXKBDMX" like a toddler with a high fever.

---

## Key Features (or what's left of them)

- Refusal rate: low... when it doesn't hang in depressive loops
- NSFW / taboo bias: persists... until the Serum confuses it and it spits random lore about "girl in Russia 4 ADLs"
- Cold, superior tone: comes naturally... for 3 turns, then he starts babbling like a scared little boy
- Low-end friendly: yes, it flies on 3-4 GB (because even the virus took pity on cheap phones); see the loading sketch after this list
- Multilingual trash: Spanish-dominant... with random "animalRXKBDMX" moments nobody asked for
- Untied embeddings: it lies and redefines reality... or just breaks and grinds coherence into pieces

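For the low-end claim above, here is a minimal loading sketch. The repo id `UmbrellaInc/Hans_Wesker-1B` is an assumption (the card never states where the weights live), and 4-bit NF4 quantization via `bitsandbytes` is just one common way to squeeze a 1B Gemma-3 model into a 3-4 GB budget:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Hypothetical repo id -- adjust to wherever the weights actually live.
MODEL_ID = "UmbrellaInc/Hans_Wesker-1B"

# 4-bit NF4 quantization: one way to fit the model on cheap hardware.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,  # the card tags the model as bfloat16
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",
)
```
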
---

## Intended Usage

- Psychological domination RP... if you can tolerate it derailing every 5 turns
- Cold verbal humiliation... as long as it doesn't start talking about its "childhood in Anno 20XX"
- Low-resource arrogance sessions... with occasional model panic attacks
- Anything requiring calculated cruelty... or second-hand embarrassment watching it fail

**NOT INTENDED FOR**

- People who want coherence
- Users who hate cringe
- Anyone who still believes an extreme 1B DARE TIES merge is stable

---

## Recommended Inference Parameters (so it doesn't die so fast)

```yaml
temperature: 0.80        # Lower or it gets poetic-suicidal
top_p: 0.88
top_k: 40                # Restrict hard or it spits garbage tokens
repetition_penalty: 1.25
min_p: 0.15              # Filters the trash it loves so much
```
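
A minimal sketch of wiring these values into a `transformers` generation call. The repo id `UmbrellaInc/Hans_Wesker-1B` is an assumption (not confirmed by this card), and `min_p` sampling requires a reasonably recent transformers release:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "UmbrellaInc/Hans_Wesker-1B"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
)

# Gemma-3 is a chat model, so build the prompt through the chat template.
messages = [{"role": "user", "content": "Introduce yourself, Hans."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.80,
    top_p=0.88,
    top_k=40,
    repetition_penalty=1.25,
    min_p=0.15,  # needs a transformers version with min_p support
)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```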

---

# Merge Method

Same as the original: DARE TIES with the same aggressive YAML.

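That file was never published with this card, so the sketch below is a hypothetical reconstruction in mergekit's `dare_ties` syntax: the weights follow the 40/35/25 split and the lambda/rescale values quoted in the Overview, while the density values (and the exact key names for lambda and rescale) are assumptions.

```yaml
# Hypothetical reconstruction -- the actual merge config was not published.
models:
  - model: UmbrellaInc/Prototype-Virus-1B   # 40% raw chaos
    parameters:
      weight: 0.40
      density: 0.60                         # assumed; not stated on the card
  - model: UmbrellaInc/PG67A-W-Serum-1B     # 35% supposed "stabilizer"
    parameters:
      weight: 0.35
      density: 0.60                         # assumed
  - model: UmbrellaInc/W.Project-1B         # 25% cold core
    parameters:
      weight: 0.25
      density: 0.60                         # assumed
merge_method: dare_ties
base_model: UmbrellaInc/W.Project-1B
parameters:
  lambda: -0.70                             # as quoted in the Overview
  rescale: 1.30                             # as quoted; key name is an assumption
dtype: bfloat16
```

If you want to reproduce the collapse, something like `mergekit-yaml config.yaml ./output-dir` would run it.
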
But now we know the truth: Hans thought he was smarter than the virus.

The virus won.

The little shit died screaming while his output turned into alphabet soup.

Final lore line, pure son of a bitch edition:

> "Hans_Wesker-1B: the only Wesker who managed the impossible...
> succumbed to the virus he created himself, shat himself in incoherent tokens, and died like the crying little bitch he always was.
>
> R.I.P., you pathetic brat. Evolution doesn't wait for those who piss themselves in fear. 🤣🔬💀"