---
base_model:
- TheDrummer/Magidonia-24B-v4.3
- Ateron/Sketch-Cydonia
- OddTheGreat/Rotor_24B_V.1
- DarkArtsForge/Magistaroth-24B-v1.1
- MrRikyz/Rei-Pulse-24B
- sophosympatheia/Magistry-24B-v1.0
- TheDrummer/Cydonia-24B-v4.3
library_name: transformers
tags:
- mergekit
- merge
- RP
- NSFW
- roleplay
---
<div style="width: 100%; text-align: center; margin-bottom: 20px;">
<h1 style="font-size: 2.5em; font-weight: bold; color: #4A90E2;">FusionPulse-24B</h1>
<hr style="border: 0; height: 1px; background: linear-gradient(to right, transparent, #4A90E2, transparent); margin: 20px 0;">
</div>
## 🌟 Overview
**FusionPulse-24B** is a merge built on top of `TheDrummer/Magidonia-24B-v4.3`, produced with the **TIES** merge method.
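As a rough illustration of what TIES does conceptually, the toy Python sketch below applies the three TIES steps to flat parameter vectors: trim low-magnitude deltas, elect a per-parameter sign, and average only the entries that agree with it. This is not mergekit's actual implementation, which operates on full model tensors; it is only meant to show the idea behind the `density` and `weight` parameters used in the config further down.

```python
def trim(delta, density):
    """Keep only the top `density` fraction of entries by magnitude (TIES trim)."""
    k = max(1, int(len(delta) * density))
    cutoff = sorted((abs(x) for x in delta), reverse=True)[k - 1]
    return [x if abs(x) >= cutoff else 0.0 for x in delta]

def ties_merge(deltas, densities, weights):
    """Toy TIES merge over per-model delta vectors of equal length."""
    trimmed = [[w * x for x in trim(d, rho)]
               for d, rho, w in zip(deltas, densities, weights)]
    merged = []
    for entries in zip(*trimmed):
        # Elect the dominant sign by total magnitude in each direction.
        pos = sum(x for x in entries if x > 0)
        neg = -sum(x for x in entries if x < 0)
        sign = 1.0 if pos >= neg else -1.0
        # Average only the entries that agree with the elected sign.
        agreeing = [x for x in entries if x * sign > 0]
        merged.append(sum(agreeing) / len(agreeing) if agreeing else 0.0)
    return merged
```

Here `density` controls how much of each delta survives trimming, and `weight` scales each model's contribution before sign election, mirroring the per-model `density`/`weight` pairs in the YAML config.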
### 🧩 Models Merged
This model results from a merge between:
* **TheDrummer/Magidonia-24B-v4.3** (Base)
* **Ateron/Sketch-Cydonia**
* **OddTheGreat/Rotor_24B_V.1**
* **DarkArtsForge/Magistaroth-24B-v1.1**
* **MrRikyz/Rei-Pulse-24B**
* **sophosympatheia/Magistry-24B-v1.0**
* **TheDrummer/Cydonia-24B-v4.3**
---
## 🛠️ Merge Details
### Method: TIES
The merge was performed using `mergekit` with the following parameters:
- **Base Model:** TheDrummer/Magidonia-24B-v4.3
- **Dtype:** float32
- **Output Dtype:** bfloat16
- **Tokenizer Source:** base
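The float32/bfloat16 split matters because bfloat16 stores only 7 explicit mantissa bits: merging in float32 avoids accumulating rounding error across many small per-model contributions before the single final cast. As a stdlib-only illustration (not how the frameworks actually perform the cast internally), a float32 value can be rounded to the nearest bfloat16 by round-to-nearest-even on its low 16 bits:

```python
import struct

def to_bfloat16(x: float) -> float:
    """Round a float32 value to the nearest bfloat16, returned as a float.

    bfloat16 is the top 16 bits of an IEEE 754 float32, so the cast is a
    round-to-nearest-even on the low 16 mantissa bits.
    """
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    # Add the rounding bias; the low bit of the kept part breaks ties to even.
    rounded = (bits + 0x7FFF + ((bits >> 16) & 1)) & 0xFFFF0000
    return struct.unpack(">f", struct.pack(">I", rounded))[0]
```

For example, a per-weight difference of `2**-10` around 1.0 vanishes entirely in bfloat16, while `2**-7` survives, which is why the merge arithmetic runs at float32 and only the final output is emitted as bfloat16.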
### ⚙️ Configuration
<details>
<summary><b>View Full Mergekit YAML</b></summary>

```yaml
base_model: TheDrummer/Magidonia-24B-v4.3
dtype: float32
merge_method: ties
modules:
  default:
    slices:
      - sources:
          - layer_range: [0, 40]
            model: Ateron/Sketch-Cydonia
            parameters:
              density: 0.55
              weight: 0.18
          - layer_range: [0, 40]
            model: OddTheGreat/Rotor_24B_V.1
            parameters:
              density: 0.65
              weight: 0.22
          - layer_range: [0, 40]
            model: DarkArtsForge/Magistaroth-24B-v1.1
            parameters:
              density: 0.7
              weight: 0.27
          - layer_range: [0, 40]
            model: MrRikyz/Rei-Pulse-24B
            parameters:
              density: 0.6
              weight: 0.19
          - layer_range: [0, 40]
            model: sophosympatheia/Magistry-24B-v1.0
            parameters:
              density: 0.44
              weight: 0.23
          - layer_range: [0, 40]
            model: TheDrummer/Cydonia-24B-v4.3
            parameters:
              density: 0.25
              weight: 0.18
          - layer_range: [0, 40]
            model: TheDrummer/Magidonia-24B-v4.3
base_model_alpha: 0.85
ties:
  merge_strategy: sum
  normalize: true
  sparsity: 0.17
  rescale: true
  layer_wise:
    - filter: "layers.0-8.*"
      scale: 0.75
    - filter: "layers.9-20.*"
      scale: 1.05
    - filter: "layers.21-31.*"
      scale: 1.15
  tensor_factors:
    attention: 1.1
    mlp: 1.2
post:
  normalize: true
  clamp: 2.5
out_dtype: bfloat16
tokenizer:
  source: base
```
</details>
## ✨ Acknowledgements
Thanks to the authors of the original models for their incredible work:
- Ateron for `Sketch-Cydonia`
- OddTheGreat for `Rotor_24B_V.1`
- DarkArtsForge for `Magistaroth-24B-v1.1`
- sophosympatheia for `Magistry-24B-v1.0`
- TheDrummer for `Cydonia-24B-v4.3` and `Magidonia-24B-v4.3`