---
license: apache-2.0
tags:
- merge
- mergekit
- mlabonne/AlphaMonarch-7B
- bardsai/jaskier-7b-dpo-v5.6
---
# neurotic-crown-clown-7B-ties

neurotic-crown-clown-7B-ties is a TRIM, ELECT SIGN & MERGE (TIES) merge of the following models using [mergekit](https://github.com/cg123/mergekit):

* [mlabonne/AlphaMonarch-7B](https://huggingface.co/mlabonne/AlphaMonarch-7B)
* [bardsai/jaskier-7b-dpo-v5.6](https://huggingface.co/bardsai/jaskier-7b-dpo-v5.6)

See the paper [TIES-Merging: Resolving Interference When Merging Models](https://arxiv.org/abs/2306.01708) for more on the method.
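At a high level, the three TIES steps are: trim each model's task vector (its delta from the base) down to the given `density` of highest-magnitude entries, elect a per-parameter sign by weighted majority, and average only the trimmed values whose sign agrees before adding the result back to the base. The NumPy sketch below illustrates those steps on a single tensor; it is only an illustration under that reading of the paper, not mergekit's actual implementation, and the `densities`/`weights` values simply mirror the configuration further down.

```python
# Illustrative TIES merge of a single parameter tensor (NumPy only).
# A sketch of the paper's trim / elect-sign / disjoint-merge steps,
# not mergekit's implementation.
import numpy as np

def ties_merge(base, finetuned, densities, weights):
    # 1. Task vectors: each fine-tuned model's delta from the base.
    task_vectors = [ft - base for ft in finetuned]

    # 2. Trim: keep only the top-`density` fraction of entries by magnitude.
    trimmed = []
    for tv, density in zip(task_vectors, densities):
        k = max(1, int(round(density * tv.size)))
        threshold = np.sort(np.abs(tv), axis=None)[-k]
        trimmed.append(np.where(np.abs(tv) >= threshold, tv, 0.0))

    # 3. Elect sign: per-parameter sign of the weighted sum of trimmed vectors.
    elected = np.sign(sum(w * tv for w, tv in zip(weights, trimmed)))

    # 4. Disjoint merge: weighted mean over the entries whose sign agrees,
    #    normalizing by the summed agreeing weights (the config's `normalize: true`).
    num = sum(np.where(np.sign(tv) == elected, w * tv, 0.0)
              for w, tv in zip(weights, trimmed))
    den = sum(np.where(np.sign(tv) == elected, w, 0.0)
              for w, tv in zip(weights, trimmed))
    merged_delta = num / np.maximum(den, 1e-8)

    return base + merged_delta

# Toy example with two "fine-tuned" tensors derived from one base tensor.
rng = np.random.default_rng(0)
base = rng.normal(size=(4, 4))
ft_a = base + rng.normal(size=(4, 4))
ft_b = base + rng.normal(size=(4, 4))
merged = ties_merge(base, [ft_a, ft_b], densities=[0.5, 0.5], weights=[0.5, 0.3])
print(merged)
```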
## 🧩 Configuration
```yaml
models:
  - model: mlabonne/NeuralMonarch-7B
    # no parameters necessary for base model
  - model: mlabonne/AlphaMonarch-7B
    parameters:
      density: 0.5
      weight: 0.5
  - model: bardsai/jaskier-7b-dpo-v5.6
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: mlabonne/NeuralMonarch-7B
parameters:
  normalize: true
dtype: float16
```
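## 💻 Usage

Once the merged weights are on the Hugging Face Hub, they can be loaded like any other causal LM. The snippet below is a minimal sketch using the 🤗 Transformers `pipeline` API; the repository id is a placeholder, since this card does not state the final namespace.

```python
# Minimal inference sketch with 🤗 Transformers.
# NOTE: the repo id below is a placeholder; point it at wherever this merge
# is actually published on the Hub.
from transformers import pipeline

model_id = "<your-namespace>/neurotic-crown-clown-7B-ties"  # placeholder

generator = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype="auto",   # use the checkpoint's dtype (float16 here)
    device_map="auto",    # requires `accelerate` for automatic placement
)

prompt = "Explain what a TIES merge of two language models does."
output = generator(prompt, max_new_tokens=128, do_sample=True, temperature=0.7)
print(output[0]["generated_text"])
```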