---
pipeline_tag: text-generation
tags:
- mistral
- merge
license: cc-by-4.0
---
# Model Card for Sina-Thor-7b-Merge

<!-- Provide a quick summary of what the model is/does. -->
This model is part of a series of experimental DARE merges.

The following mergekit `.yaml` configuration was used to produce it:
```yaml
models:
  - model: mistralai/Mistral-7B-v0.1
    # no parameters necessary for base model
  - model: rishiraj/smol-7b #75
    parameters:
      weight: 0.2
      density: 0.41
  - model: SanjiWatsuki/openchat-3.5-1210-starling-slerp #125
    parameters:
      weight: 0.33
      density: 0.54
  - model: Azazelle/Dumb-Maidlet #200
    parameters:
      weight: 0.53
      density: 0.71
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: bfloat16
```
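In the config above, each `density` controls DARE's random drop rate: each fine-tuned model's delta from the base is kept with probability `density` and the survivors are rescaled by `1/density`, which preserves the delta's expected value. A minimal numpy sketch of that drop-and-rescale step (illustrative only; mergekit's actual implementation applies this per weight tensor before the TIES-style sign resolution and merging):

```python
import numpy as np

def dare_sparsify(delta: np.ndarray, density: float, rng: np.random.Generator) -> np.ndarray:
    # DARE: keep each delta entry with probability `density`,
    # then rescale the survivors by 1/density so the expected
    # value of every entry is unchanged.
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

rng = np.random.default_rng(0)
delta = rng.normal(size=10_000)          # stand-in for (fine-tuned - base) weights
sparse = dare_sparsify(delta, 0.41, rng)  # density from the smol-7b entry above
# Roughly 41% of entries survive; the mean of the delta is approximately preserved.
```

The per-model `weight` values then scale each sparsified delta before they are summed onto the base model.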