---
base_model:
- teknium/OpenHermes-2.5-Mistral-7B
- Open-Orca/Mistral-7B-SlimOrca
tags:
- mergekit
- merge
license: unlicense
---
# out2
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [linear](https://arxiv.org/abs/2203.05482) merge method.
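For intuition, the linear method is a weighted average of corresponding parameter tensors across the input models. The snippet below is a minimal sketch of that computation, not mergekit's actual implementation; the `linear_merge` helper is hypothetical, and it assumes both checkpoints share the same architecture and tensor names:

```python
import torch

def linear_merge(state_dicts, weights):
    """Weighted average of model state dicts (Model Soups-style linear merge).

    state_dicts: list of {name: tensor} dicts with identical keys and shapes.
    weights: one scalar per model; normalized here so they sum to 1.
    """
    total = sum(weights)
    merged = {}
    for name in state_dicts[0]:
        # Accumulate in float32 for precision, then cast to float16
        # to match the `dtype: float16` setting in the config below.
        merged[name] = sum(
            (w / total) * sd[name].to(torch.float32)
            for w, sd in zip(weights, state_dicts)
        ).to(torch.float16)
    return merged
```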
### Models Merged
The following models were included in the merge:
* [teknium/OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B)
* [Open-Orca/Mistral-7B-SlimOrca](https://huggingface.co/Open-Orca/Mistral-7B-SlimOrca)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
merge_method: linear
dtype: float16
models:
  - model: teknium/OpenHermes-2.5-Mistral-7B
    parameters:
      weight: 1.0
  - model: Open-Orca/Mistral-7B-SlimOrca
    parameters:
      weight: 1.0
#tokenizer_source: union
```
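With both weights set to 1.0 and mergekit's default normalization for the linear method, the result is an equal 50/50 average of the two models. Saving this file as, say, `config.yaml`, the merge should be reproducible with mergekit's CLI (`mergekit-yaml config.yaml ./out2`, assuming the current command-line interface; check the mergekit repository for the exact invocation). The commented-out `tokenizer_source: union` option, if enabled, would build the output tokenizer from the union of both models' vocabularies instead of copying the base model's.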