---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- EmbeddedLLM/Mistral-7B-Merge-14-v0.1
- amazon/MistralLite
---
# Mistral-7B-Merge-14-v0.2
Mistral-7B-Merge-14-v0.2 is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):

* [EmbeddedLLM/Mistral-7B-Merge-14-v0.1](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.1)
* [amazon/MistralLite](https://huggingface.co/amazon/MistralLite)

## 🧩 Configuration
```yaml
slices:
  - sources:
      - model: EmbeddedLLM/Mistral-7B-Merge-14-v0.1
        layer_range: [0, 32]
      - model: amazon/MistralLite
        layer_range: [0, 32]
merge_method: slerp
base_model: EmbeddedLLM/Mistral-7B-Merge-14-v0.1
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
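
## 💻 Usage

A minimal inference sketch is given below. It assumes the merged weights are published on the Hugging Face Hub under the repo ID `EmbeddedLLM/Mistral-7B-Merge-14-v0.2` (adjust this to wherever the merge output actually lives) and uses the standard `transformers` text-generation APIs; the merge itself is produced by feeding the configuration above to mergekit.

```python
# Minimal usage sketch. Assumption: the merged model is published as
# "EmbeddedLLM/Mistral-7B-Merge-14-v0.2"; change the repo ID if needed.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "EmbeddedLLM/Mistral-7B-Merge-14-v0.2"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the `dtype` in the merge config
    device_map="auto",
)

# Simple greedy-ish generation for a single prompt
inputs = tokenizer("What is a model merge?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```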