---
library_name: transformers
license: cc-by-nc-4.0
tags:
- merge
- automerger
---
# UltraMerge-7B
This model is an experimental DPO fine-tune of [automerger/YamShadow-7B](https://huggingface.co/automerger/YamShadow-7B) on the following datasets:
- mlabonne/truthy-dpo-v0.1
- mlabonne/distilabel-intel-orca-dpo-pairs
- mlabonne/chatml-OpenHermes2.5-dpo-binarized-alpha
- mlabonne/ultrafeedback-binarized-preferences-cleaned
I'm not sure which chat template works best for this model; Mistral-Instruct or ChatML are the most likely candidates.
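
Since the right template is an open question, here is a minimal sketch of the two candidate prompt formats, assuming the standard ChatML and Mistral-Instruct conventions (the special tokens shown are the common defaults for those formats, not confirmed for this model):

```python
# Sketch of the two candidate chat formats. Both follow the widely used
# conventions for ChatML and Mistral-Instruct; verify against the tokenizer
# before relying on either.

def chatml(messages):
    """Format a list of {"role", "content"} dicts in ChatML style."""
    out = ""
    for m in messages:
        out += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    # Leave the prompt open for the assistant's reply.
    return out + "<|im_start|>assistant\n"

def mistral_instruct(messages):
    """Format the same messages in Mistral-Instruct style:
    user turns wrapped in [INST] ... [/INST], assistant turns closed by </s>."""
    out = "<s>"
    for m in messages:
        if m["role"] == "user":
            out += f"[INST] {m['content']} [/INST]"
        else:
            out += f" {m['content']}</s>"
    return out

msgs = [{"role": "user", "content": "Hello!"}]
print(chatml(msgs))
print(mistral_instruct(msgs))
```

In practice, `tokenizer.apply_chat_template` in `transformers` will use whatever template ships with the tokenizer, so inspecting `tokenizer.chat_template` after loading is the quickest way to see which format was actually uploaded.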