---
license: cc-by-nc-4.0
tags:
- moe
---
# Mixtral MoE 2x7B
The following models were merged into a Mixture-of-Experts (MoE) model with mergekit and then fine-tuned with DPO (see the config sketch after the list):
* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
* [NurtureAI/neural-chat-7b-v3-16k](https://huggingface.co/NurtureAI/neural-chat-7b-v3-16k)
* [jondurbin/bagel-dpo-7b-v0.1](https://huggingface.co/jondurbin/bagel-dpo-7b-v0.1)
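
The card does not publish the merge configuration, so the following is a minimal sketch of a `mergekit-moe` config under two assumptions: that Mistral-7B-Instruct-v0.2 serves as the base/gate model and that the other two models are the two experts of the 2x7B mixture. The `positive_prompts` strings are illustrative placeholders, not the author's actual routing prompts.

```yaml
# Hypothetical mergekit-moe config; the actual merge settings were not published.
base_model: mistralai/Mistral-7B-Instruct-v0.2   # assumed base/gate model
gate_mode: hidden        # route tokens via hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: NurtureAI/neural-chat-7b-v3-16k
    positive_prompts:
      - "chat"           # placeholder routing prompt
  - source_model: jondurbin/bagel-dpo-7b-v0.1
    positive_prompts:
      - "reasoning"      # placeholder routing prompt
```

Running `mergekit-moe config.yaml ./Mixtral_7Bx2_MoE` on such a config produces a Mixtral-style 2x7B checkpoint, which could then be fine-tuned with DPO as described above.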