Initialize project; model provided by the ModelHub XC community

Model: yunconglong/Mixtral_7Bx2_MoE_13B_DPO
Source: Original Platform
This commit is contained in:
ModelHub XC
2026-04-13 03:36:00 +08:00
commit e1b3001165
15 changed files with 91732 additions and 0 deletions

README.md Normal file

@@ -0,0 +1,15 @@
---
license: cc-by-nc-4.0
tags:
- moe
---
# Mixtral MOE 2x7B
A Mixture-of-Experts (MoE) merge of the following models, built with mergekit and then fine-tuned with DPO:
* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
* [NurtureAI/neural-chat-7b-v3-16k](https://huggingface.co/NurtureAI/neural-chat-7b-v3-16k)
* [jondurbin/bagel-dpo-7b-v0.1](https://huggingface.co/jondurbin/bagel-dpo-7b-v0.1)
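
A merge like the one above is typically driven by a `mergekit-moe` YAML config. The exact config used for this model is not published here, so the following is only an illustrative sketch: the choice of base model, expert assignment, and `positive_prompts` strings are assumptions, not the authors' actual settings.

```yaml
# Hypothetical mergekit-moe config sketch (NOT the actual config used for this model).
# Each expert is routed to when the input resembles its positive_prompts.
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: hidden          # route using hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: NurtureAI/neural-chat-7b-v3-16k
    positive_prompts:      # illustrative routing prompts, chosen by us
      - "chat"
      - "long context"
  - source_model: jondurbin/bagel-dpo-7b-v0.1
    positive_prompts:
      - "instruction"
      - "reasoning"
```

Under this sketch, running `mergekit-moe config.yaml ./output-model` would produce the merged MoE checkpoint, which could then be fine-tuned with DPO as described above.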