Initialize the project; model provided by the ModelHub XC community
Model: yunconglong/Mixtral_7Bx2_MoE_13B_DPO Source: Original Platform
README.md (new file, 15 lines)
@@ -0,0 +1,15 @@
---
license: cc-by-nc-4.0
tags:
- moe
---

# Mixtral MOE 2x7B

A mixture-of-experts (MoE) merge of the following models, built with mergekit and then fine-tuned with DPO (a sketch of that stage follows the model list).

* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
* [NurtureAI/neural-chat-7b-v3-16k](https://huggingface.co/NurtureAI/neural-chat-7b-v3-16k)
* [jondurbin/bagel-dpo-7b-v0.1](https://huggingface.co/jondurbin/bagel-dpo-7b-v0.1)
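The DPO recipe for this checkpoint is not published here. As a rough illustration of what that stage can look like, below is a minimal sketch using TRL's `DPOTrainer`; the dataset, hyperparameters, and merged-model path are placeholders, not the values actually used for this model.

```python
# Hypothetical sketch of the DPO stage; the dataset, hyperparameters, and
# merged-model path below are placeholders, not this model's actual recipe.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

merged = "path/to/merged-2x7b-moe"  # output of the mergekit merge step
model = AutoModelForCausalLM.from_pretrained(merged)
tokenizer = AutoTokenizer.from_pretrained(merged)

# Any preference dataset with paired chosen/rejected responses works;
# this public one is only an example.
train_dataset = load_dataset("trl-lib/ultrafeedback_binarized", split="train")

args = DPOConfig(output_dir="Mixtral_7Bx2_MoE_13B_DPO", beta=0.1)
trainer = DPOTrainer(
    model=model,  # with ref_model omitted, TRL clones a frozen reference copy
    args=args,
    train_dataset=train_dataset,
    processing_class=tokenizer,
)
trainer.train()
```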
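A minimal inference sketch follows, assuming the checkpoint loads through the standard transformers API like other Mixtral-style MoE models. The model id comes from the commit header above; the `[INST]` prompt format is an assumption carried over from Mistral-7B-Instruct, since this README does not document a prompt template.

```python
# Minimal inference sketch; assumes the standard transformers API and the
# Mistral-Instruct "[INST] ... [/INST]" prompt format (an assumption, since
# the README does not document a prompt template).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "yunconglong/Mixtral_7Bx2_MoE_13B_DPO"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # ~26 GB at fp16; quantize to fit smaller GPUs
    device_map="auto",
)

prompt = "[INST] What does mixture-of-experts mean in a language model? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```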