Initialize the project; model provided by the ModelHub XC community
Model: Radiantloom/radintloom-mistral-7b-fusion-dpo
Source: Original Platform
README.md
@@ -0,0 +1,10 @@
---
library_name: transformers
license: apache-2.0
---

<img src="https://huggingface.co/Radiantloom/radintloom-mistral-7b-fusion/resolve/main/Radiantloom Mistral 7B Fusion.png" alt="Radiantloom Mistral 7B Fusion" width="800" style="margin-left: auto; margin-right: auto; display: block;"/>

## Radiantloom Mistral 7B Fusion DPO

This model is a fine-tuned version of [Radiantloom Mistral 7B Fusion](https://huggingface.co/Radiantloom/radintloom-mistral-7b-fusion), trained with Direct Preference Optimization (DPO).
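The card does not yet include usage code. A minimal inference sketch with the `transformers` library might look like the following; the Mistral-style `[INST]` prompt format and the generation settings are assumptions, so check the tokenizer's own chat template before relying on them:

```python
def build_prompt(user_message: str) -> str:
    """Wrap a user message in the Mistral instruction format.

    This format is an assumption based on the Mistral 7B base model;
    verify against tokenizer.chat_template for this checkpoint.
    """
    return f"<s>[INST] {user_message} [/INST]"


def generate(user_message: str, max_new_tokens: int = 256) -> str:
    # Imported lazily so build_prompt() works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Radiantloom/radintloom-mistral-7b-fusion-dpo"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_prompt(user_message), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)

    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

For chat-style use, `tokenizer.apply_chat_template(messages, tokenize=False)` is generally the safer route, since it applies whatever template the checkpoint ships with.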