Initialize the project; model provided by the ModelHub XC community
Model: Kukedlc/NeuralMaxime-7B-DPO Source: Original Platform
README.md (new file, 16 lines)
---
license: apache-2.0
datasets:
- Intel/orca_dpo_pairs
tags:
- code
---

# NeuralMaxime 7b DPO

## DPO Intel - Orca

## Merge - MergeKit

## Models : NeuralMonarch & AlphaMonarch (MLabonne)
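The headings above indicate this model was produced by merging NeuralMonarch and AlphaMonarch with MergeKit and then DPO-tuning on Intel/orca_dpo_pairs. The card does not state the merge method or parameters, so the following is only a hypothetical sketch of what such a MergeKit config might look like, assuming a SLERP merge of `mlabonne/NeuralMonarch-7B` and `mlabonne/AlphaMonarch-7B` (all layer ranges and interpolation weights below are illustrative, not the actual recipe):

```yaml
# Hypothetical MergeKit SLERP config - NOT the verified recipe for this model
slices:
  - sources:
      - model: mlabonne/NeuralMonarch-7B
        layer_range: [0, 32]
      - model: mlabonne/AlphaMonarch-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: mlabonne/NeuralMonarch-7B
parameters:
  t:
    - filter: self_attn   # interpolation schedule for attention weights (illustrative)
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp         # interpolation schedule for MLP weights (illustrative)
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5          # default blend for all other tensors
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yml ./output-model`; the actual merge settings used for NeuralMaxime-7B-DPO are not documented in this card.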