Model: Kukedlc/NeuralMaxime-7B-DPO
Source: Original Platform
2026-05-01 16:59:39 +08:00


---
license: apache-2.0
datasets:
- Intel/orca_dpo_pairs
tags:
- code
---
# NeuralMaxime 7B DPO
![](https://raw.githubusercontent.com/kukedlc87/imagenes/main/DALL%C2%B7E%202024-02-19%2001.43.13%20-%20A%20futuristic%20and%20technological%20image%20featuring%20a%20robot%20whose%20face%20is%20a%20screen%20displaying%20the%20text%20'DPO'.%20The%20scene%20symbolizes%20the%20technique%20for%20fine-t.webp)
## Model details

- **Fine-tuning:** DPO (Direct Preference Optimization) on the Intel/orca_dpo_pairs dataset
- **Base model:** a merge built with MergeKit
- **Merged models:** NeuralMonarch & AlphaMonarch (MLabonne)
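As a minimal usage sketch, the model can be loaded with the `transformers` library. Note the assumptions here: the `build_prompt` helper and the Mistral-style `[INST]` chat format are illustrative guesses based on the model's 7B Mistral-family lineage (NeuralMonarch/AlphaMonarch), not something this card specifies; in practice, prefer the tokenizer's own chat template if one is bundled with the model.

```python
# Illustrative sketch for Kukedlc/NeuralMaxime-7B-DPO.
# ASSUMPTION: the Mistral-style [INST] instruct format below is a guess
# based on the model's lineage; the bundled chat template (if any) should
# take precedence.

def build_prompt(user_message: str) -> str:
    """Wrap a user message in a Mistral-style [INST] instruct prompt."""
    return f"<s>[INST] {user_message} [/INST]"

# Set to True to actually download the ~14 GB weights and run generation.
RUN_GENERATION = False

if RUN_GENERATION:
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Kukedlc/NeuralMaxime-7B-DPO"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(build_prompt("What is DPO?"), return_tensors="pt")
    inputs = inputs.to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the tokenizer ships a chat template, `tokenizer.apply_chat_template(...)` is the safer way to format prompts than a hand-written helper like the one above.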