ModelHub XC c1c791b669: project initialized; model provided by the ModelHub XC community
Model: Kukedlc/NeuralMaxime-7B-DPO
Source: Original Platform
2026-05-01 16:59:39 +08:00

---
license: apache-2.0
datasets:
  - Intel/orca_dpo_pairs
tags:
  - code
---

# NeuralMaxime 7B DPO

DPO fine-tuning on Intel's `orca_dpo_pairs` dataset.

Merged with MergeKit.
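The card does not include the actual merge configuration. A hypothetical MergeKit SLERP config for the two parent models might look like the sketch below; the merge method, repository IDs, layer ranges, and interpolation weight are all assumptions, not taken from the card:

```yaml
# Hypothetical MergeKit config; the real method and parameters
# used for NeuralMaxime-7B-DPO are not documented in this card.
slices:
  - sources:
      - model: mlabonne/NeuralMonarch-7B
        layer_range: [0, 32]
      - model: mlabonne/AlphaMonarch-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: mlabonne/NeuralMonarch-7B
parameters:
  t: 0.5          # assumed even blend of the two parents
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yml ./output-dir`.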

Base models: NeuralMonarch & AlphaMonarch (by MLabonne).
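No inference example is given. As a minimal sketch, assuming the merged model keeps the Mistral-style `[INST]` prompt format of its 7B parents (the card does not document a chat template), a prompt can be assembled like this before generation:

```python
def build_prompt(user_message: str, system: str = "") -> str:
    """Wrap a message in a Mistral-style instruction template.

    The template is an assumption: the card does not document the
    chat format used by NeuralMaxime-7B-DPO.
    """
    sys_part = f"{system}\n" if system else ""
    return f"<s>[INST] {sys_part}{user_message} [/INST]"

prompt = build_prompt("Explain what a model merge is.")
print(prompt)
```

The resulting string would then be tokenized and passed to the model, for example via `transformers.pipeline("text-generation", model="Kukedlc/NeuralMaxime-7B-DPO")`.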