llama-3-8b-base-epsilon-dpo…/train_results.json

{
"epoch": 0.9989528795811519,
"total_flos": 0.0,
"train_loss": 2.463846208664356,
"train_runtime": 4358.2481,
"train_samples": 61135,
"train_samples_per_second": 14.027,
"train_steps_per_second": 0.109
}