mythomax-13b-sft-lora / all_results.json
{
"epoch": 1.0,
"eval_loss": 0.8712352514266968,
"eval_runtime": 636.6369,
"eval_samples": 1769,
"eval_samples_per_second": 0.996,
"eval_steps_per_second": 0.126,
"train_loss": 0.9507392455421587,
"train_runtime": 19724.4028,
"train_samples": 15899,
"train_samples_per_second": 0.288,
"train_steps_per_second": 0.036
}
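
For reference, a minimal sketch of how these numbers can be read back and interpreted. It assumes the file is available locally as all_results.json and that eval_loss is the usual mean token-level cross-entropy in nats (the standard Trainer convention), in which case its exponential gives an eval perplexity of roughly 2.39.

import json
import math

# Load the training/evaluation summary produced at the end of the run.
# The relative path here is an assumption; point it at your local copy.
with open("all_results.json") as f:
    results = json.load(f)

# eval_loss is assumed to be mean cross-entropy in nats, so exp() yields perplexity.
eval_ppl = math.exp(results["eval_loss"])

print(f"epoch:            {results['epoch']}")
print(f"eval loss:        {results['eval_loss']:.4f}")
print(f"eval perplexity:  {eval_ppl:.2f}")  # ~2.39 for eval_loss ~= 0.8712
print(f"train loss:       {results['train_loss']:.4f}")
print(f"train throughput: {results['train_samples_per_second']:.3f} samples/s")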