llama3-8b-math-sft-full-chatgpt / train_results.json
{
  "epoch": 1.9977298524404086,
  "total_flos": 1.4353773986473574e+17,
  "train_loss": 0.4103141857367573,
  "train_runtime": 2367.2182,
  "train_samples": 42274,
  "train_samples_per_second": 35.716,
  "train_steps_per_second": 0.279
}
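As a quick sanity check on these numbers, the reported throughput can be re-derived from the other fields: the Hugging Face Trainer computes `train_samples_per_second` as dataset size times the configured number of epochs divided by wall-clock runtime. A minimal sketch, assuming the run targeted 2 full epochs (the fractional `epoch` of ~1.998 suggests the last step was slightly short):

```python
import json

# train_results.json as shown above (a Hugging Face Trainer output file).
raw = """{
  "epoch": 1.9977298524404086,
  "total_flos": 1.4353773986473574e+17,
  "train_loss": 0.4103141857367573,
  "train_runtime": 2367.2182,
  "train_samples": 42274,
  "train_samples_per_second": 35.716,
  "train_steps_per_second": 0.279
}"""
results = json.loads(raw)

# Assumption: the run was configured for 2 full epochs.
num_epochs = 2

# Throughput = samples * epochs / wall-clock seconds.
derived = results["train_samples"] * num_epochs / results["train_runtime"]

# derived agrees with the logged train_samples_per_second to 3 decimals.
print(round(derived, 3))
```

With these values, `42274 * 2 / 2367.2182 ≈ 35.716`, matching the logged `train_samples_per_second`, which supports reading the file as a 2-epoch run over 42,274 training samples.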