wisdom-llama3-8b / all_results.json
{
  "epoch": 3.0,
  "total_flos": 7122048987955200.0,
  "train_loss": 0.13241534099762473,
  "train_runtime": 27380.0053,
  "train_samples_per_second": 19.876,
  "train_steps_per_second": 0.124
}