marcelovidigal committed
Commit: ebaa724
Parent(s): a46c7b5
Training in progress, epoch 14
Files changed:
- model.safetensors (+1 -1)
- wandb/debug-internal.log (+0 -0)
- wandb/run-20240924_172630-x9iddikd/files/output.log (+1 -0)
- wandb/run-20240924_172630-x9iddikd/files/wandb-summary.json (+1 -1)
- wandb/run-20240924_172630-x9iddikd/logs/debug-internal.log (+0 -0)
- wandb/run-20240924_172630-x9iddikd/run-x9iddikd.wandb (+0 -0)
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:a0505695cf303da48c7fdc01b3e844c17229610edb63ad1c8b203405e2fd9c2e
 size 267832560
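The two pointer lines above differ only in the LFS object hash, which identifies the updated epoch-14 weight file. As an illustration only (the local path and chunk size below are assumptions, not part of this commit), a downloaded copy of model.safetensors could be checked against the pointer like this:

    import hashlib
    import os

    # Hypothetical local path to the downloaded weight file.
    path = "model.safetensors"

    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in 1 MiB chunks so the ~268 MB file is never fully in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            sha256.update(chunk)

    print("oid sha256:", sha256.hexdigest())  # should match the pointer's oid
    print("size:", os.path.getsize(path))     # should match 267832560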
wandb/debug-internal.log
CHANGED
The diff for this file is too large to render; see the raw diff.
wandb/run-20240924_172630-x9iddikd/files/output.log
CHANGED
@@ -46,3 +46,4 @@ You should probably TRAIN this model on a down-stream task to be able to use it
 {'eval_loss': 0.5506279468536377, 'eval_accuracy': 0.901, 'eval_runtime': 37.4907, 'eval_samples_per_second': 26.673, 'eval_steps_per_second': 0.854, 'epoch': 11.0}
 {'loss': 0.0442, 'grad_norm': 0.6364777684211731, 'learning_rate': 7.600000000000001e-06, 'epoch': 12.0}
 {'eval_loss': 0.578902006149292, 'eval_accuracy': 0.903, 'eval_runtime': 38.2969, 'eval_samples_per_second': 26.112, 'eval_steps_per_second': 0.836, 'epoch': 12.0}
+{'eval_loss': 0.47741687297821045, 'eval_accuracy': 0.92, 'eval_runtime': 37.7268, 'eval_samples_per_second': 26.506, 'eval_steps_per_second': 0.848, 'epoch': 13.0}
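Each appended line in output.log looks like a Python dict literal of training/evaluation metrics, so the per-epoch history can be recovered directly from the file. A minimal sketch, assuming the log path used in this run and that metric lines are well-formed dict literals:

    import ast

    log_path = "wandb/run-20240924_172630-x9iddikd/files/output.log"

    records = []
    with open(log_path) as f:
        for line in f:
            line = line.strip()
            # Metric lines are printed as dict literals, e.g. {'eval_loss': ...}
            if line.startswith("{") and line.endswith("}"):
                try:
                    records.append(ast.literal_eval(line))
                except (ValueError, SyntaxError):
                    continue  # ignore lines that only look like dicts

    evals = [r for r in records if "eval_accuracy" in r]
    if evals:
        best = max(evals, key=lambda r: r["eval_accuracy"])
        print(f"best epoch: {best['epoch']}, eval_accuracy: {best['eval_accuracy']}")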
wandb/run-20240924_172630-x9iddikd/files/wandb-summary.json
CHANGED
@@ -1 +1 @@
|
|
1 |
-
{"eval/loss": 0.
|
|
|
1 |
+
{"eval/loss": 0.5484298467636108, "eval/accuracy": 0.894, "eval/runtime": 38.013, "eval/samples_per_second": 26.307, "eval/steps_per_second": 0.842, "train/epoch": 14.0, "train/global_step": 1750, "_timestamp": 1727244298.3548748, "_runtime": 34707.48196578026, "_step": 24, "train/loss": 0.0442, "train/grad_norm": 0.6364777684211731, "train/learning_rate": 7.600000000000001e-06, "train_runtime": 8026.8642, "train_samples_per_second": 2.492, "train_steps_per_second": 0.156, "total_flos": 2396475988298112.0, "train_loss": 0.11480112991333008}
|
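wandb-summary.json is a single JSON object holding the most recently logged values (here, the epoch-14 snapshot), so the metrics above can be read back with the standard library. A minimal sketch, assuming the run directory from this commit:

    import json

    summary_path = "wandb/run-20240924_172630-x9iddikd/files/wandb-summary.json"

    with open(summary_path) as f:
        summary = json.load(f)

    # Keys follow a "section/metric" naming scheme, e.g. "eval/accuracy".
    print("epoch:", summary["train/epoch"])
    print("eval loss:", summary["eval/loss"])
    print("eval accuracy:", summary["eval/accuracy"])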
wandb/run-20240924_172630-x9iddikd/logs/debug-internal.log
CHANGED
The diff for this file is too large to render; see the raw diff.
wandb/run-20240924_172630-x9iddikd/run-x9iddikd.wandb
CHANGED
Binary files a/wandb/run-20240924_172630-x9iddikd/run-x9iddikd.wandb and b/wandb/run-20240924_172630-x9iddikd/run-x9iddikd.wandb differ