marcelovidigal committed
Commit: 57f5c8c
Parent(s): 99549f6
Training in progress, epoch 4
Files changed:
- model.safetensors +1 -1
- wandb/debug-internal.log +0 -0
- wandb/run-20240924_172630-x9iddikd/files/output.log +2 -0
- wandb/run-20240924_172630-x9iddikd/files/wandb-summary.json +1 -1
- wandb/run-20240924_172630-x9iddikd/logs/debug-internal.log +0 -0
- wandb/run-20240924_172630-x9iddikd/run-x9iddikd.wandb +0 -0
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:9db619b5890ee02e32aa5b3b4461ed2b1be4ee2f6b5b4c3bf3c45d0e47385936
 size 267832560
wandb/debug-internal.log CHANGED
The diff for this file is too large to render. See raw diff.
wandb/run-20240924_172630-x9iddikd/files/output.log CHANGED
@@ -22,3 +22,5 @@ You should probably TRAIN this model on a down-stream task to be able to use it
 {'eval_loss': 0.21834564208984375, 'eval_accuracy': 0.938, 'eval_runtime': 30.3683, 'eval_samples_per_second': 32.929, 'eval_steps_per_second': 2.075, 'epoch': 1.0}
 {'loss': 0.2031, 'grad_norm': 1.1480563879013062, 'learning_rate': 1.2e-05, 'epoch': 2.0}
 {'eval_loss': 0.19427122175693512, 'eval_accuracy': 0.938, 'eval_runtime': 42.2287, 'eval_samples_per_second': 23.681, 'eval_steps_per_second': 1.492, 'epoch': 2.0}
+{'eval_loss': 0.3195326626300812, 'eval_accuracy': 0.921, 'eval_runtime': 26.5577, 'eval_samples_per_second': 37.654, 'eval_steps_per_second': 2.372, 'epoch': 3.0}
+{'loss': 0.0672, 'grad_norm': 1.1029362678527832, 'learning_rate': 4.000000000000001e-06, 'epoch': 4.0}
wandb/run-20240924_172630-x9iddikd/files/wandb-summary.json CHANGED
@@ -1 +1 @@
-{"eval/loss": 0.
+{"eval/loss": 0.36123067140579224, "eval/accuracy": 0.925, "eval/runtime": 26.675, "eval/samples_per_second": 37.488, "eval/steps_per_second": 2.362, "train/epoch": 4.0, "train/global_step": 1000, "_timestamp": 1727215983.911635, "_runtime": 6393.038725852966, "_step": 5, "train/loss": 0.0672, "train/grad_norm": 1.1029362678527832, "train/learning_rate": 4.000000000000001e-06}
wandb/run-20240924_172630-x9iddikd/logs/debug-internal.log CHANGED
The diff for this file is too large to render. See raw diff.
wandb/run-20240924_172630-x9iddikd/run-x9iddikd.wandb CHANGED
Binary files a/wandb/run-20240924_172630-x9iddikd/run-x9iddikd.wandb and b/wandb/run-20240924_172630-x9iddikd/run-x9iddikd.wandb differ