# wav2vec2-base-960h-paper
This model is a fine-tuned version of facebook/wav2vec2-base-960h on the HTS98/ORIGINAL_VER1.2 - NA dataset. It achieves the following results on the evaluation set:
- Loss: 0.8214
- WER: 0.9640
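A fine-tuned wav2vec2 CTC checkpoint like this one can typically be loaded for inference with the standard transformers API. The sketch below is illustrative only; the repo id `wav2vec2-base-960h-paper`, the file `sample.wav`, and the 16 kHz input rate (the rate the base model was pretrained on) are assumptions, not confirmed by this card:

```python
import librosa
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Hypothetical repo id; substitute the actual model path on the Hub.
model_id = "wav2vec2-base-960h-paper"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load raw audio as a 1-D float array resampled to 16 kHz.
speech, _ = librosa.load("sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: pick the most likely token per frame,
# then collapse repeats and blanks in batch_decode.
pred_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```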
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (mirrored in the TrainingArguments sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 420
- num_epochs: 50.0
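These settings map directly onto the transformers `TrainingArguments` API. A minimal sketch, assuming the standard Trainer setup; `output_dir` and every option not listed above are placeholders:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-960h-paper",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 8 * 4 = 32
    optim="adamw_torch",            # Adam-style optimizer; default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_steps=420,
    num_train_epochs=50.0,
)
```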
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER |
|---|---|---|---|---|
| No log | 0.99 | 104 | 5.6840 | 1.0 |
| No log | 2.0 | 209 | 3.9772 | 1.0 |
| No log | 3.0 | 314 | 3.4204 | 1.0 |
| No log | 4.0 | 419 | 3.3692 | 1.0 |
| 5.5612 | 4.99 | 523 | 3.3945 | 1.0 |
| 5.5612 | 6.0 | 628 | 3.3426 | 1.0 |
| 5.5612 | 7.0 | 733 | 3.3333 | 1.0 |
| 5.5612 | 8.0 | 838 | 3.3296 | 1.0001 |
| 5.5612 | 8.99 | 942 | 3.1853 | 0.9999 |
| 3.2743 | 10.0 | 1047 | 2.1381 | 1.0245 |
| 3.2743 | 11.0 | 1152 | 1.6965 | 1.0142 |
| 3.2743 | 12.0 | 1257 | 1.4230 | 1.0011 |
| 3.2743 | 12.99 | 1361 | 1.2679 | 0.9873 |
| 3.2743 | 14.0 | 1466 | 1.1570 | 0.9836 |
| 1.5432 | 15.0 | 1571 | 1.0858 | 0.9784 |
| 1.5432 | 16.0 | 1676 | 1.0303 | 0.9769 |
| 1.5432 | 16.99 | 1780 | 0.9855 | 0.9746 |
| 1.5432 | 18.0 | 1885 | 0.9559 | 0.9709 |
| 1.5432 | 19.0 | 1990 | 0.9328 | 0.9728 |
| 0.902 | 20.0 | 2095 | 0.9166 | 0.9738 |
| 0.902 | 20.99 | 2199 | 0.8991 | 0.9698 |
| 0.902 | 22.0 | 2304 | 0.8717 | 0.9681 |
| 0.902 | 23.0 | 2409 | 0.8665 | 0.9669 |
| 0.7003 | 24.0 | 2514 | 0.8589 | 0.9670 |
| 0.7003 | 24.99 | 2618 | 0.8420 | 0.9659 |
| 0.7003 | 26.0 | 2723 | 0.8473 | 0.9661 |
| 0.7003 | 27.0 | 2828 | 0.8543 | 0.9666 |
| 0.7003 | 28.0 | 2933 | 0.8315 | 0.9623 |
| 0.5914 | 28.99 | 3037 | 0.8281 | 0.9626 |
| 0.5914 | 30.0 | 3142 | 0.8315 | 0.9625 |
| 0.5914 | 31.0 | 3247 | 0.8261 | 0.9620 |
| 0.5914 | 32.0 | 3352 | 0.8214 | 0.9640 |
| 0.5914 | 32.99 | 3456 | 0.8310 | 0.9634 |
| 0.5157 | 34.0 | 3561 | 0.8252 | 0.9635 |
| 0.5157 | 35.0 | 3666 | 0.8373 | 0.9638 |
| 0.5157 | 36.0 | 3771 | 0.8422 | 0.9629 |
| 0.5157 | 36.99 | 3875 | 0.8294 | 0.9632 |
| 0.5157 | 38.0 | 3980 | 0.8332 | 0.9576 |
| 0.4655 | 39.0 | 4085 | 0.8330 | 0.9595 |
| 0.4655 | 40.0 | 4190 | 0.8297 | 0.9625 |
| 0.4655 | 40.99 | 4294 | 0.8365 | 0.9621 |
| 0.4655 | 42.0 | 4399 | 0.8361 | 0.9621 |
| 0.4266 | 43.0 | 4504 | 0.8416 | 0.9625 |
| 0.4266 | 44.0 | 4609 | 0.8381 | 0.9634 |
| 0.4266 | 44.99 | 4713 | 0.8448 | 0.9645 |
| 0.4266 | 46.0 | 4818 | 0.8447 | 0.9625 |
| 0.4266 | 47.0 | 4923 | 0.8464 | 0.9641 |
| 0.4019 | 48.0 | 5028 | 0.8449 | 0.9628 |
| 0.4019 | 48.99 | 5132 | 0.8487 | 0.9626 |
| 0.4019 | 49.64 | 5200 | 0.8465 | 0.9629 |
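The WER values above are word error rate: edit errors (substitutions, insertions, deletions) divided by the number of reference words. As an illustration only, a score like this can be computed with the `evaluate` library (assumed tooling, not stated in the original card):

```python
import evaluate

# Word error rate = (substitutions + insertions + deletions) / reference words.
wer_metric = evaluate.load("wer")

predictions = ["the cat sat on mat"]      # hypothetical model outputs
references = ["the cat sat on the mat"]   # hypothetical ground truth

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # 1 deletion / 6 reference words = 0.1667
```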
### Framework versions
- Transformers 4.31.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.7.0
- Tokenizers 0.13.3