# hindi_wav2vec2_final
This model is a fine-tuned version of [Harveenchadha/vakyansh-wav2vec2-hindi-him-4200](https://huggingface.co/Harveenchadha/vakyansh-wav2vec2-hindi-him-4200) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0000
- Wer: 0.0699
## Model description
More information needed
## Intended uses & limitations
More information needed
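Since usage is not yet documented, below is a minimal inference sketch, assuming the checkpoint loads with the standard `transformers` wav2vec2 classes and that `librosa` is available for audio loading. The id `hindi_wav2vec2_final` is a placeholder; substitute the actual Hub repo id or a local checkpoint path.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder id: replace with the actual Hub repo id or local path.
model_id = "hindi_wav2vec2_final"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2 models expect 16 kHz mono input.
speech, _ = librosa.load("sample_hindi.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax token per frame, then collapse repeats/blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```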
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (mirrored in the `TrainingArguments` sketch after this list):
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 1000
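As a reference, this sketch mirrors the values above as Hugging Face `TrainingArguments`, assuming the standard `Trainer` was used (the card does not state the training script); the `output_dir` is illustrative.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hindi_wav2vec2_final",  # illustrative
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 16 * 2 = total train batch size of 32
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=1000,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, as listed above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```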
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log | 10.0 | 25 | 2.4458 | 0.9376 |
| No log | 20.0 | 50 | 0.5012 | 0.1820 |
| No log | 30.0 | 75 | 0.2451 | 0.1511 |
| No log | 40.0 | 100 | 0.0827 | 0.1211 |
| No log | 50.0 | 125 | 0.0438 | 0.0902 |
| No log | 60.0 | 150 | 0.0271 | 0.0827 |
| No log | 70.0 | 175 | 0.0282 | 0.0759 |
| No log | 80.0 | 200 | 0.0089 | 0.0677 |
| No log | 90.0 | 225 | 0.0217 | 0.0752 |
| No log | 100.0 | 250 | 0.0088 | 0.0609 |
| No log | 110.0 | 275 | 0.0076 | 0.0947 |
| No log | 120.0 | 300 | 0.0085 | 0.0624 |
| No log | 130.0 | 325 | 0.0060 | 0.0632 |
| No log | 140.0 | 350 | 0.0101 | 0.0654 |
| No log | 150.0 | 375 | 0.0031 | 0.0579 |
| No log | 160.0 | 400 | 0.0035 | 0.0624 |
| No log | 170.0 | 425 | 0.0048 | 0.0744 |
| No log | 180.0 | 450 | 0.0035 | 0.0744 |
| No log | 190.0 | 475 | 0.0196 | 0.0707 |
| 1.3647 | 200.0 | 500 | 0.0036 | 0.0624 |
| 1.3647 | 210.0 | 525 | 0.0019 | 0.0617 |
| 1.3647 | 220.0 | 550 | 0.0022 | 0.0714 |
| 1.3647 | 230.0 | 575 | 0.0020 | 0.0639 |
| 1.3647 | 240.0 | 600 | 0.0022 | 0.0594 |
| 1.3647 | 250.0 | 625 | 0.0014 | 0.0857 |
| 1.3647 | 260.0 | 650 | 0.0053 | 0.0744 |
| 1.3647 | 270.0 | 675 | 0.0011 | 0.0609 |
| 1.3647 | 280.0 | 700 | 0.0007 | 0.0602 |
| 1.3647 | 290.0 | 725 | 0.0034 | 0.0632 |
| 1.3647 | 300.0 | 750 | 0.0021 | 0.0609 |
| 1.3647 | 310.0 | 775 | 0.0009 | 0.0624 |
| 1.3647 | 320.0 | 800 | 0.0025 | 0.0609 |
| 1.3647 | 330.0 | 825 | 0.0004 | 0.0579 |
| 1.3647 | 340.0 | 850 | 0.0002 | 0.0594 |
| 1.3647 | 350.0 | 875 | 0.0002 | 0.0594 |
| 1.3647 | 360.0 | 900 | 0.0054 | 0.0617 |
| 1.3647 | 370.0 | 925 | 0.0007 | 0.0602 |
| 1.3647 | 380.0 | 950 | 0.0005 | 0.0609 |
| 1.3647 | 390.0 | 975 | 0.0002 | 0.0594 |
| 0.0095 | 400.0 | 1000 | 0.0005 | 0.0609 |
| 0.0095 | 410.0 | 1025 | 0.0002 | 0.0609 |
| 0.0095 | 420.0 | 1050 | 0.0002 | 0.0632 |
| 0.0095 | 430.0 | 1075 | 0.0003 | 0.0647 |
| 0.0095 | 440.0 | 1100 | 0.0010 | 0.0617 |
| 0.0095 | 450.0 | 1125 | 0.0055 | 0.0654 |
| 0.0095 | 460.0 | 1150 | 0.0002 | 0.0602 |
| 0.0095 | 470.0 | 1175 | 0.0001 | 0.0602 |
| 0.0095 | 480.0 | 1200 | 0.0002 | 0.0617 |
| 0.0095 | 490.0 | 1225 | 0.0001 | 0.0609 |
| 0.0095 | 500.0 | 1250 | 0.0001 | 0.0624 |
| 0.0095 | 510.0 | 1275 | 0.0001 | 0.0632 |
| 0.0095 | 520.0 | 1300 | 0.0001 | 0.0632 |
| 0.0095 | 530.0 | 1325 | 0.0001 | 0.0669 |
| 0.0095 | 540.0 | 1350 | 0.0001 | 0.0654 |
| 0.0095 | 550.0 | 1375 | 0.0001 | 0.0677 |
| 0.0095 | 560.0 | 1400 | 0.0001 | 0.0632 |
| 0.0095 | 570.0 | 1425 | 0.0001 | 0.0609 |
| 0.0095 | 580.0 | 1450 | 0.0001 | 0.0609 |
| 0.0095 | 590.0 | 1475 | 0.0001 | 0.0632 |
| 0.0031 | 600.0 | 1500 | 0.0001 | 0.0714 |
| 0.0031 | 610.0 | 1525 | 0.0001 | 0.0692 |
| 0.0031 | 620.0 | 1550 | 0.0001 | 0.0707 |
| 0.0031 | 630.0 | 1575 | 0.0001 | 0.0677 |
| 0.0031 | 640.0 | 1600 | 0.0001 | 0.0902 |
| 0.0031 | 650.0 | 1625 | 0.0000 | 0.0714 |
| 0.0031 | 660.0 | 1650 | 0.0000 | 0.0767 |
| 0.0031 | 670.0 | 1675 | 0.0000 | 0.0737 |
| 0.0031 | 680.0 | 1700 | 0.0000 | 0.0669 |
| 0.0031 | 690.0 | 1725 | 0.0000 | 0.0677 |
| 0.0031 | 700.0 | 1750 | 0.0000 | 0.0669 |
| 0.0031 | 710.0 | 1775 | 0.0000 | 0.0684 |
| 0.0031 | 720.0 | 1800 | 0.0000 | 0.0669 |
| 0.0031 | 730.0 | 1825 | 0.0000 | 0.0677 |
| 0.0031 | 740.0 | 1850 | 0.0000 | 0.0647 |
| 0.0031 | 750.0 | 1875 | 0.0000 | 0.0647 |
| 0.0031 | 760.0 | 1900 | 0.0000 | 0.0647 |
| 0.0031 | 770.0 | 1925 | 0.0000 | 0.0699 |
| 0.0031 | 780.0 | 1950 | 0.0000 | 0.0699 |
| 0.0031 | 790.0 | 1975 | 0.0000 | 0.0692 |
| 0.0014 | 800.0 | 2000 | 0.0000 | 0.0699 |
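The Wer column above is word error rate: the word-level edit distance between the model transcription and the reference, divided by the reference length. A minimal sketch using the `jiwer` package (which backs the Hugging Face `wer` metric); the sentences are illustrative:

```python
from jiwer import wer

reference = "मौसम आज अच्छा है"    # ground-truth transcript (illustrative)
hypothesis = "मौसम आज अच्छा हैं"  # model output with one substituted word

# 1 substitution over a 4-word reference -> WER = 0.25
print(wer(reference, hypothesis))
```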
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 1.18.3
- Tokenizers 0.14.1